Our results show that women’s contributions tend to be accepted more often than men’s [when their gender is hidden]. However, when a woman’s gender is identifiable, they are rejected more often. Our results suggest that although women on GitHub may be more competent overall, bias against them exists nonetheless.

  • rbn@sopuli.xyz

    Thanks for pointing that out.

    Seems like a wild idea, as… a) it poisons the data not only for AI but also for real users like me (I swear I’m not a bot :D). b) if this approach is used more widely, AIs will learn very quickly to identify and ignore such nonsense links, probably much faster than real humans would.
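    To illustrate point b (just a rough sketch of my own, not anything from the linked article): a scraper could cheaply flag machine-generated links with a couple of crude URL heuristics, e.g. a high digit ratio or almost no vowels in the slug. The helper name and example URLs here are made up.

    ```python
    # Rough sketch (my own assumption, not from the article): a crawler flags
    # likely "poison" links by looking at the last path segment of the URL.
    from urllib.parse import urlparse

    def looks_like_poison(url: str) -> bool:
        """Random slugs ('/qz8k1vx0b3m7') tend to mix digits and consonants,
        while human-written slugs ('/why-captchas-fail') usually don't."""
        slug = urlparse(url).path.strip("/").split("/")[-1]
        if not slug:
            return False
        letters = [c for c in slug if c.isalpha()]
        digits = [c for c in slug if c.isdigit()]
        digit_ratio = len(digits) / len(slug)
        vowel_ratio = (sum(c in "aeiou" for c in letters) / len(letters)
                       if letters else 0.0)
        return digit_ratio > 0.25 or vowel_ratio < 0.15

    if __name__ == "__main__":
        for u in ("https://example.com/blog/why-captchas-fail",
                  "https://example.com/qz8k1vx0b3m7t2r9"):
            print(u, "->", "skip" if looks_like_poison(u) else "crawl")
    ```

    And real crawlers would have far better signals than this (link density, page content, and so on), which is exactly why I doubt the trick holds up for long.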

    It sounds like a similar concept to captchas, which annoy real people yet fail to block bots.