Emotion artificial intelligence uses biological signals (vocal tone, facial expressions, data from wearable devices) along with text and computer-use patterns to detect and predict how someone is feeling. It is already being used in workplaces and in hiring. Loss of privacy is just the beginning: workers are worried about biased AI and the pressure to perform the 'right' expressions and body language for the algorithms.

  • Apathy Tree@lemmy.dbzer0.com · 2 months ago

    This would absolutely flag me for something. I tend to have flat delivery, low pitch, avoid eye contact, etc. and when combined with other metrics, could easily flag me as not being a happy enough camper.

    I mean, don’t get me wrong, I’m never going to be happy to be working, but if I showed up that day, I’m also in a good enough headspace to do my job… and if you want to fire me for that… for having stuff going on and not faking vocal patterns…

    This is why I don’t want to work anymore. It’s gotten so invasive and fraught if you happen to be anything but a happy, bubbly neurotypical fake. And that’s wildly stressful. I’m not a machine, and I refuse to be treated like one. If that means I have to die in poverty, well, dump me in the woods, I guess.

    This shit should never be legal.

      • joelfromaus@aussie.zone · 2 months ago

        Bring in Universal Basic Income. Introduce emotion tracking as a job KPI. Fire me because I don’t emote per the LLM datasets. Live comfortably unemployed.

        Best dystopian outcome. A guy can dream, right?

        • The Doctor@beehaw.org · 2 months ago

          “Our smart securicams don’t trust you” is the new “You’re not a good culture fit.”