Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.

  • AllonzeeLV@lemmy.world

    Too bad it gets the emotion and not the context.

    I’d love to be fired because “I hate making money for these greedy ass capitalist douchebags” pops up on a screen whenever I come in.

    The idea that employers should even be allowed to ask what their employees are feeling, much less scan them to discern it, is a new low for our modern Orwellian dystopia.

    • thanevim@kbin.social

The thing is though, I don’t see how something like this could even work out.

Like, you hire employee 1, they get frustrated at something overnight. You fire them for being upset. Now you have to fill the seat. Employee 2 is brought on. They get told what happened to the person they replaced. They leave or are fired for having emotions and being human. This repeats ad nauseam.

      • Radioactive Radio@lemm.ee

        Let’s be real, most of us would get weeded out at the interview when they start spilling all the “we’re like a family” bullshit.

      • AllonzeeLV@lemmy.world

        I’m guessing it’s going to be implemented as identifying “persistent negative attitudes” and as validation to fire anyone in non-fire-at-will locales.

It could also be used as a bullshit pretext to deny raises and promotions if your “grateful” or “motivated” indexes weren’t high enough.

        • FringeTheory999@lemmy.world

          so, basically a tool to suss out which employees have undisclosed mental health issues that the employer can’t legally ask about. cool. cool.

    • SokathHisEyesOpen@lemmy.ml

      What’s crazy is that this was already fully functional and in-use at least 8 years ago. Idk how this has stayed out of the headlines until now. Microsoft had a working demo of this in their visitor center in 2015 and was already using it in multiple places. As soon as you enter the room it assigns you a persistent ID, estimates your height, weight, eye color, hair color, and age. Then it tracks your mood and the overall mood of the room continuously. The ID can be persistent across any number of linked locations. They don’t ask for anyone’s permission before using it.

    • Lyrl@lemm.ee

Sounds like you’re fighting on behalf of the whole world. I hope you get some positive time with yourself or a smaller circle, and a break from the dumpster fires of modern civilization.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

If they could do that, they would probably see how goddamn miserable most people are. If they used that to change things and make people less miserable, I wouldn’t see it as dangerous. But more than likely it will be “your sadness doesn’t vibe with us. You’re fired.”