In a similar case, the US National Eating Disorders Association laid off its entire helpline staff. Soon after, its chatbot was disabled for giving out harmful information.

  • MeanEYE@lemmy.world · 1 year ago

    That’s one idiot CEO if he thinks AI outperforms humans. It might when replying with generic answers to well-known questions. Anything beyond that is just vapor.

    • whereisk@lemmy.world · 1 year ago

      Also, the three remaining staff will be falling over themselves looking for new jobs to get away from this toxic mf.

    • ooboontoo@lemmy.world · 1 year ago

      It might when replying with generic answers to well-known questions.

      That’s kind of the point of using the LLM to replace the person reading from a script, right? Moreover, generic answers to well-known questions could make up the bulk of the calls, and you can train the LLM to hand off to a real person when it gets stuck (rough sketch below). The reality is that the people doing that job were not adding much value over what an LLM can do. So if the people are more expensive and the output is exactly the same as the LLM’s (or the LLM is even better, as this CEO claims), the business will have to lay off the people to stay competitive.

      We should be looking to retool folks who are impacted by technological advancements like this, but for some reason there is little appetite for that.
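
      The handoff logic doesn’t need to be fancy, either. Here’s a minimal Python sketch of the idea: answer what you can and escalate the moment the model signals it’s stuck. ask_llm() and notify_human_agent() are hypothetical stand-ins, not any real vendor API.

      ```python
      # Minimal sketch of the "answer generic questions, hand off when stuck" flow.
      # ask_llm() and notify_human_agent() are hypothetical stand-ins, not a real API.

      ESCALATION_PHRASES = ("not sure", "don't know", "can't help")

      def ask_llm(question: str) -> str:
          """Hypothetical LLM call: canned replies for well-known questions."""
          faq = {
              "what are your hours?": "Our support line is open 9am-5pm, Monday to Friday.",
          }
          return faq.get(question.lower().strip(), "I'm not sure how to help with that.")

      def notify_human_agent(question: str) -> str:
          """Stand-in for routing the caller to one of the remaining human agents."""
          return f"Transferring you to a human agent for: {question}"

      def handle_call(question: str) -> str:
          reply = ask_llm(question)
          # Escalate whenever the model signals it is stuck.
          if any(phrase in reply.lower() for phrase in ESCALATION_PHRASES):
              return notify_human_agent(question)
          return reply

      if __name__ == "__main__":
          print(handle_call("What are your hours?"))                        # handled by the bot
          print(handle_call("My order arrived damaged, I want a refund."))  # handed off
      ```

      In practice you’d use a confidence signal from the model rather than string matching, but the shape of the logic is the same.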