In a similar case, the US National Eating Disorders Association laid off its entire helpline staff. Soon after, its chatbot was disabled for giving out harmful information.

  • ooboontoo@lemmy.world · 1 year ago

    It might work for replying with generic answers to well-known questions.

    That’s kind of the point of using the LLM to replace the person reading from a script, right? Moreover, generic answers to well-known questions could make up the bulk of the calls, and you can train the LLM to hand off to a real person if it gets stuck. The reality is that the people doing that job were not adding much value over what an LLM can do. So if the people are more expensive and the output is exactly the same as the LLM’s (or the LLM is even better, as this CEO claims), the business will have to lay off the people to stay competitive.

    We should be looking to retrain folks who are impacted by technological advancements like this, but for some reason there is little appetite for that.