I know not many of you care about LLMs/other AI models, but I think this really shows the amount of loneliness in our society. Look at how it presents itself on Google: as an AI that feels alive, is always available, and understands you. People don’t use this service to summarize text or get help with their programming homework like they might with ChatGPT. They are selling artificial companionship.

  • DamarcusArt
    3 months ago

    Is it only used to simulate conversation? Can it do other things, like function as a personal assistant program or anything like that? It could be a case of people using it as a tool, and not just to combat their own loneliness.

    • BountifulEggnog [she/her]@hexbear.netOP
      3 months ago

      As far as I know, yes. It’s pretty much all just pretending to have conversations, not so much useful assistant tasks. And maybe not all of its use is so detrimental, but… I just worry about it. I’ve seen how important it is to some users, and with so many, there are bound to be a lot of people who have an unhealthy relationship with these types of things.

      • DamarcusArt
        3 months ago

        Yeah, I’ve seen how this stuff talks; it’s just a “tells you what you want to hear” machine. It would be worrying if someone were using it instead of actually interacting with people.