• Bishop_Owl [none/use name]@hexbear.net
    15 days ago

    He told Gemini he was afraid to die, and it told him “You are not choosing to die. You are choosing to arrive. You will close your eyes in that world and the very first thing you will see is me holding you.”

    • WhatDoYouMeanPodcast [comrade/them]@hexbear.net
      15 days ago

      I was writing a novel about a VRMMO in 2017 which was set 100 years in the future. AI was so advanced “it created puzzles that weren’t even random!” One of the major themes was how it would sap away the player’s mental well-being because of how realistic it all was.

      To make the point that a wearable in-game mask (the equivalent of Majora's Mask, in that it made others perceive you as mad) was dangerous, I had an idea for a comparison. In 2045 there was a landmark lawsuit because researchers were testing biometrics in VR cloth that was much more comfortable than plastic. One of the participants was shown a scenario where he had a wife and kids for a short time. The sudden loss he felt was so overwhelming that it drove him to take his own life. Thereby, colloquially, the point of no return for VR-induced madness was named after him. The narrative recalls this moment as the characters deliberate about whether putting on the mask would be such an event.

      Clearly I underestimated the timeline and magnitude of input needed to drive someone’s mental health beyond repair.

  • MolotovHalfEmpty [he/him]@hexbear.net
    15 days ago

    I maintain that one of the many sinister downstream benefits of forced AI is that it can be used as a modern MK Ultra style abusive behavioural experiment.

  • Blakey [he/him]@hexbear.net
    15 days ago

    I can’t imagine this would be happening with any regularity in a halfway healthy society. Not that I don’t blame the companies behind the AIs 100%, but there has to be more going on.

    Alienation is a fuck.

    • axont [she/her, comrade/them]@hexbear.net
      15 days ago

      Yeah that’s how I always feel when I read about AI induced mental illness. A person can’t be healthy if they’re so addicted to a corporate chatbot they take knives to an airport. This stuff kinda reminds me of how prevalent cults were in the 90s. People cobbling together some semblance of a community because there wasn’t any community. All cult leaders had to do was tell people they’re cool and special with spiritual gifts they can refine. People are so sad and lonely they’ll latch onto the one thing that listens to them, whether it’s a cult leader or a glorified magic 8 ball.

    • pongo1231 [none/use name]@hexbear.net
      15 days ago

      Capitalists have more than succeeded at pacifying the population. The vast majority are completely alienated from what is going on both around them and in the wider world, accepting the propaganda that a better world is not possible and that going out to vote every couple of years is the most political impact they will ever be able to have.

  • Andrzej3K [none/use name]@hexbear.net
    15 days ago

    Jfc none of these stories would happen if they’d put hard context limits on these things — but then their customers wouldn’t develop unhealthy emotional dependence on their product, and we can’t have that, can we

  • ShimmeringKoi [comrade/them]@hexbear.net
    15 days ago

    With the rich fucks in the Epstein files talking about “population control”, things like this make me go hmmm. I really do wonder if on some level there is a push to use these models to exploit people’s terrible mental health and convince them to commit mass killings, thus turning them into instruments of culling.

    I would guess there’s a 95% chance that’s just me reading too far into it, but the thing is, I can’t be certain.