Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • afraid_of_zombies@lemmy.world · 1 year ago

    Maybe we do live in the best possible world. Wow, wouldn’t it be great to get rid of this industry, so you could consume porn knowing there is zero percent chance it was made without someone’s consent?

    • hh93@lemm.ee · 1 year ago

      Isn’t the main problem with those models that they also let you create porn of anyone without their consent?

      • stevedidWHAT@lemmy.world · 1 year ago

        Sex trafficking vs virtual photoshop of your face…

        Nothing new, and it’s a huge improvement over the current status quo. Not everything needs to be a perfect solution

        • N1cknamed@lemmy.world · 1 year ago

          Kidnapping someone isn’t nearly as easy as creating AI porn of them is… Dumb comparison.

      • gandalf_der_12te@feddit.de · 1 year ago

        Yeah so what. It’s not as if somebody is “sold on the market” because there’s a nude picture of them. Photoshop is not a real threat to society. We gotta stop making moral imaginations more important than physical things.

        • GBU_28@lemm.ee · 1 year ago

          Are you actually asking?

          The gist is that LLMs find similar “chunks” of content from their training set and assemble a response based on a similarity score (how closely each chunk matches your prompt).

          They know nothing they haven’t seen before; the novelty is that they create new things from parts of their training data.

          Obviously they are very impressive tools, but the concern is that you can easily take a model designed for porn, feed it pictures of someone you want to shame, and have it generate lifelike porn of someone who has never performed in porn.

          That, and the line around “ethical” AI porn is blurry.
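          A rough toy sketch of that “similarity score” idea — this is just bag-of-words cosine similarity for illustration; real generative models use learned embeddings, and every name and chunk below is invented:

```python
# Toy illustration of ranking training-set "chunks" by similarity
# to a prompt. Real models embed text with learned vectors; this
# uses simple word counts purely to make the scoring idea concrete.
from collections import Counter
import math

def vectorize(text):
    # Bag-of-words vector: word -> count
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two word-count vectors
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Invented stand-ins for "chunks" of a training set
chunks = [
    "cats are small furry animals",
    "stock markets fell sharply today",
    "dogs are loyal furry companions",
]

def most_similar(prompt):
    # Return the chunk scoring highest against the prompt
    pv = vectorize(prompt)
    return max(chunks, key=lambda c: cosine(pv, vectorize(c)))

print(most_similar("furry animals"))  # → cats are small furry animals
```

          The point of the sketch: the “best” output is whatever sits closest to the prompt in the space of things the model has already seen, which is why the training data can’t be waved away.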

          • tal@kbin.social · 1 year ago (edited)

            They know nothing they haven’t seen before

            Strictly speaking, you arguably don’t either. Your knowledge of the world is based on your past experiences.

            You do have more-sophisticated models than current generative AIs do, though, to construct things out of aspects of the world that you have experienced before.

            The current crop are effectively more sophisticated than simply pasting together content – try making an image and then adding “blue hair” or something, and you can get the same hair, but recolored. And their ability to replicate artistic styles is based on commonalities in seen works, but you don’t wind up seeing chunks of material done by just that artist. But you’re right that they are considerably more limited than a human.

            Like, you have a concept of relative characteristics, and the current generative AIs do not. You can tell a human artist “make those breasts bigger”, and they can extrapolate from a model built on things they’ve seen before. The current crop of generative AIs cannot. But I expect that the first bigger-breast generative AI is going to attract users, based on a lot of what generative AIs are being used for now.

            There is also, as I understand it, some understanding of depth in images in some existing systems, but the current generative AIs don’t have a full 3d model of what they are rendering.

            But they’ll get more-sophisticated.

            I would imagine that there will be a combination of techniques. LLMs may be used, but I doubt that they will be pure LLMs.

        • GBU_28@lemm.ee · 1 year ago (edited)

          Ok, you know it’s trained on existing imagery, right?

          Sure, the net-new photos aren’t net-new abuses, but whatever abuses went into the training set are literally represented in the product.

          To be clear, I’m not fully porn-shaming here, but I wanted to clarify that these tools are informed by something already existing and can’t be fully separated from their training data.