A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • Knusper@feddit.de · +10 / −8 · 1 year ago

    Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.

    Legality for your viewers will also differ massively around the world, so your target audience may not be very big.

    And you probably need investors, who likely have less risky projects to invest in.

    Well, and then there’s also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.

    • Womble@lemmy.world · +19 · 1 year ago

      Yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those…

      • JonEFive@midwest.social · +5 · 1 year ago

        Exactly. Some of these engines are perfectly capable of combining different concepts. In your example, it knows roughly what a horse looks like and what a human riding on horseback looks like. It also knows that an astronaut looks very much like a human, apart from the space suit, and it can put the two together.

        Morality aside, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age but very slim or otherwise youthful-looking.

        While I find it repugnant in concept, I also think that for those seeking this material, I'd much rather it be AI generated than an actual exploited child. Realistically, though, I doubt this would have any notable impact on the prevalence of CSAM, and it might even make it more accessible.

        Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.

        Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.

        • Ryantific_theory@lemmy.world · +3 · 1 year ago

          Aren’t AI-generated images fairly easy to detect with noise analysis? I know there’s no effective detection for AI-generated text, and I’m not saying there won’t be projects that train AI to generate perfectly realistic images, but it’ll be a while before models get fingers right, let alone eliminate invisible pixel artifacts.

          As a counterpoint, won’t the prevalence of AI generated CSAM collapse the organized abuse groups, since they rely on the funding from pedos? If genuine abuse material is swamped out by AI generated imagery, that would effectively collapse an entire dark web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.

          That’s pretty good for 2023.

          • JackbyDev@programming.dev · +2 · 1 year ago

            With Stable Diffusion you can intentionally leave an “invisible watermark” that machines can easily detect but humans cannot see. The idea is that in the future you don’t accidentally train on images that are already AI generated. I’d hope most sites are doing that, but it can be turned off easily enough. Apart from that I’m not sure.
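
            (A toy sketch of the general idea, purely for illustration: hiding machine-readable bits in the least-significant bits of pixel values, which changes each pixel by at most 1 and is invisible to the eye. Stable Diffusion's actual watermark uses the invisible-watermark package, which embeds in the frequency domain so it survives compression better than this raw LSB approach.)

```python
# Toy "invisible watermark": hide a bit string in the least-significant
# bits of pixel values. Flipping a pixel from 200 to 201 is invisible to
# a human, but trivially machine-readable afterwards.

def embed(pixels: list[int], message: str) -> list[int]:
    """Overwrite the LSB of the first len(message)*8 pixels with message bits."""
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, set it to the message bit
    return out

def extract(pixels: list[int], length: int) -> str:
    """Read `length` characters back out of the pixel LSBs."""
    bits = [p & 1 for p in pixels[: length * 8]]
    return "".join(
        chr(int("".join(map(str, bits[i : i + 8])), 2))
        for i in range(0, len(bits), 8)
    )

image = [200, 13, 255, 0, 97, 42, 128, 7] * 10   # fake 80-pixel "image"
marked = embed(image, "AI")
assert extract(marked, 2) == "AI"                 # detector recovers the mark
assert max(abs(a - b) for a, b in zip(image, marked)) <= 1  # imperceptible
```

            The weakness the frequency-domain schemes address is obvious here: any re-encode, resize, or JPEG compression scrambles the LSBs and destroys a watermark like this one.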

            • Ryantific_theory@lemmy.world · +1 · 1 year ago

              I could have sworn I saw an article about noise artifacts that were fairly obvious, but now I can’t turn anything up. The watermark should help, but beyond that it looks like there’s just a training dataset of purely generative-AI images (GenImage) for training another AI to detect generated images. I guess we’ll see what happens with that.

    • d13@programming.dev · +4 · 1 year ago

      Unfortunately, no, you just need training data on children in general and training data with legal porn, and these tools can combine it.

      It’s already being done, which is disgusting but not surprising.

      People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).