It seems crazy to me, but I've seen this concept floated in several different posts. There seem to be a number of users here who think AI-generated CSAM will somehow reduce the number of real-life child victims.

Like the comments on this post here.

https://sh.itjust.works/post/6220815

I find this argument crazy. I don't even know where to begin describing how many ways this could go wrong.

My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM. It still causes harm when viewed, and it is still based in the further victimization of the children involved.

Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real, living victims completely ignores the reality of how AI content is created.

Some have compared pedophilia and child sexual assault to drug addiction, which is dubious at best, and pretty offensive imo.

Using drugs has no inherent victim, and it is not predatory.

I could go on, but I'm not an expert or a social worker of any kind.

Can anyone link me to articles discussing this?

  • Skull giver@popplesburger.hilciferous.nl
    9 months ago

    Most psychological research is done on (grad) students, so it’s not as easy as you may think.

    Just imagine what the research would be like. “Hello, we’re a public, government-funded institution trying to find out if CSAM is bad. Please tell us how much illegal content you consume and how many children you have raped so we can try to find a connection.”

We've had a few convicted paedos with generated CSAM now, but you need more than a few to get any kind of scientific result. Even then, your data set consists purely of paedos who have been found out and convicted; the majority of them never get caught.

AI-generated CSAM is very new. You can now generate illegal images for which no child was ever abused. Previously, the issue was clear-cut: any CSAM more realistic than a drawing was the result of a child being abused, so there was a real victim. Now you can generate pictures of any type of person doing anything, without any real victims.

You're comparing social science to math here, and social science isn't an exact science. Different populations behave differently. Paedophiles in some remote areas just marry kids and nobody there bats an eye, to the point where people don't even consider it a problem. This isn't like "what's the square root of 7"; it's like "what's the impact of banning white shirts on domestic violence": it's impossible to get straight results here.