• Skwerls@discuss.tchncs.de · edited · 9 months ago

    There's also a difference (not sure whether it's a clinical one) between people who sexualize very young kids and people attracted to minors who are just under whatever age a given society has decided separates children from adults. In the USA, porn depicting the latter is legal as long as everyone involved is actually over the age of adulthood, even if they dress up to look younger.

    I think people who talk about pedophilia are usually referring to the former, not the 30-year-old dating a 17-year-old or whatever. But the latter case makes things murkier: images of fictional people don't have ages. Can you charge someone who has AI-generated porn with possessing CSAM if the people depicted only sort of look underage?

    AI-generated content is gonna raise a lot of questions like these that we're gonna have to grapple with as a society.

    • hoshikarakitaridia@sh.itjust.works · 9 months ago

      The first part of your comment is rather confusing to me, but I fully agree with the latter part. Judging age from appearance is something that will haunt us even more with AI until we come up with new answers. And that's gonna be just one of a whole list of big questions that have to be dealt with alongside new AI laws.