Not a good look for Mastodon - what can be done to automate the removal of CSAM?

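A note on the automation question in the title: the standard approach on large platforms is matching uploaded media against hash databases of known CSAM (e.g. Microsoft's PhotoDNA or Cloudflare's CSAM Scanning Tool), with matches reported to NCMEC. Below is a minimal illustrative sketch of the hash-matching idea using the open-source imagehash library; the hash list here is hypothetical, since the real databases are access-restricted and real deployments use proprietary robust-hash APIs rather than a DIY matcher.

```python
# Sketch: flag uploads whose perceptual hash is a near-match to a
# known-bad hash list. KNOWN_BAD_HASHES is a hypothetical stand-in;
# real systems query vetted databases (e.g. NCMEC via PhotoDNA),
# which are not publicly downloadable.
from PIL import Image
import imagehash

# Hypothetical perceptual hashes of known illegal material (64-bit pHash).
KNOWN_BAD_HASHES = {imagehash.hex_to_hash("0f1e2d3c4b5a6978")}

MAX_DISTANCE = 5  # Hamming-distance tolerance for near-duplicates


def should_block(path: str) -> bool:
    """Return True if the image is a near-match to a known-bad hash."""
    h = imagehash.phash(Image.open(path))
    # ImageHash subtraction yields the Hamming distance between hashes.
    return any(h - bad <= MAX_DISTANCE for bad in KNOWN_BAD_HASHES)
```
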
  • balls_expert@lemmy.blahaj.zone · 11 months ago

    CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up; you’ll see “cartoons, paintings, sculptures, …” in the wording of the PROTECT Act

    They don’t actually need a victim to be defined as such

    • priapus@sh.itjust.works · 11 months ago

      That Wikipedia article is about CP, a broader topic. Practically zero authorities will include illustrated and simulated forms of CP in their definitions of CSAM

      • balls_expert@lemmy.blahaj.zone · 11 months ago

        I assumed it was the same thing, but if you’re placing the bar of acceptable content below child porn, I don’t know what to tell you.

        • priapus@sh.itjust.works · 11 months ago

          That’s not what I was debating. I was debating whether it should be reported to the authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.

          • balls_expert@lemmy.blahaj.zone · 11 months ago

            Ah. It depends on the jurisdiction the instance is in

            Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason

            Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries