Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • redcalcium@lemmy.institute · 11 months ago

    I get what you’re saying, but because of federation, CSAM can easily spread to many instances without their admins noticing. Hosting even one piece of CSAM on your server is a huge risk for the server owner.

    • MinusPi (she/they)@pawb.social · 11 months ago

      I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise, how can they possibly know?
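
      One way an admin can know before reports arrive is to scan incoming federated media against a list of known-bad hashes. Below is a minimal sketch, assuming a hypothetical instance-local digest file `known_hashes.txt`; real hash lists come from vetted programs (e.g. PhotoDNA/NCMEC) and are not publicly downloadable:

      ```python
      import hashlib
      from pathlib import Path

      # Hypothetical file of known-bad SHA-256 digests, one hex digest per line.
      KNOWN_HASHES = {
          line.strip().lower()
          for line in Path("known_hashes.txt").read_text().splitlines()
          if line.strip()
      }

      def is_known_match(media_path: str) -> bool:
          """Return True if the file's SHA-256 digest is on the blocklist."""
          digest = hashlib.sha256(Path(media_path).read_bytes()).hexdigest()
          return digest in KNOWN_HASHES

      def scan_incoming(media_paths: list[str]) -> list[str]:
          """Flag federated media attachments whose digests match the list."""
          return [p for p in media_paths if is_known_match(p)]
      ```

      Exact hashes only catch byte-identical copies; production systems rely on perceptual hashing (e.g. PhotoDNA) so that re-encoded or cropped images still match.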

      • krimsonbun@lemmy.ml · 11 months ago

        This could be a really big issue though. People can make instances for really hateful and disgusting crap, but even if everyone defederates from them, it still gives them a platform: a tiny, tiny corner of the internet to talk about truly horrible topics.

        • andruid@lemmy.ml · 11 months ago

          Again, if illegal content is publicly available, officials can charge those site admins with the crime of hosting it. Everyone else just has a duty to defederate.

        • priapus@sh.itjust.works · 11 months ago

          Those corners will exist no matter what service they use, and there is nothing Mastodon can do to stop this. There’s a reason there are public lists of instances to defederate from. This content can only be prevented by domain providers and governments.
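
          Subscribing to one of those public lists can itself be automated. A sketch, assuming a hypothetical blocklist URL and using Mastodon’s admin API `POST /api/v1/admin/domain_blocks` (present in recent Mastodon releases; needs a token with the `admin:write:domain_blocks` scope):

          ```python
          import requests

          BLOCKLIST_URL = "https://example.org/blocklist.txt"  # hypothetical list URL
          INSTANCE = "https://my-instance.example"             # your own instance
          TOKEN = "REPLACE_WITH_ADMIN_TOKEN"

          def apply_blocklist() -> None:
              text = requests.get(BLOCKLIST_URL, timeout=10).text
              domains = [d.strip() for d in text.splitlines()
                         if d.strip() and not d.startswith("#")]
              for domain in domains:
                  # Suspending severs all federation with the listed domain.
                  resp = requests.post(
                      f"{INSTANCE}/api/v1/admin/domain_blocks",
                      headers={"Authorization": f"Bearer {TOKEN}"},
                      json={"domain": domain, "severity": "suspend"},
                      timeout=10,
                  )
                  if not resp.ok:  # e.g. 422 when the domain is already blocked
                      print(f"skipped {domain}: {resp.status_code}")

          if __name__ == "__main__":
              apply_blocklist()
          ```

          Review any list by hand before applying it; blindly suspending domains cuts your users off from everyone on them.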