Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do: since we changed our registration policy, they just post from another instance.

We keep working on a solution; we have a few things in the works, but they won't help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it wasn't his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. It came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless. Anyway, I hope we can announce something more positive soon.

  • Cosmic Cleric@lemmy.world · 1 year ago

    Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

    They have very low to zero legal risk, as long as they’re doing their job.

    IANAL, but I can read laws.

    • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙@lemmy.world · 1 year ago

      Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

      Correct, emphasis mine: "as long as they take action when it happens" being the key phrase here.

      IANAL, but from what I understand, taking action (removing content, disabling communities, banning users, or all of the above) shows that they are working to remove the content. This is why, in previous conversations about piracy, I mentioned DMCA takedown notices and how the companies I've worked at responded to them with extreme urgency (sometimes the higher-ups would walk over to the devs and make sure the content was deleted).

      I’m annoyed at people in this thread who believe the admins did the wrong thing because turning off communities could drive users to another instance. Who cares? This is bigger than site engagement. I’m also annoyed at people who think the devs had access to code that could prevent this issue but chose not to implement it. Automated detection is a larger and much more difficult problem that can’t just be coded away; it usually requires humans to verify the system is working and to correct false positives and false negatives.
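      To illustrate the point above, here is a minimal sketch of why automated detection still needs a human in the loop: an automated matcher only *flags* content (it can produce false positives), and a moderator confirms the flag before anything is permanently removed. All names and the hash-matching approach here are invented for illustration; this is not Lemmy's actual moderation code.

```python
# Hypothetical sketch: automated flagging feeds a human review queue.
# An exact-hash match can still be a false positive (hash collisions,
# mislabeled hash lists), so the automated step never deletes on its own.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Upload:
    id: int
    content_hash: str

@dataclass
class ReviewQueue:
    pending: List[Upload] = field(default_factory=list)
    removed: List[int] = field(default_factory=list)

    def auto_flag(self, upload: Upload, known_bad_hashes: Set[str]) -> None:
        # Automated step: a hash match only flags the upload for review.
        if upload.content_hash in known_bad_hashes:
            self.pending.append(upload)

    def human_review(self, upload_id: int, confirmed: bool) -> None:
        # Human step: a moderator confirms the flag (remove) or clears it.
        self.pending = [u for u in self.pending if u.id != upload_id]
        if confirmed:
            self.removed.append(upload_id)

queue = ReviewQueue()
bad_hashes = {"abc123"}
queue.auto_flag(Upload(1, "abc123"), bad_hashes)  # flagged for review
queue.auto_flag(Upload(2, "zzz999"), bad_hashes)  # no match, not flagged
queue.human_review(1, confirmed=True)             # moderator confirms removal
print(queue.removed)  # [1]
```

      The point of the split is exactly what the comment argues: the code alone decides nothing final, because only the human step can tell a true match from a false positive.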

      • Cosmic Cleric@lemmy.world · 1 year ago

        You misunderstood what I meant by the part that you highlighted of my comment.

        I’m speaking of Safe Harbor provisions, not having to take active DMCA actions. They’re two very different things.

        • Yes, and believe it or not, I’ve been discussing both with people.

          I use DMCA actions as an example because they are easily understood. People get copyright strikes. People pirate music.

          Safe Harbor provisions are not as easily understood, but they basically amount to (IANAL) “if the administrator removes the offending content in a reasonable amount of time after learning about it, then we’re all good.” It’s not a safe haven for illicit content; it’s more of a “well, you didn’t know, so we can’t really fault you for it” sort of deal. But once admins know about the content, they need to take action.