Edit: just to be clear, I’m only talking about lemmy.blahaj.zone here. If you’re coming from Kbin or some other instance, this won’t affect you.

I have nothing against porn in general, but LemmyNSFW is a firehose of NSFW content, some of it offensive or toxic, and the admins seem shaky on whether they're prepared to handle it. It's started showing up in my /all/ feed now, and I'm worried.

Essentially the entirety of Lemmy's porn is getting uploaded to one instance, and I am not at all confident in their ability to moderate it. A massive instance that is still so young and untested, one that is still trying to figure out whether it will even allow underage content, being allowed on my feed makes me really uncomfortable. I could just disable NSFW, but not all NSFW is porn, and not all of it comes from that instance.

In addition to the lack of moderation, content that seems to be allowed includes misogyny, slurs for trans people, objectification, and straight-up rape.

They don't even have any verification system for people posting there, so they could unknowingly be spreading CSEM or revenge porn into people's caches. There's a potential legal risk here too.

Defederating seems more than reasonable. LemmyNSFW is just way too lax on their policies. If people here want to look at porn, they can always make an alt account over there.

  • Melmi@lemmy.blahaj.zone OP · edited · 1 year ago

    I feel like you have to be arguing in bad faith at this point. You’re still on this whole thing about “obscenity” and me trying to silence people or trying to get rid of the stuff they like. They can literally do whatever they want as long as they aren’t hurting people.

    I'm literally just concerned about how easy it is for non-consensual pornography to slip through the moderation team. I'm not making up this issue. Most prominently, Pornhub, the #1 porn host as far as I'm aware, got into a swamp of legal trouble and ended up banning non-verified content because it was infested with CSAM and revenge porn, and they aren't the only ones. This isn't an imagined problem like transgender people raping people in bathrooms, unless you're suggesting that that's also a risk we take but that we should allow it because expression is good.

    If skirt-spinning memes carried a risk of having non-consensual pornography mixed in, I would be very cautious around those communities and maybe not want them on my feed even if the memes were usually great. But that's not the case, unlike for pornography, which has a proven track record of having illegal and unethical content mixed in with the good that is difficult to separate unless you implement verification, which LemmyNSFW is not doing. Again, nothing against the people posting the porn, and frankly nothing against the porn itself. I'm not some anti-porn crusader; I like porn from time to time too. It's only the 1% of people who are the issue, the people who post literal child pornography and other non-consensual content. People can assume that risk, that's their prerogative, but I just think we should prevent that risk from spreading onto other servers, where the images will potentially be downloaded and cached on the server itself.

    • ToastedPlanet@lemmy.blahaj.zone · 1 year ago

      Without any evidence of wrongdoing by LemmyNSFW, worrying about CSAM and revenge porn from them is an imagined problem, like transgender people raping people in bathrooms. As I pointed out, where there is evidence of CSAM, using that as evidence to justify defederation is reasonable. What is not reasonable is using fear of potential risks as a justification to defederate. I am not accusing you or anyone else of doing anything. I have been describing why defederating without evidence is undesirable. I am in fact saying that free speech is worth the risk of hate speech, and that people's freedom to express themselves is worth the risk that they might do something wrong, like posting CSAM or revenge porn. If the LemmyNSFW moderators fail to curtail hate speech and wrongdoing, then we, as an instance, have reason to take action.

      There are people who consider trans people and their content to be unethical. They have made discussing trans people illegal in schools, banned drag shows, and discouraged pride parades through a climate of fear. To them there is no difference between a trans person's content and someone on LemmyNSFW's content. We should not divide ourselves without reason. And by defederating we are telling LemmyNSFW there is something they cannot do, regardless of how small a thing it is: they cannot post or comment here without making another account.

      Also, pornography collectively is not an entity with a track record. The concern about the abundance of porn on LemmyNSFW reminds me of the MAP acronym slur directed at the trans community, which tries to use the diversity of people in the community as evidence that pedophiles are accepted in the trans community, when they are not. If our moderators failed to stop CSAM from being posted here, other instances would be justified in defederating from us. I'm sure there are people who think the trans community has a proven track record of CSAM and would expect a high risk of it from us.

      I am arguing against arguments. Specifically, the arguments in this thread that seem eerily similar to other arguments that I dislike. I am sure this similarity is not intentional, but I am arguing against what I see in the thread. I am not sure how else I can argue in good faith. If I have misrepresented your argument, I apologize.

      I think what is being argued is that the potential risk of harmful content from LemmyNSFW is not worth continuing to federate with them. I am arguing this is equivalent to society wanting to ostracize a group, like trans people, because the potential risk of harmful content is not worth the risk of continued association. Specifically, what I am saying is equivalent is the reasoning, which argues that action against a group should be determined by the potential risk posed by that group and not by evidence of wrongdoing by that group.