• forgotmylastusername@lemmy.ml · (+46/−7) · 5 months ago

    One of the biggest problems with the internet today is that bad actors know how to manipulate or dodge content moderation to avoid punitive consequences. The big social platforms are moderated by the most naive people in the world. It’s either that or willful negligence. Has to be. There’s just no way these tech bros who spent their lives deep in internet culture are this clueless about content moderation.

    • blazeknave@lemmy.world · (+33/−4) · 5 months ago

      I know them. I worked in this industry. They’re not naive. What basis do you have for these comments?

      I think you’re conflating them with the business executives running said social and gaming companies. Stop calling them techbros. Meta is not a tech startup. It’s a transnational corporation with capitalist execs running it.

    • Fudoshin ️🏳️‍🌈@feddit.uk · (+16) · 5 months ago

      bad actors know how to manipulate or dodge the content moderation to avoid punitive consequences.

      People have been doing that since the dawn of the internet. People on my old forum in the 90s tried to circumvent profanity filters on phpBB.

      Even now you can get round Lemmy.World filters against “removed-got” by adding a hyphen in it.

      Nothing new under the sun.
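
    The hyphen trick described above works because a naive filter only matches banned strings verbatim. A minimal sketch of the gap, and of closing it by stripping punctuation before matching (hypothetical word list and filter, not phpBB’s or Lemmy.World’s actual code):

    ```python
    import re

    BANNED = {"badword"}  # hypothetical banned term

    def naive_filter(text: str) -> bool:
        """Flag text only if a banned term appears verbatim."""
        return any(w in BANNED for w in text.lower().split())

    def normalized_filter(text: str) -> bool:
        """Strip punctuation before matching, so 'bad-word' still hits."""
        cleaned = re.sub(r"[^a-z0-9\s]", "", text.lower())
        return any(w in BANNED for w in cleaned.split())

    print(naive_filter("you badword"))        # True: exact match caught
    print(naive_filter("you bad-word"))       # False: the hyphen dodges the filter
    print(normalized_filter("you bad-word"))  # True: normalization closes the gap
    ```

    Of course, as the comment notes, determined posters just move on to the next variant, so this is an arms race rather than a fix.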

    • Jknaraa@lemmy.ml · (+8) · 5 months ago

      The thing is that words can have a very broad range of meanings depending on who uses them and how (among many other factors), and you can’t accurately encode all of that in a form computers understand. Even ignoring bad actors, it makes certain things very difficult, like searching for something that just happens to share words with something completely different and very popular.

    • d-RLY?@lemmy.ml · (+4) · 5 months ago

      Auto-moderation is lazy and only going to get worse. Not saying there isn’t some value in hard-banning certain things (like very specific spam that just keeps responding to everything with the same thing non-stop). But these mega outlets/sites want to use full automation to ban shit without any human interaction, at least unless you or another corp has connections on the inside to get a person to fix it. Just like how they make it so fucking hard to ever reach a person when calling (or even trying to find) a support line.

      This automated shit blacklists more and more and can completely fuck over people who use those sites for income (they can’t even reach a person when their income is cut off for false reasons, and they don’t get back-pay for the period of a strike/ban). The bad guys will always just move to a new word or phrase as the old ones get banned. So we as users are actually losing words and phrases while the actual bad shit just moves on to the next one without issue.

  • Lath@kbin.social · (+28/−1) · 5 months ago

    That’s what you get for all the teabagging you’ve been doing…

  • phx@lemmy.ca · (+27/−1) · 5 months ago

    It’s dumb, but it’s also possible that a combination of those terms had been adopted by some group distributing CSAM.

    At one point, “cheese pizza” was a term they apparently used on YouTube videos etc due to it having the same abbreviation as CP (Child Pornography).

    Sick fucks ruining everything for everyone

    • d-RLY?@lemmy.ml · (+15) · 5 months ago

      I agree with you is the TL;DR, and the rest is just my mad ranting opinions about companies being allowed to just auto-censor us. So feel free to completely ignore the rest. lol.

      It is like banning words and phrases just because bad people use them has become the norm. I really can’t stand the way channels on YT constantly have to self-censor basically everything (even if the video is just reporting on or trying to explain bad shit that has happened). And it never seems to actually stop the real issues. The bad people just move on to a new word or phrase, which then gets banned in turn. It isn’t about stopping fucked-up shit from happening; it’s about making sure advertisers and other sources of money don’t throw a fit.

      We always hear about how places like China are bad in part for censoring words and speech, while in the US and other Western nations we pretend we can speak freely and uncensored. We have always had censorship of speech; it’s just that the real rulers of the country get to do it instead, which keeps the government’s hands legally clean of enforcing it on us. Shit like CP is fucked and should be handled for what it is, but allowing for-profit companies, and especially their algorithms/AI, to decide what we can and can’t say or search for, without any human interaction, very much leads to false bans, and that is also fucked.

      It is waaaay too easy for all the mega corps to take down channels and block creators from the revenue of their own work, completely automated. But the accused channel can’t ever get a real person to explain what and who is attacking them, or to hear why their strikes/bans aren’t valid. I have heard that even channels with written/legal permission from a big studio to use a clip of music or video (music being the worst) will STILL catch automated copyright strikes.

      We don’t need actual government censors, because the mega corps with all the money are allowed to do it for them. We have rights, but they don’t really matter if a private company, or an org made up of people from various mega corps, is allowed to do the censoring on the government’s behalf.

    • Schadrach@lemmy.sdf.org · (+2) · 5 months ago

      At one point, “cheese pizza” was a term they apparently used on YouTube videos etc due to it having the same abbreviation as CP (Child Pornography).

      This in turn was why the Podesta emails led to the whole Pizzagate thing - there were a bunch of emails with weird phrasings like “going to do cheese pizza for a couple of hours” that just aren’t how people talk or write, so internet weirdos decided it was pedo code, and then it kinda went insane from there.

  • IzzyScissor@lemmy.world · (+26) · 5 months ago

    Remember, searching for “halo” is banned because it could potentially be linked to pedophilia, but editing a video of the president to make him look like a pedophile is fine because “it wasn’t done with AI.”

  • baatliwala@lemmy.world · (+27/−3) · 5 months ago

    Barely 2 years ago I noticed people were posting porn on Insta, publicly visible, just because they tagged #cum as #cüm. I don’t think that’s possible now, but basically corporations are dumb and people posting disallowed content can be creative as hell.
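
    The #cüm trick is the same evasion at the Unicode level: “ü” is a different codepoint from “u”, so a byte-exact tag blacklist misses the variant. A hedged sketch (hypothetical tag list, not Instagram’s actual pipeline) of folding diacritics with NFKD normalization before the check:

    ```python
    import unicodedata

    BLOCKED_TAGS = {"cum"}  # hypothetical blocked hashtag

    def fold(tag: str) -> str:
        """Decompose accented characters (NFKD) and drop combining marks,
        so 'cüm' becomes 'cum' before the blacklist check."""
        decomposed = unicodedata.normalize("NFKD", tag.lower())
        return "".join(c for c in decomposed if not unicodedata.combining(c))

    print("cüm" in BLOCKED_TAGS)        # False: exact match misses the variant
    print(fold("cüm") in BLOCKED_TAGS)  # True: folding catches it
    ```

    Diacritic folding only covers one class of look-alikes; homoglyphs from other scripts (e.g. Cyrillic letters) need confusable mapping on top of this.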

  • rawrthundercats@lemmy.ml · (+26/−3) · 5 months ago

    How do we know they didn’t type something more explicit to get the result and just change what’s in the search bar? Has anyone verified this?

    • 7heo@lemmy.ml (OP) · (+35) · 5 months ago

      I actually don’t know; I’m not sure it’s possible (I’ve never used Instagram, and the search might auto-submit for all I know), but intentionally flagging yourself as a potential child abuser, for clout, is a bit extreme…

  • nicetriangle@kbin.social · (+23) · 5 months ago (edited)

    I had a post of mine flagged for multiple days on there because it had an illustration of a woman in a full-length wool coat that completely covered her and was in no way sexual. Shit is so stupid.

      • Astro@sh.itjust.works · (+8/−1) · 5 months ago

        It’s the “Kids Online Safety Act”. Basically it’s using the old “think of the children!” move, but in reality conservatives are trying to push anything queer back into the dark.

        • LucidBoi@lemmy.world · (+4) · 5 months ago

          Think of the children! Let us scan all of your images, files and messages! For the sake of children of course! Nothing suspicious here…

      • makeasnek@lemmy.ml · (+4/−3) · 5 months ago

        Do you have to pay Comcast extra to stream Netflix? No? Then net neutrality still exists.

        • brygphilomena@lemmy.world · (+6/−2) · 5 months ago

          This is a bad take. The public wasn’t the one that would need to pay Comcast, they would have charged Netflix.

          • makeasnek@lemmy.ml · (+4/−1) · 5 months ago

            A total breakdown in net neutrality could have very well had that result.

  • ComradeChairmanKGB · (+2) · 5 months ago

    I’m fine with abducting children for a Super-Soldier program. But I draw the line at having photos of them on Instagram. Honestly, a deserved warning. Be better 👏

    • unalivejoy@lemm.ee · (+4) · 5 months ago

      This should be obvious. Photos of classified military assets shouldn’t be posted online.