Sammeeeeeee@lemmy.world to Technology@lemmy.world · English · 11 months ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
49 comments · cross-posted to: technology@lemmy.world, technology@lemmy.ml
whenigrowup356@lemmy.world · 11 months ago
Shouldn't it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?
ozymandias117@lemmy.world · 11 months ago
Those databases are highly regulated, since the material they describe is itself CSAM.
Apple tried using fuzzy hashes so the database could be downloaded to devices, and the approach wasn't able to reliably identify matches at all.
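For anyone curious what "fuzzy hashes" means here: schemes like Apple's NeuralHash or Microsoft's PhotoDNA compute a perceptual hash that changes only slightly when an image is resized or re-encoded, then compare against a database of known hashes by bit distance. This is just a toy average-hash sketch to show the idea, not any real CSAM-matching system; the pixel data, threshold, and function names are all made up for illustration:

```python
# Toy perceptual ("fuzzy") hashing sketch. Real systems (PhotoDNA, NeuralHash)
# use far more robust features; this only illustrates the matching principle.

def average_hash(pixels):
    """64-bit hash: each bit is 1 if that pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def is_match(h1, h2, threshold=5):
    """Near-duplicate if the hashes differ in at most `threshold` bits."""
    return hamming_distance(h1, h2) <= threshold

# A "known" image and a slightly altered copy (one pixel brightened a bit):
known   = [10, 200, 30, 220] * 16                     # 64 fake grayscale pixels
altered = [12, 200, 30, 220] + [10, 200, 30, 220] * 15
# A completely different image (brightness pattern inverted):
other   = [220, 30, 200, 10] * 16

h_known = average_hash(known)
print(is_match(h_known, average_hash(altered)))  # small edit still matches
print(is_match(h_known, average_hash(other)))    # different image does not
```

The weakness the comment points at is exactly this distance threshold: set it loose enough to survive re-encoding and you get false positives; set it tight and trivial edits evade detection, which is why on-device matching proved unreliable.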