Sammeeeeeee@lemmy.world to Technology@lemmy.world · English · 2 years ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
Cross-posted to: technology@lemmy.world, fediverse@lemmy.ml, technology@beehaw.org
whenigrowup356@lemmy.world · 2 years ago
Shouldn’t it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?
ozymandias117@lemmy.world · 2 years ago
Those databases are highly regulated, since they are themselves CSAM.

Apple tried using fuzzy hashes so the matching could run on devices, and it wasn’t able to reliably identify things at all.
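For anyone curious why fuzzy matching is unreliable: here is a toy sketch of an "average hash," one of the simplest perceptual-hash schemes. This is purely illustrative — real systems like PhotoDNA or Apple's NeuralHash use far more sophisticated algorithms, and the actual hash databases are restricted. The threshold trade-off at the end is the reliability problem being discussed.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of lists, values 0-255):
    each bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of bits that differ between two 64-bit hashes."""
    return bin(h1 ^ h2).count("1")

def is_match(h1, h2, threshold=5):
    """Fuzzy match: hashes within `threshold` bits count as the
    'same' image. Too strict and edited copies slip through; too
    loose and innocent images get flagged."""
    return hamming_distance(h1, h2) <= threshold
```

A re-encoded or lightly edited copy of an image lands within a few bits of the original, while an unrelated image should land far away — but plenty of unrelated images can collide near the threshold, which is why on-device scanning drew so much criticism.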