corb3t@lemmy.world to Technology@lemmy.ml · edited 2 years ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
cross-posted to: technology@lemmy.world, fediverse@lemmy.ml, technology@beehaw.org, technews@radiation.party
balls_expert@lemmy.blahaj.zone · 2 years ago
There is a database of known CSAM files and their hashes; Mastodon could implement a filter against it at posting time and when federating content.
Shadow banning those users would be nice too.
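For illustration, a minimal sketch of the upload-time hash check this comment describes, assuming a hypothetical `KNOWN_CSAM_HASHES` set loaded from such a database. Real deployments (e.g. PhotoDNA-style systems) match perceptual hashes so that re-encoded copies are still caught; plain SHA-256 stands in here only to keep the example self-contained:

```python
import hashlib

# Hypothetical hash list, loaded from a vendor feed of known-bad material.
# Production systems use perceptual hashes, not exact cryptographic digests.
KNOWN_CSAM_HASHES: set[str] = set()

def is_known_csam(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_CSAM_HASHES

def handle_upload(file_bytes: bytes) -> bool:
    """Reject the post on a hash match; otherwise allow it through."""
    if is_known_csam(file_bytes):
        # A real server would also flag or shadow-ban the account here,
        # as the comment suggests, rather than just dropping the file.
        return False
    return True  # safe to store and federate
```

The same check would run on media received via federation, not just local uploads, since remote instances may not filter at all.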
diffuselight@lemmy.world · 2 years ago
They are talking about AI-generated images. That's the volume part, and hash lists only cover previously known files.