@[email protected] to [email protected] • 2 years ago

Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)

cross-posted to: [email protected], [email protected], [email protected]
@[email protected] • 2 years ago

There is a database of known CSAM files and their hashes; Mastodon could implement a filter against it at the posting step and when federating content.

Shadow banning those users would be nice too.
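A minimal sketch of what that kind of check could look like at upload time, assuming a plain SHA-256 blocklist (the `KNOWN_HASHES` set and `handle_upload` helper are hypothetical, not part of Mastodon); real matching services such as PhotoDNA use perceptual hashes so that re-encoded or resized copies still match:

```python
import hashlib

# Hypothetical blocklist of known-bad file hashes (SHA-256 hex digests).
# In practice this would be loaded from an industry-maintained database,
# not hard-coded in the application.
KNOWN_HASHES: set[str] = set()


def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the uploaded file's hash appears in the blocklist."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES


def handle_upload(file_bytes: bytes) -> bool:
    """Run the check before accepting a post or federating its media.

    Returns True if the upload may proceed, False if it should be blocked
    (and the account flagged for review / shadow banned, as suggested above).
    """
    if is_known_match(file_bytes):
        return False
    return True
```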
@[email protected] • 2 years ago

They are talking about AI-generated images. That's the volume part.