#cybertipline — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #cybertipline, aggregated by home.social.
-
Six Months Of ‘AI CSAM Crisis’ Headlines Were Based On Misleading Data
-
Stanford researchers find Mastodon has a massive child abuse material problem
Mastodon is rife with child sexual abuse material (#CSAM), according to a new study from Stanford’s Internet Observatory.
In just two days, researchers found 112 instances of known CSAM across 325,000 posts on the platform — with the first instance showing up after just five minutes of searching.
Decentralized networks don’t use the same approach to #moderation as mainstream sites. Instead, each decentralized instance is given control over moderation, which can create #inconsistency across the Fediverse.
That’s why the researchers suggest that networks like Mastodon employ more robust #tools for moderators, along with #PhotoDNA integration and #CyberTipline reporting.
https://www.theverge.com/2023/7/24/23806093/mastodon-csam-study-decentralized-network
A significant portion of the child abuse material researchers uncovered was from networks in #Japan, where there are “significantly more lax laws” that “exclude computer-generated content as well as manga and anime,” according to the report.
“We found that on one of the largest Mastodon instances in the Fediverse (based in Japan), 11 of the top 20 most commonly used #hashtags were related to #pedophilia,” the researchers wrote. For nuance, read the report itself: https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
Thanks @JDGooiker