During a two-day test, researchers at the Stanford Internet Observatory found over 600 pieces of known or suspected child abuse material across some of Mastodon’s most popular servers, according to a report shared exclusively with The Technology 202.
Researchers reported finding their first piece of child exploitation content within about five minutes. They went on to uncover roughly 2,000 uses of hashtags associated with such material. David Thiel, one of the report’s authors, called the volume unprecedented.
“We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said Thiel, referring to PhotoDNA, a Microsoft-developed technique that identifies known abuse images by matching their unique digital signatures (hashes) against a database of previously verified material. Mastodon did not return a request for comment.
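PhotoDNA’s algorithm itself is proprietary and licensed only to vetted organizations, but the general approach, which computes a robust fingerprint of an image and compares it against fingerprints of known material, can be sketched with an open-source perceptual hash. The snippet below is illustrative only: it uses the `imagehash` library as a stand-in for PhotoDNA, and the file paths and distance threshold are hypothetical.

```python
# Illustrative sketch of hash-based matching against known images, the
# general technique behind PhotoDNA. PhotoDNA's actual algorithm is
# proprietary; this stand-in uses the open-source `imagehash` library
# (pip install imagehash pillow). Paths and threshold are hypothetical.
from PIL import Image
import imagehash

# Fingerprints of known, previously verified images (hypothetical paths).
known_hashes = [
    imagehash.phash(Image.open(path))
    for path in ("known/image_a.png", "known/image_b.png")
]

def matches_known_content(path, max_distance=5):
    """Return True if the image's perceptual hash is within max_distance
    bits (Hamming distance) of any known fingerprint, catching re-posts
    that survive resizing or recompression."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in known_hashes)

if matches_known_content("incoming/upload.jpg"):
    print("hit: matches a known image fingerprint")
```

In practice, platforms do not hold such imagery themselves; services like PhotoDNA match uploads against hash databases maintained by clearinghouses such as the National Center for Missing and Exploited Children.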