AI trained on real child sex abuse images to detect new CSAM

Online platform safety advances with new AI technology aimed at detecting child exploitation content at the point of upload, before it spreads across the internet.
Revolutionary development: Thorn and AI company Hive have created a first-of-its-kind artificial intelligence model designed to detect previously unknown child sexual abuse materials (CSAM) at the point of upload.
- The model expands Thorn’s existing Safer detection tool, which matches uploads against hashes of known CSAM, with a new “Predict” feature that uses machine-learning classifiers to flag previously unseen material
- Training data includes real CSAM reported to the National Center for Missing & Exploited Children’s CyberTipline
- The system generates risk scores to help human content moderators make faster decisions about flagged content (a triage sketch follows this list)
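To make the risk-score workflow concrete, here is a minimal sketch of how score-based triage might route uploads. Thorn has not published the Predict interface, so every name here (`triage_upload`, `classifier.predict`, the thresholds) is a hypothetical stand-in, not the real API.

```python
from dataclasses import dataclass

# Hypothetical thresholds: scores at or above BLOCK_THRESHOLD are held
# automatically; scores at or above REVIEW_THRESHOLD go to a human.
REVIEW_THRESHOLD = 0.50
BLOCK_THRESHOLD = 0.95


@dataclass
class ModerationDecision:
    action: str        # "allow", "human_review", or "hold"
    risk_score: float  # classifier output in [0, 1]


def triage_upload(image_bytes: bytes, classifier) -> ModerationDecision:
    """Route an upload based on a classifier's risk score.

    `classifier` is assumed to expose predict(bytes) -> float,
    returning the estimated probability the image is CSAM.
    """
    score = classifier.predict(image_bytes)
    if score >= BLOCK_THRESHOLD:
        return ModerationDecision("hold", score)          # quarantine now
    if score >= REVIEW_THRESHOLD:
        return ModerationDecision("human_review", score)  # moderator decides
    return ModerationDecision("allow", score)
```

The two thresholds reflect that the model assists rather than replaces moderators: only the highest-confidence scores trigger automatic action, while the middle band is routed to a person.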
Technical implementation: The model is a machine-learning classifier that scores uploaded content for the likelihood it contains CSAM, and it can be deployed across a wide range of online platforms.
- The technology can be integrated into social media, e-commerce platforms, and dating applications (an integration sketch follows this list)
- Since 2019, the Safer tool has identified more than 6 million potential CSAM files
- The system is designed to become more accurate as it is exposed to more content across the internet
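As a rough illustration of what such an integration could look like, the sketch below combines the kind of known-hash matching Safer has used to date with the new classifier for unseen material, reusing the hypothetical `triage_upload` helper from the earlier example. The hash set, function names, and return strings are all assumptions; real deployments use perceptual hashing and Thorn’s own tooling rather than this simplified flow.

```python
import hashlib

# Hypothetical stand-in for a known-CSAM hash database. Real systems
# use perceptual hashes that survive re-encoding, not plain SHA-256.
KNOWN_CSAM_HASHES: set[str] = set()


def handle_upload(image_bytes: bytes, classifier) -> str:
    """Two-stage check a platform might run at upload time.

    Stage 1: match against known files (how Safer has worked to date).
    Stage 2: score previously unseen material with the new classifier.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_CSAM_HASHES:
        return "blocked_known_match"    # report and block immediately

    decision = triage_upload(image_bytes, classifier)
    if decision.action == "hold":
        return "held_for_report"        # high risk: quarantine and escalate
    if decision.action == "human_review":
        return "queued_for_moderator"   # middle band: human review
    return "accepted"
```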
Future enhancements: Thorn plans to expand the system’s capabilities to provide more comprehensive protection against child exploitation.
- Development is underway on an AI text classifier to identify conversations that indicate potential child exploitation (a rough sketch follows this list)
- The model is not currently designed to detect AI-generated CSAM, though future updates may address this emerging threat
- The technology is part of a broader strategy combining detection with preventative measures
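Thorn’s text classifier is still in development and no interface has been published, so the following is purely illustrative. It assumes a hypothetical `text_model` with a `predict(str) -> float` method and shows one plausible design choice: scoring short windows of messages rather than single messages, since exploitation risk tends to emerge from context.

```python
def score_conversation(messages: list[str], text_model,
                       window: int = 5) -> float:
    """Return the highest risk score over sliding windows of messages.

    `text_model.predict(str) -> float` is assumed to return a risk
    score in [0, 1] for a chunk of conversation text.
    """
    if not messages:
        return 0.0
    scores = []
    for i in range(len(messages)):
        chunk = " ".join(messages[i:i + window])
        scores.append(text_model.predict(chunk))
    return max(scores)
```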
Looking ahead: As online platforms face increasing pressure to protect vulnerable users, this AI-powered approach represents a significant step forward in content moderation technology. Questions remain, however, about its effectiveness against evolving threats such as AI-generated content, and about the right balance between automation and human review in moderation decisions.