The National Center for Missing and Exploited Children (NCMEC) said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that content was reported by Amazon, which found the material in its training data, according to an investigation by Bloomberg. Amazon said only that the material came from external sources used to train its AI services and claimed it could not provide any further details about where the CSAM originated.
Amazon provided Engadget with the following statement explaining why it doesn't have the data needed to take any further action on what it found.
“When we set up this reporting channel in 2024, we informed NCMEC that we would not have sufficient information to create actionable reports, because of the third-party nature of the scanned data. The separate channel ensures that these reports would not dilute the efficacy of our other reporting channels. Because of how this data is sourced, we don't have the data that comprises an actionable report.”
Amazon discovered a 'high volume' of CSAM in its AI training data but isn't saying where it came from
- Jim Burton
https://www.engadget.com/ai/amazon-disc ... 49228.html
Committee Member: Mu. Editorial Lead: Yesmap
Adult-attracted gay man; writer. Attraction to minors is typical variation of human sexuality.
Re: Amazon discovered a 'high volume' of CSAM in its AI training data but isn't saying where it came from
Is it actually stated in any of the reports on this that the "CSAM" is actual photos, and not artwork or stories? It seems like the article wants you to think that it is, but "CSAM" is also used for fiction and artwork nowadays. Of course, what exactly the "abuse" in these is remains a mystery...
