A huge dataset used to train AI tools that detect not-safe-for-work (NSFW) content contained child sexual abuse material (CSAM), a problem that went unnoticed until a child protection charity examined it. The NudeNet dataset contains more than 700,000 images scraped from across the web, including social media, image hosting services, and pornography websites. The charity found that roughly 0.1% of the dataset, around 700 images, is child sexual abuse material.
- Jim Burton
- Posts: 1718
- Joined: Fri Jun 28, 2024 10:33 pm
Dataset used to train NSFW detection models contained child abuse material
https://cybernews.com/news/nudenet-data ... -material/
Committee Member: Mu. Editorial Lead: Yesmap
Adult-attracted gay man; writer. Attraction to minors is a typical variation of human sexuality.
