Dataset used to train NSFW detection models contained child abuse material

A place to talk about news articles relevant to MAPs.
Jim Burton
Posts: 1718
Joined: Fri Jun 28, 2024 10:33 pm


Post by Jim Burton »

https://cybernews.com/news/nudenet-data ... -material/
A huge dataset used to train AI tools that detect not-safe-for-work (NSFW) content contained child sexual abuse material (CSAM), until a child protection charity got its hands on it.

The NudeNet dataset contains more than 700,000 images scraped from various areas of the web, including social media, image hosting services, and pornography websites.

The charity found that roughly 0.1% of the dataset — on the order of 700 images — consisted of child sexual abuse material.
Committee Member: Mu. Editorial Lead: Yesmap

Adult-attracted gay man; writer. Attraction to minors is typical variation of human sexuality.