Re: A History of Gendered Censorship and the Costs of Faith-Based ‘Porn’ Panics
Posted: Mon Dec 29, 2025 5:35 pm
Yes, I think I posted it in a mega thread (now merged).
Public and Private Support Forums for MAPs and Allies
http://forum.map-union.org/
Full “feature-length” AI films of child sexual abuse may now be “inevitable” unless urgent action is taken, experts warn, as rapidly improving technology means AI video is now “indistinguishable” from genuine imagery.
New data, published today (Friday, July 11) by the Internet Watch Foundation (IWF), shows confirmed reports of AI-generated child sexual abuse imagery have risen 400%, with AI child sexual abuse discovered on 210 webpages in the first six months of 2025 (January 1 – June 30).
In the same period in 2024, IWF analysts found AI child sexual abuse imagery on 42 webpages. Each page can contain multiple images or videos.
[...]
Minister for Safeguarding and Violence Against Women and Girls, Jess Phillips said:
“These statistics are utterly horrific. Those who commit these crimes are just as disgusting as those who pose a threat to children in real life.
“AI-generated child sexual abuse material is a serious crime, which is why we have introduced two new laws to crack down on this vile material.
[...]
Rani Govender, Policy Manager for Child Safety Online at the NSPCC, said:
“It is deeply worrying to see how rapid advances in AI are being exploited to create increasingly realistic and extreme child sexual abuse material, which is then being spread online. These new figures make it clear that this vile activity will only get worse without the right protections in place.
“Young people are reaching out to Childline in distress after seeing AI-generated sexual abuse content created in their likeness. The emotional impact on them can be devastating and long lasting, leaving them embarrassed, anxious and deeply shaken.
[...]
Frances Frost, Director of Communications and Advocacy at the Lucy Faithfull Foundation said:
“Through our anonymous Stop It Now helpline, we speak to thousands of people every year seeking our support to change their online sexual behaviour towards children. So far this year we're seeing twice as many people contacting us with concerns about their own use of AI images as did last year. Crucially, these people are not viewing these AI images in isolation - 91% of the people who contact us to say they are viewing AI images say that they have also viewed sexual images of children that weren't created with AI.
“Illegal AI imagery causes real harm to real children however it is created. It generates demand for child sexual abuse images and normalises sexual violence towards children. Children who have previously been victims of sexual abuse are revictimised. AI images also make it harder for authorities to identify real cases of children who are being abused.
This is the only argument I could see myself giving even a minimal degree of validity to; as for the rest… does no woman who has been raped feel offended hearing one of her ministers say that her rapist is as disgusting as someone who plays around with AI prompts?

Jim Burton wrote: Mon Jan 05, 2026 10:56 pm https://www.iwf.org.uk/news-media/news/ ... in-a-year/
AI images also make it harder for authorities to identify real cases of children who are being abused.
Nearly half of pornography users have accessed adult sites without age verification checks since new laws came into force, new research shows.
About 45 per cent of 1,469 adults who admitted viewing porn avoided submitting their personal information.
A similar number watched content that made them uncomfortable, says the Lucy Faithfull Foundation online safety charity.
Around a third have used a virtual private network to avoid age checks on websites that do require them since the government-mandated rules were introduced in July.
A VPN can be used to mask a user’s location, allowing them to connect to the internet as though they were in a different country.
Now the charity is sounding the alarm, warning that adults who don’t want to share their identity are turning to riskier sites where they are more likely to see child abuse images or other unsafe or illegal material.