
Artificial intelligence will increase images of child pornography

Posted: Mon Jul 22, 2024 3:10 pm
by Pegasus
The use of artificial intelligence (AI) to create child pornography online will grow exponentially and fake images will make it increasingly difficult to identify real victims, the European police agency warned on Monday (July 22).
Criminals are already using AI tools to commit crimes ranging from online fraud and cyber attacks to the creation of explicit images of minors, Europol warned.
In a 37-page report, the agency claimed that there are already reports of "AI-generated and AI-assisted child sexual abuse material".
"The use of AI, which allows abusers to generate and alter child pornography material, is set to proliferate further in the near future," warned the Hague-based agency.
AI-generated pornographic images "increase the amount of illicit material in circulation and complicate the identification of victims and perpetrators".
More than 30 million children were victims of sexual exploitation and abuse on the internet last year, according to research published in May by the University of Edinburgh in the UK.

The crimes range from so-called sextortion, in which criminals demand money from victims to keep images private, to the misuse of AI to create deepfakes, videos that are not real, according to the Childlight Global Safety Institute program.
The rise of AI has led to growing fears around the world about the possibility of its malicious use.
"The sheer volume of self-generated sexual material constitutes
"The volume of self-generated sexual material now constitutes a significant and growing part of the child sexual abuse material available online," Europol said.
"Even if the content is entirely artificial and does not represent a real victim, AI-generated child pornography material continues to contribute to the objectification and sexualization of children," it warned.

Re: Artificial intelligence will increase images of child pornography

Posted: Mon Jul 22, 2024 3:19 pm
by BLueRibbon
Related:

https://www.map-union.org/blog/press-re ... -deception

These bastards really show their true colors when they hunt people for AI.

Re: Artificial intelligence will increase images of child pornography

Posted: Tue Jul 23, 2024 6:11 am
by Fragment
Pegasus wrote: Mon Jul 22, 2024 3:10 pm "Even if the content is entirely artificial and does not represent a real victim, AI-generated child pornography material continues to contribute to the objectification and sexualization of children," it warned.
Which if they are not real fucking children should not be a fucking crime.

Re: Artificial intelligence will increase images of child pornography

Posted: Tue Jul 23, 2024 11:43 am
by Pegasus
Fragment wrote: Tue Jul 23, 2024 6:11 am
Pegasus wrote: Mon Jul 22, 2024 3:10 pm "Even if the content is entirely artificial and does not represent a real victim, AI-generated child pornography material continues to contribute to the objectification and sexualization of children," it warned.
Which if they are not real fucking children should not be a fucking crime.

I completely agree.

Re: Artificial intelligence will increase images of child pornography

Posted: Sat Jul 27, 2024 12:42 am
by ephebo2026
Pegasus wrote: Mon Jul 22, 2024 3:10 pm The use of artificial intelligence (AI) to create child pornography online will grow exponentially and fake images will make it increasingly difficult to identify real victims, the European police agency warned on Monday (July 22).
Criminals are already using AI tools to commit crimes ranging from online fraud and cyber attacks to the creation of explicit images of minors, Europol warned.
In a 37-page report, the agency claimed that there are already reports of "AI-generated and AI-assisted child sexual abuse material".
"The use of AI, which allows abusers to generate and alter child pornography material, is set to proliferate further in the near future," warned the Hague-based agency.
AI-generated pornographic images "increase the amount of illicit material in circulation and complicate the identification of victims and perpetrators".
More than 30 million children were victims of sexual exploitation and abuse on the internet last year, according to research published in May by the University of Edinburgh in the UK.

The crimes range from so-called sextortion, in which criminals demand money from victims to keep images private, to the misuse of AI to create deepfakes, videos that are not real, according to the Childlight Global Safety Institute program.
The rise of AI has led to growing fears around the world about the possibility of its malicious use.
"The sheer volume of self-generated sexual material constitutes
"The volume of self-generated sexual material now constitutes a significant and growing part of the child sexual abuse material available online," Europol said.
"Even if the content is entirely artificial and does not represent a real victim, AI-generated child pornography material continues to contribute to the objectification and sexualization of children," it warned.
ADOLESCENTS ARE NOT CHILDREN.. THIS IS A SCIENTIFIC FACT YOU MORONS.

Re: Artificial intelligence will increase images of child pornography

Posted: Sat Jul 27, 2024 10:53 am
by Artaxerxes II
The genie is out of the bottle, so AI can't be clamped down on. Any actual ban would be too ineffective to enforce. Which is why, every time you hear about the government banning "AI-generated CSAM", keep in mind that it is merely a pretext for a wider censorship campaign that will include more than just AI-generated youth erotica, but political content that challenges the government as well.

All the fear mongering over CP is merely a cover-up, because most people won't investigate further if they think the laws will only be applied for the "baddies". How foolish they are in thinking that such laws won't affect them adversely, and when they realise their mistake, it'll be too late.

Re: Artificial intelligence will increase images of child pornography

Posted: Sat Jul 27, 2024 11:44 am
by Jim Burton
I have not seen us address their concern yet:

How is it possible to prove that an image was made by AI and involved no real minors?

This cuts to the heart of their objection to AI generated images. The idea that real youth erotica/PIM will be indistinguishable from this vast body of work, and thus almost impossible to prosecute. The idea "verification" watermarks could be applied to real PIM in a deceptive manner.

All I can think of right now is that programs are made with a predetermined electronic signature to each successive production of each successive download, and the sig is overlaid in some way within the output file. These could be verified by cross-referencing to a database that is encrypted somehow as to make copying that data impossible for forgers.

Without an answer to this, there are very valid concerns about generative AI, at least from a law-enforcement perspective.

Re: Artificial intelligence will increase images of child pornography

Posted: Mon Jul 29, 2024 10:07 am
by BLueRibbon
Jim Burton wrote: Sat Jul 27, 2024 11:44 am I have not seen us address their concern yet:

How is it possible to prove that an image was made by AI and involved no real minors?

This cuts to the heart of their objection to AI generated images. The idea that real youth erotica/PIM will be indistinguishable from this vast body of work, and thus almost impossible to prosecute. The idea "verification" watermarks could be applied to real PIM in a deceptive manner.

All I can think of right now is that programs are made with a predetermined electronic signature to each successive production of each successive download, and the sig is overlaid in some way within the output file. These could be verified by cross-referencing to a database that is encrypted somehow as to make copying that data impossible for forgers.

Without an answer to this, there are very valid concerns about generative AI, at least from a law-enforcement perspective.
Discussed in another thread - https://forum.map-union.org/viewtopic.php?p=557#p557

It would not be hard to watermark images produced entirely from text prompts. It would be possible to then exclude such images from definitions of AI PIM.

Re: Artificial intelligence will increase images of child pornography

Posted: Sat Aug 03, 2024 3:31 am
by Rin
Pegasus wrote: Mon Jul 22, 2024 3:10 pm "Even if the content is entirely artificial and does not represent a real victim, AI-generated child pornography material continues to contribute to the objectification and sexualization of children,"
An argument they also use against lolicon and shotacon, which is quite a bit weaker politically than the victimhood argument, because "objectification and sexualization" is something that has been present in all entertainment media.

And it is actually impossible to prevent users from generating CP using AI, any user with a beefy enough PC can run models locally and disable the censorship filters, for this very reason it is extremely difficult to prosecute them in the places where it is illegal too.They will lose in the long run, the punitive legal system that has been in place for centuries simply doesn't really work in the digital age.

Re: Artificial intelligence will increase images of child pornography

Posted: Sat Aug 03, 2024 10:07 am
by PorcelainLark
Pegasus wrote: Mon Jul 22, 2024 3:10 pm The crimes range from so-called sextortion, in which criminals demand money from victims to keep images private, to the misuse of AI to create deepfakes, videos that are not real, according to the Childlight Global Safety Institute program.
I feel like society needs to become a lot less judgemental of people sexually in order for this to stop being a problem. Like, there's often an expression of aggression towards people who appear in a sexual context (e.g. calling people "sluts" under porn videos), to the point where people have been completely overwhelmed by the reaction to their nudes being leaked in the past. In the West, I feel we're already on that path; for example, the attempt to blackmail Jeff Bezos over his dick pics failed. However, there's still a deep sexual conservatism in non-Western countries.
For example, I can't imagine the Islamic world reacting to this in a level-headed way. Say an older Muslim were made aware of a deepfake of their granddaughter; I can't imagine they would react sympathetically.
Sooner or later we need to address the danger of sexual shame, not just as MAPs but as a society.