https://www.mpg.de/25788438/chat-contro ... e-scanning
A proposal by the Council of the European Union on chat monitoring, aimed at preventing the distribution of Child Sexual Abuse Material (CSAM), is now entering trilogue negotiations between the Council, the European Commission, and the European Parliament. The draft drops the earlier plan for mandatory surveillance. Instead, messaging services such as WhatsApp and Signal would be allowed to voluntarily install software for automated chat monitoring, and the scope of such monitoring could even be expanded. Mandatory measures are to be reconsidered in the future. The proposal also seeks to make it easier for users to report chats suspected of involving CSAM and introduces mandatory age verification for users.
Carmela Troncoso, Director at the Max Planck Institute for Security and Privacy, is among the authors of a commentary welcoming the removal of the mandatory chat monitoring requirement. However, the signatories warn that several elements of the current proposal do little to help combat the spread of CSAM and could have unwanted side effects. In this interview, Carmela Troncoso discusses how the current draft differs from earlier ones, the implications of voluntary monitoring, the risks involved, and possible alternative approaches.
https://www.thehindu.com/news/national/ ... 7.ece/amp/
The National Commission for Protection of Child Rights (NCPCR), along with the School Education Department, held a State-level conference on key child rights issues, with a focus on education, health, the Juvenile Justice Act and the POCSO Act, here on Monday.
NCPCR Member Secretary Sanjeev Sharma, in his address, stressed the urgent need for addressing children’s mental health, and said schools must play a central role in tackling rising psychological challenges.
Child-rights violations are not just statistical points but narratives that affect individual lives and the future of the nation, he said. He reaffirmed NCPCR’s commitment to strengthening institutional capacity through workshops and training, and noted that in the last six months the Commission had disposed of over 26,000 cases, rescued approximately 2,800 children, and repatriated around 1,800 to child care institutions in their home districts.
https://www.euractiv.com/news/council-a ... led-talks/
The Council has reached a position on the Child Sexual Abuse Regulation after years of difficult negotiations, meaning that talks with Parliament can finally start.
The file, which includes measures aimed at combating the spread of online child sexual abuse material (CSAM), stalled after privacy and security experts raised concerns.
The original Commission proposal would have given law enforcement authorities a right to ask tech companies to scan their services for CSAM and grooming activity under mandatory detection orders.
https://eu.usatoday.com/story/tech/2025 ... 425612007/
A former employee of Meta, the owner of social media platforms Facebook, Instagram, WhatsApp, and Threads, testified as part of a major lawsuit that the tech company had a policy allowing 17 strikes before it suspended accounts engaged in the “trafficking of humans for sex.”
Vaishnavi Jayakumar, former head of safety and well-being for Instagram, also testified that in March 2020, Meta did not have a specific way for people to report child sexual abuse material (CSAM) on Instagram, according to federal court documents filed Friday, Nov. 21, in the Northern District of California.
"It was very surprising to me,” Jayakumar said, adding that she tried to raise this issue “multiple times,” but was told it would require too much work to build.
Jayakumar's concerns deepened when she learned of what she called the "17x" policy at Meta.
FOX43, Pa. Senate passes AI bill targeting child sexual abuse material
https://www.fox43.com/article/news/loca ... b4236c8f1f