Why does the internet suck at moderating anything except pedophilia?
Posted: Sat Aug 23, 2025 9:35 am
by Liyowo
We all know it: most of the time there are seemingly no consequences for bad behavior on the internet. Recently, a streamer died after being bullied and tormented live for months, in front of hundreds of thousands of viewers, despite people reporting it.
That is just one big recent example, but I think most of us are familiar with how lawless the internet seems to be, how utterly incompetent it is at making bad actions and crimes have consequences. Even neo-nazis have been thriving on social media, which appears powerless against them.
But if social media platforms are so powerless and incompetent at moderation, how come they're so effective at suppressing anything MAP related? They fail to moderate anything except liking children, and then all of a sudden they're remarkably effective.
What that tells me is that these platforms can moderate, even suppress things, but choose not to for anything except pedophilia. They are okay with profiting from abuse, bullying, and all kinds of extremism, but somehow liking children is the only place where they draw a line.
Re: Why does the internet suck at moderating anything except pedophilia?
Posted: Sat Aug 23, 2025 2:10 pm
by Officerkrupke
I literally don't get it either, OP. It really stems from the opinion that child sexual abuse is worse than physical or verbal abuse.
Re: Why does the internet suck at moderating anything except pedophilia?
Posted: Sat Aug 23, 2025 7:54 pm
by FairBlueLove
My unpopular opinion? In large part, just because it's fashionable to do so these days. Herd behavior.
Re: Why does the internet suck at moderating anything except pedophilia?
Posted: Sat Aug 23, 2025 8:30 pm
by Not Forever
I think this mainly comes from journalism and from people who fuel outrage on social media. Take this example: no one really gets outraged if they stumble across a file-sharing site with pirated content. The fight against piracy only exists because of economic interests, and it only goes as far as the money invested in stopping it.
But when it comes to pedophilia? Just look at how much Roblox is paying today for the fallout of poor public communication on the issue.
Sexophobia is the one constant, which is why anything related to sex is under the tightest scrutiny: criticism of sexualization, the fight against revenge porn, teenagers taking nude pictures of themselves, and so on. And at the very top of it all stands pedophilia—the ultimate evil. The boogeyman of both the right and the left, the one villain that unites every political side.
Re: Why does the internet suck at moderating anything except pedophilia?
Posted: Sun Aug 24, 2025 5:34 am
by PorcelainLark
From ChatGPT:
In the U.S., platforms are required by law to report CSAM to the National Center for Missing & Exploited Children (NCMEC). These reports often trigger immediate law enforcement intervention—unlike hate speech or misinformation, where reporting is largely voluntary or complaint-driven.
18 U.S. Code § 2258A - Reporting requirements of providers
https://www.law.cornell.edu/uscode/text/18/2258A
Child sexual exploitation has prompted legal reforms that override usual content protections. In the U.S., FOSTA-SESTA (2018) amended Section 230 to strip away immunity for platforms facilitating sex trafficking. This creates strong incentives to proactively block CSAM.
Section 230, Sex trafficking – Backpage.com and FOSTA-SESTA (2012–17)
https://en.wikipedia.org/wiki/Section_2 ... 2%80%9317)
In both the EU and UK, authorities are advancing swift detection mandates. The EU’s proposed “Child Sexual Abuse Regulation” would require digital platforms to scan uploads for CSAM proactively. Likewise, the UK's Online Safety Act empowers regulators to levy steep fines on non-compliant platforms.
How the EU is fighting child sexual abuse online
https://www.europarl.europa.eu/topics/e ... use-online
Enforcing the Online Safety Act: Platforms must start tackling illegal material from today
https://www.ofcom.org.uk/online-safety/ ... from-today
Unlike some forms of disinformation or ideological speech, CSAM detection benefits from mature tech tools—especially hash-based recognition systems that flag known illegal material immediately upon upload. This enables robust, near-instant enforcement.
An Update on Voluntary Detection of CSAM
https://technologycoalition.org/resourc ... n-of-csam/
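To make the hash-matching point above concrete, here's a minimal sketch of the idea, not any platform's actual implementation: real systems like PhotoDNA use proprietary perceptual hashes, and the hash list and file names below are purely hypothetical placeholders. The flow is simply "hash the upload, look it up in a known-bad list, block and report on a match."

[code]
import hashlib

# Hypothetical set of hashes of known flagged files, e.g. synced from an
# industry hash-sharing database. The value below is a placeholder.
KNOWN_FLAGGED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_upload(path: str) -> bool:
    """Return True if the uploaded file matches a known flagged hash."""
    return sha256_of_file(path) in KNOWN_FLAGGED_HASHES

if __name__ == "__main__":
    # "upload.jpg" is just an example file name for this sketch.
    if check_upload("upload.jpg"):
        print("Match against known hash list: block and report.")
    else:
        print("No match: file passes this check.")
[/code]

The reason this scales is that a set lookup is effectively instant, so the check can run on every single upload; the expensive work of building and sharing the hash list happens once, upstream. Exact cryptographic hashes like the one above stop matching if an image is re-encoded or cropped, which is why production systems rely on perceptual hashes instead, but the enforcement flow is the same, and that's what makes this category of content so much easier to police automatically than hate speech or bullying.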
Conservative activism also plays a role—groups like the National Center on Sexual Exploitation have long driven legislation (e.g., FOSTA/SESTA) that encourages platforms to clamp down on anything remotely associated with sexual content involving minors.
The (not so) secret governors of the internet: Morality policing and platform politics
https://journals.sagepub.com/doi/abs/10 ... 5231193694
TL;DR The legal penalties are much more severe, and there's much more robust detection infrastructure in place.