We all know it: most of the time, there seem to be no consequences for bad actions on the internet. Recently, a streamer died after being bullied and tormented live for months in front of hundreds of thousands of viewers, despite people reporting it.
That is just one big recent example, but I think most of us are familiar with how lawless the internet seems to be, how utterly incompetent it seems at making bad actions and crimes carry consequences. Even neo-nazis have been thriving on social media, which appears to be powerless against them.
But then, if social media platforms were so powerless and incompetent at moderation, how come they're so effective at suppressing anything MAP related? They fail to moderate anything except liking children; there, all of a sudden, they're remarkably effective.
What that tells me is that these platforms can moderate, even suppress things, but choose not to for anything except pedophilia. They are okay with profiting from abuse, bullying, and all kinds of extremism, but somehow liking children is the only thing where social media platforms draw a line.
Why does the internet suck at moderating anything except pedophilia?
- Officerkrupke
- Posts: 84
- Joined: Wed Jun 11, 2025 3:47 pm
Re: Why does the internet suck at moderating anything except pedophilia?
I literally don’t get it either, OP. It really stems from the opinion that child sexual abuse is worse than physical or verbal abuse.
- FairBlueLove
- Posts: 273
- Joined: Thu Jul 25, 2024 5:38 pm
Re: Why does the internet suck at moderating anything except pedophilia?
My unpopular opinion? In large part, just because doing so is fashionable these days. Herd behavior.
When society judges without understanding, it silences hearts that yearn for connection.
-
- Posts: 33
- Joined: Wed Jun 11, 2025 8:36 pm
Re: Why does the internet suck at moderating anything except pedophilia?
I think this mainly comes from journalism and from people who fuel outrage on social media. Take this example: no one really gets outraged if they stumble across a file-sharing site with pirated content. The fight against piracy exists only because of economic interests, and it goes only as far as the money invested in trying to stop it.
But when it comes to pedophilia? Just look at how much Roblox is paying today for the fallout of poor public communication on the issue.
Sexophobia is the one constant, which is why anything related to sex is under the tightest scrutiny: criticism of sexualization, the fight against revenge porn, teenagers taking nude pictures of themselves, and so on. And at the very top of it all stands pedophilia—the ultimate evil. The boogeyman of both the right and the left, the one villain that unites every political side.
- PorcelainLark
- Posts: 697
- Joined: Thu Aug 01, 2024 9:13 pm
Re: Why does the internet suck at moderating anything except pedophilia?
From ChatGPT:
18 U.S. Code § 2258A - Reporting requirements of providers
In the U.S., platforms are required by law to report CSAM to the National Center for Missing & Exploited Children (NCMEC). These reports often trigger immediate law enforcement intervention, unlike hate speech or misinformation, where reporting is largely voluntary or complaint-driven.
https://www.law.cornell.edu/uscode/text/18/2258A
Section 230, Sex trafficking – Backpage.com and FOSTA-SESTA (2012–17)
Child sexual exploitation has prompted legal reforms that override usual content protections. In the U.S., FOSTA-SESTA (2018) amended Section 230 to strip away immunity for platforms facilitating sex trafficking. This creates strong incentives to proactively block CSAM.
https://en.wikipedia.org/wiki/Section_2 ... 2%80%9317)
How the EU is fighting child sexual abuse online
In both the EU and UK, authorities are advancing swift detection mandates. The EU’s proposed “Child Sexual Abuse Regulation” would require digital platforms to scan uploads for CSAM proactively. Likewise, the UK's Online Safety Act empowers regulators to levy steep fines on non-compliant platforms.
https://www.europarl.europa.eu/topics/e ... use-online
Enforcing the Online Safety Act: Platforms must start tackling illegal material from today
https://www.ofcom.org.uk/online-safety/ ... from-today
An Update on Voluntary Detection of CSAM
Unlike some forms of disinformation or ideological speech, CSAM detection benefits from mature tech tools, especially hash-based recognition systems that flag known illegal material immediately upon upload. This enables robust, near-instant enforcement.
https://technologycoalition.org/resourc ... n-of-csam/
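The hash-matching idea mentioned above can be sketched in a few lines: an uploaded file's hash is checked against a database of hashes of already-known material, so known content is caught at upload time without anyone viewing it. This is only a minimal illustration under assumed names (`KNOWN_HASHES`, `is_known_material` are hypothetical); production systems use perceptual hashes (e.g. PhotoDNA-style) so near-duplicates also match, whereas the exact cryptographic hash shown here only catches byte-identical files.

```python
import hashlib

# Hypothetical set of hex digests of known flagged material.
# Real deployments draw on shared industry hash databases.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-bytes").hexdigest(),
}

def is_known_material(upload: bytes) -> bool:
    """Return True if the upload's SHA-256 digest matches a known entry."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

# A byte-identical re-upload of known material is flagged immediately,
# while novel content passes through this particular check.
flagged = is_known_material(b"example-flagged-bytes")
clean = is_known_material(b"some-other-upload")
```

The set lookup is O(1) per upload, which is why this kind of check can run inline on every upload at platform scale, unlike human review.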
The (not so) secret governors of the internet: Morality policing and platform politics
Conservative activism also plays a role: groups like the National Center on Sexual Exploitation have long driven legislation (e.g., FOSTA/SESTA) that encourages platforms to clamp down on anything remotely associated with sexual content involving minors.
https://journals.sagepub.com/doi/abs/10 ... 5231193694
TL;DR: There are much more severe legal penalties, and more robust infrastructure in place for detection.