Not Forever wrote: Fri Jan 23, 2026 5:47 pm
CantChainTheSpirit wrote: Fri Jan 23, 2026 5:02 pm
I'd be interested to see the research disproving echo chambers; I haven't seen it. What I do know is that I can look at my social media, my wife's, my kids', my brother's, and get entirely different perspectives of the world. There's no balance, no counterpoint, just reaffirming content of whatever gets the strongest reaction. I had to leave social media because it was too pointless, too hateful and visibly biased. My brother left social media a few months after, saying he found it too depressing. My wife's approach is to endlessly click the filter options on posts to try to filter out the hate noise. My eldest daughter has mostly dropped social media for the same reason; she just talks to friends through chat apps such as WhatsApp. I'd be surprised if echo chambers had been disproven, since I've seen them first hand, as have others I know, but I'd be interested in credible research that shows this.
It's not something I have on hand, but I remember the conversations from when it was being discussed, and the "evidence" should be sitting right inside those same echo chambers that speak with hatred toward outsiders.
I have often been in an "echo chamber" related to debunking. I hung out with people who do debunking on religious, pseudoscientific, and similar topics (and it was actually in that echo chamber that this article circulated, since echo chambers were a frequent topic of discussion). But then the obvious problem struck me: the comment sections were full of people shouting that these debunkers were sellouts, shortsighted, and so on. The mere fact that they were there was, in a way, proof that they had stepped out of their own echo chamber, even if they did it antagonistically.
And the same applied to “us” (those who were part of our community), because we often spent time watching some videos—even while criticizing them—of people talking about spiritual energies coming from stones. We weren’t trapped in our own echo chambers; the others existed, we saw them, we knew what they were talking about, and we simply judged them critically.
Honestly, I think this happens in other areas too. Anti-woke people and... I'm not sure how to label the other group (calling them "woke" seems unfair; let's just say those more skeptical of the anti-woke) aren't really that divided. Each side knows the other's arguments; they're simply critical of them, and often reduce them to strawmen. But they know them; these ideas clash constantly. The rejection doesn't come from ignorance.
For various reasons, I'm also close to groups like the redpilled; they too know the outside narratives, and outsiders more or less have an idea of the redpilled narratives. When outsiders misunderstand or get it wrong, a redpilled person jumps in forcefully to break the echo chamber. They might then be mocked, insulted, etc. (depending on their approach, the reaction is predictable), but there's a constant flow of information.
It is true that people inside echo chambers are aware of other people and their views, but the purpose of an echo chamber isn't to block opposing views; it's to surface content that supports a view. Of course people will post counterarguments in the comments, and algorithms are not designed to block access to anything, but they are designed to keep presenting content that supports the view you already hold.
It isn't nefarious, but it does have an effect. If someone seeks out posts on cars and comments on posts about cars, the algorithm learns that this person interacts with car-related content and is more likely to return to the site if presented with content about cars. If another user views content about Chelsea football club and responds to posts about Chelsea football club, then guess what: they will receive content about Chelsea football club. The algorithms are smart and go beyond the basic topic; they look at the sentiment attached to the content. If someone is pro-Russia, they'll receive pro-Russia content, because that's what keeps them there. The purpose of the algorithm is to keep people returning and keep them engaged, and that means knowing what content they like and what they dislike.
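To make that feedback loop concrete, here is a minimal toy sketch of an engagement-weighted feed ranker. It is purely illustrative: the post format, the "topic" and "sentiment" tags, and the weights are all made up for this example, and real platforms' ranking systems are far more complex and not public. But the basic loop, "you engaged with this before, so you get more of it", looks something like this:

```python
from collections import defaultdict

# Toy, hypothetical sketch of an engagement-driven feed ranker.
# Each post is tagged with a topic and a sentiment; the "algorithm"
# simply counts what a user has interacted with and ranks new posts
# by how well they match that history.

class ToyFeedRanker:
    def __init__(self):
        # interaction weight per (topic, sentiment) pair, e.g. ("chelsea", "positive")
        self.engagement = defaultdict(int)

    def record_interaction(self, topic, sentiment, weight=1):
        """A view might count as 1, a comment or share as 3 (made-up weights)."""
        self.engagement[(topic, sentiment)] += weight

    def score(self, post):
        """Score a candidate post by past engagement with its topic+sentiment."""
        return self.engagement[(post["topic"], post["sentiment"])]

    def rank_feed(self, candidate_posts):
        """Return candidate posts, highest predicted engagement first."""
        return sorted(candidate_posts, key=self.score, reverse=True)


if __name__ == "__main__":
    ranker = ToyFeedRanker()
    # The user comments on Chelsea posts and on posts critical of the other party.
    ranker.record_interaction("chelsea", "positive", weight=3)
    ranker.record_interaction("politics_other_party", "negative", weight=3)
    ranker.record_interaction("cars", "neutral", weight=1)

    candidates = [
        {"id": 1, "topic": "chelsea", "sentiment": "positive"},
        {"id": 2, "topic": "politics_other_party", "sentiment": "negative"},
        {"id": 3, "topic": "politics_own_party", "sentiment": "positive"},
        {"id": 4, "topic": "cars", "sentiment": "neutral"},
    ]
    for post in ranker.rank_feed(candidates):
        print(post["id"], post["topic"], post["sentiment"])
```

Run it and the Chelsea post and the "terrible other party" post float to the top, while the never-engaged-with positive post about the user's own party sinks. Nothing is blocked; it's just ranked so far down that it's rarely seen.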
Now, if that's cars and sports teams, it's fine, because it's content about what people love rather than content about hating another group; that's where the problems come in. Being told Chelsea is having a good season isn't bad; being told that Tottenham fans want to threaten your family and need to be forced from the city isn't so good. I have found that the political content I see is mostly about the bad things the other political party is doing and saying; it rarely talks about my political party, and when I log in with someone else's social media account I see the same thing, only from the other side. The left is shown content about the terrible right, and the right sees content about the terrible left. Why? Because the algorithm is working: politics has become full of extreme views, and content that feeds extreme views gets the clicks and gets people returning.
This isn't good for society, in my opinion, because it furthers the divide. Showing someone who hates Trump a post on the positives of Trump's policies isn't going to cut it, any more than showing an Obama hater a post on Obama's achievements will. But you know what, they both have some positives, some achievements. They are not Sith lords and Jedi; it isn't a battle of good over evil. It's two people with different political views putting out different policies and sometimes getting caught up in the same political hate and reinforcing messaging as the rest of us. Is Trump hearing the left? Is the left hearing the right?
I'm acutely aware of the danger. I'm a map with pro-map views, but I always love to read other views, and I never take an entirely pro-map position because I know I could be wrong; I will be wrong about many of my views and beliefs. I seek out both sides and I never pass judgement unless I know with certainty. I assume most immigrants are good, hard-working people because I haven't seen convincing evidence to the contrary. But my feeds are never so neutral.
What I see is people forming ever stronger views on the left and the right, ever rising anger, and social media in the middle feeding it, because it has great algorithms doing what they do. It's just like newspapers: they lean left or right, and they never tell the neutral story. The difference is that algorithms are personal to each of us; it's like getting a newspaper that knows what I love and knows that hate sells. The sports pages are neutral and present the results in a positive way for every team, but when it comes to politics the stories are designed to enrage, because that's what sells. The US was a great country, with the world coming to it like King Solomon, but today it's a divided, hateful shadow of its former self, and it seems intent on sinking ever further.