Echo Chambers

From SI410
Political cartoon on Echo Chambers.[1]

An echo chamber in technology and media is an environment in which a person encounters mainly opinions and beliefs similar to their own and, when exposed to outside opinions, chooses not to consider them or actively dismisses them.[2] Those who participate in echo chambers often unknowingly experience confirmation bias: believing an idea to be true, they attend only to evidence that confirms it and cast doubt on information that rebuts it.[3] At the same time, people are far less likely to question rumors about opposing viewpoints.[4] Echo chambers frequently form on social media sites and can lead to political polarization and extremism.[5] As a result, people's perception of reality can be distorted, alternative perspectives can come to be disbelieved, and real-life consequences can follow.[6]

Echo chambers are often associated, and confused, with several related concepts, notably epistemic bubbles and filter bubbles. The difference between an epistemic bubble and an echo chamber is that in an epistemic bubble, members of the group are simply not exposed to alternative perspectives, whether intentionally or not; in an echo chamber, members actively mistrust outside sources.[7] A filter bubble, in turn, is a state of informational isolation produced by personalized algorithms.[8] Echo chamber is the broader concept of being exposed only to information from like-minded individuals; a filter bubble refers specifically to the result of algorithmic curation and can be one mechanism by which echo chambers form.

Generalization of Echo Chambers

In the discussion of echo chambers in social media, there are different reasons why people are shielded from outside opinions. One of them is religion. When people are dedicated to their religion, some will involve themselves deeply in it and surround themselves with a like-minded community. These echo chambers can make people prone to no longer listening to, or trying to understand, other people's perspectives.[9] Although the term echo chamber is generally viewed negatively, some argue that it is simply a derogatory term for community. Since religion can be a difficult topic for everyday conversation, surrounding oneself with like-minded people can make expressing opinions easier. Mass media appears to strengthen the negative connotation of echo chambers by highlighting two distinct sides to every argument. Bora Zivkovic, editor of the blog network at Scientific American, stated, "By refusing to acknowledge the existence of many stands on any issue, by refusing to assign Truth-values to any, by looking down at anyone who holds an opinion that is not their own, the mainstream press fosters the atmosphere of a bipolar world in which enmity rules and the wagons need to be circled - the atmosphere that is so conducive to formation and defense of echo-chambers and yet so devoid of airing of any alternatives."[10] The echo chambers, or communities, that exist in society are heavily shaped by how the media portrays them. In politics likewise, news outlets put such heavy emphasis on opposing sides that it is hard for people not to form echo chambers around their opinions.

Outside the influence of media, external sources can also lead people to form echo chambers, even at an early age. Children are easily swayed by their parents, teachers, and friends. When they are surrounded by a family that thinks one way, they will at some point realize that not everyone shares their beliefs and will have to learn how to deal with that. One example is a parent's choice of whether to give their child the flu vaccine. There are strong positions for and against vaccines, and since the decision rests in the parents' hands, children often inherit what their parents believe. Parents who did not grow up getting a flu vaccine every year tend not to take their children to get it either, often perceiving it as something only older people need.[11] According to the CDC, 51% of children received a flu vaccine in the 2019–2020 season, 2.6 percentage points higher than the previous season.[12]

Social Media

When looking at the different types of echo chambers that online communities and social networks can create, a clear distinction can be made between two groups. First, there are sites that create echo chambers inherently through how the platform is designed, such as through algorithmically targeted content. Then, there are sites where users create the echo chambers themselves. The perception of these two types of sites, both by users and by the media, often varies considerably.

Platform Created Echo Chambers

The feedback loop between users and advertising companies.[13]

Social networking sites like Facebook and Twitter are often accused of being the biggest drivers of echo chambers because of their personalization algorithms. Many people now get their news through these sites, and the algorithms cater to each user's interests. This curation has led to what is known as a "filter bubble."[14] Filter bubbles present news and information that the user wants to see: if a user subscribes only to news from one side of the political spectrum, not only will their newsfeed be filled primarily with that information, but the areas of the site dedicated to discovering new content will be skewed toward their preexisting beliefs as well. Sites that create filter bubbles tend to receive more criticism because of the positive feedback loop their algorithms can create. This is backed up by studies showing that Facebook and Twitter exhibit higher segregation than sites without such algorithms, like Reddit.[15]
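The positive feedback loop described above can be illustrated with a toy simulation; this is a deliberately simplistic sketch, not any platform's actual algorithm. A hypothetical recommender serves, with high probability, items matching the leaning the user has engaged with most so far, starting from a perfectly balanced catalog and a single initial engagement:

```python
import random

def recommend(history, catalog, personalization=0.9):
    """Pick the next item to show. With probability `personalization`,
    serve the leaning the user has engaged with most so far; otherwise
    pick uniformly from the whole catalog."""
    if history and random.random() < personalization:
        top = max(set(history), key=history.count)  # dominant leaning so far
        pool = [item for item in catalog if item == top]
    else:
        pool = catalog
    return random.choice(pool)

def simulate(steps=1000, seed=42):
    random.seed(seed)
    catalog = ["left"] * 5 + ["right"] * 5  # a perfectly balanced catalog
    history = ["left"]                      # one initial engagement
    for _ in range(steps):
        history.append(recommend(history, catalog))
    return history.count("left") / len(history)

share = simulate()
print(f"Share of 'left' items seen after the feedback loop: {share:.2f}")
```

Even though the catalog is evenly split, a single early engagement dominates the feed after a few iterations, which is the essence of the feedback loop between personalization and user behavior.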

User Created Echo Chambers

The flipside of platform-created echo chambers is websites that give the power to create and moderate communities and content to the users, allowing echo chambers to form without the influence of algorithms. Users gravitate toward information they already agree with; it has been observed that users tend to aggregate in communities they are already familiar with, which contributes to echo chambers through confirmation bias, segregation, and polarization.[16] Although platform-created echo chambers appear to show greater segregation and polarization than user-created ones, biases still exist on user-controlled platforms like Reddit. In particular, studies have shown significant bias on these platforms toward certain classes of users (by gender, religion, age, etc.).[17] This has drawn criticism of the amount of control these platforms give users over conversations and community membership, which can let communities turn into highly homogeneous echo chambers.[18] This is true to some extent and has had real-life effects, such as a decrease in trust of vaccinations during the COVID-19 pandemic.[19][20] Despite this, levels of polarization appear lower than one might expect.[21] Analyses of interactions on Reddit, for example, have shown high levels of cross-community interaction, contrary to the sentiment of the platform's users.[21] Looking specifically at potential political echo chambers likewise reveals a more balanced discussion than might first be imagined.[22] These findings suggest that without the overarching control of algorithms, users are less likely to create echo chambers on their own.
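One simple way to think about the cross-community analyses mentioned above is the share of users who are active in more than one community; users confined to a single community would suggest sealed chambers, while overlap suggests interaction across community lines. The sketch below is a minimal illustration of that idea, with entirely hypothetical users and subreddit names:

```python
def cross_community_rate(activity):
    """Fraction of users who post in more than one community.

    `activity` maps a user name to the set of communities they post in;
    a high rate suggests interaction across community lines rather than
    sealed echo chambers."""
    multi = sum(1 for communities in activity.values() if len(communities) > 1)
    return multi / len(activity)

# Hypothetical activity data for illustration only.
activity = {
    "alice": {"r/politics", "r/conservative"},
    "bob":   {"r/politics"},
    "carol": {"r/science", "r/politics", "r/news"},
    "dave":  {"r/gaming"},
}

print(cross_community_rate(activity))  # 0.5
```

Real studies of this kind work with far richer signals (comment threads, reply networks, sentiment), but the underlying question is the same: how often do users cross community boundaries?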

Analysis of posting behavior suggests that the political leaning of Reddit is in fact quite balanced, despite the common belief that it creates extreme echo chambers.[22]

Ethical Implications

Echo chambers, while most prevalent in online communities, create real-life effects and ethical implications. They can lead their members to distrust those outside the chamber who hold different beliefs, even when those beliefs are true.[23] This distrust has resulted in confirmation bias, segregation, and polarization among users with differing viewpoints.[24] These chambers often have political and scientific consequences.

Echo Chambers in Politics

Echo chambers are often discussed with regard to political issues because of the clear group identities attached to political parties. One example is Rush Limbaugh, a conservative radio host whose radio show was characterized as an echo chamber.[25] In a review by the New York Times, his show was described as aimed at conservatives in the "silent majority," reinforcing theories against Democrats and framing them as "evil-doers, [with] the media [as] their evil allies."[26] The show created an echo chamber for its listeners, who were fed information that confirmed their biases. In the same review, the New York Times credited Rush Limbaugh with Donald Trump's win and the success of his style of politics, illustrating the real-life implications of this echo chamber.[27]

Political Polarization

It has been argued that online echo chambers contribute to higher levels of political polarization.[28] There is empirical evidence supporting the claim that Americans who are repeatedly exposed to biased information favoring a political view close to their own eventually develop more extreme political positions and become less tolerant of those with opposing views.[29] It has also been observed that politically liberal and conservative users inhabit different sides of the internet and share minimal overlap in the sources they trust for political news.[30] Transparency concerns arise here as well: a person may believe they have an accurate view of an issue or political landscape when they are only being shown the side of the argument they already agree with.[31]

Echo Chambers in Medicine

Echo chambers have also had real-life implications for the medical industry. A 2017 study of 2.6 million Facebook users over seven years found that consumption of content about vaccinations was dominated by the echo chamber effect and by increasing polarization.[32] More recently, during the COVID-19 pandemic, anti-vaccination groups on social networks have influenced opinions on the safety of new vaccines.[33] These groups reinforce conspiracy theories about vaccination and encourage their members not to vaccinate themselves or their families, often via alleged stories of side effects from unverified sources; members are more moved by these stories than by the scientific data provided by health agencies.[34]

Prevention of Echo Chambers

Company Driven Prevention

How Twitter determines what action to take with misinformation.[35]

Many companies, particularly social media networks, have been working to combat echo chambers. During the rise of COVID-19 and the 2020 U.S. presidential election, for example, Twitter began adding disclaimers to Tweets that may contain false information. These labels indicate whether a Tweet contains misleading information, disputed claims, or unverified claims.[35] Information that experts have verified to be false falls under the misleading information category; information that is contested or whose accuracy is unknown falls under the disputed claim category; and information that is unconfirmed, and so could be true or false, is labeled unverified.[35] Labeled Tweets link to Twitter's curated page or to trusted third-party sources with verified information. Tweets given a warning label are blocked from view but can be seen if the user acknowledges that the information may be inaccurate. Additionally, Twitter's systems do not amplify Tweets carrying warnings or labels.[35] Twitter's system for identifying misinformation is still in its early stages, and the leaders of the initiative are open to adjusting labels and other metrics.
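The three-way labeling scheme described above can be sketched as a simple mapping from claim category to action. This is an illustrative reconstruction, not Twitter's actual implementation: the category names are taken from the description above, while the `high_harm` parameter is a hypothetical stand-in for whatever internal severity assessment decides when a warning (which hides the Tweet until acknowledged) applies instead of a plain label:

```python
from enum import Enum

class Claim(Enum):
    MISLEADING = "misleading"    # verified false by experts
    DISPUTED = "disputed"        # contested, or accuracy unknown
    UNVERIFIED = "unverified"    # unconfirmed; could be true or false

def label_action(category, high_harm=False):
    """Map a claim category to (label text, blocked-from-view flag).

    `high_harm` is a hypothetical severity flag: per the description
    above, only warning-labeled Tweets are hidden until the user
    clicks through to acknowledge them."""
    if category is Claim.MISLEADING and high_harm:
        return ("warning", True)   # hidden until acknowledged
    if category is Claim.MISLEADING:
        return ("misleading information", False)
    if category is Claim.DISPUTED:
        return ("disputed claim", False)
    return ("unverified claim", False)

print(label_action(Claim.DISPUTED))  # ('disputed claim', False)
```

The key design point the scheme captures is that labeling and suppression are separate decisions: most labeled Tweets remain visible, and only the warning tier restricts viewing.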

In 2017, Facebook also made changes to their network’s “Trending” page by showing multiple news sources for a topic or event with the intent of providing users a variety of viewpoints.[36]

User Driven Prevention

A major issue with echo chambers and filter bubbles is a lack of awareness that they exist; many people are unfamiliar with the concepts and are therefore unaware of the bias in the information surfaced to them by search engines and social media feeds.[37] This is partly because algorithms are complicated, and few people understand how they work or the impact they have on information and society.[31] A user who is aware of being in an echo chamber can take individual preventive measures to build a more accurate and diverse view of information online.

Although most control resides with social media companies, users can take some steps to avoid being in an echo chamber.[38] The algorithms that categorize users can only do so because they know what users like and dislike; their whole goal is to increase user engagement.[39] Thus, one recommended way to prevent the algorithm from classifying them is for users to be conservative with their likes on social media.[40] Users can also be mindful of the number of followers the accounts they follow have: research by Professor Kristina Lerman at the University of Southern California suggests that people with a large disparity between followers and following have an outsized influence on social media.[40] Lastly, on apps like Facebook and Twitter, users can change how their feed delivers information. The default feed serves personalized content that users should, in theory, engage with more; users now have the option to view information in chronological order instead.[41] This way, users receive the most up-to-date information rather than information that necessarily reinforces their current beliefs.
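The follower/following disparity mentioned above can be made concrete as a simple ratio. This is a rough sketch of the idea, not Lerman's actual methodology; the threshold and the account data below are hypothetical:

```python
def follower_disparity(followers, following):
    """Ratio of followers to accounts followed. A very high ratio marks
    an account that broadcasts to far more people than it listens to."""
    return followers / max(following, 1)  # guard against following == 0

def flag_high_influence(accounts, threshold=100):
    """Return names of accounts whose disparity exceeds `threshold`
    (an arbitrary cutoff chosen for illustration)."""
    return [name for name, (followers, following) in accounts.items()
            if follower_disparity(followers, following) > threshold]

# Hypothetical accounts: (followers, following).
accounts = {
    "celebrity": (2_000_000, 150),  # disparity ~ 13,333
    "friend":    (300, 280),        # disparity ~ 1.07
}

print(flag_high_influence(accounts))  # ['celebrity']
```

A user auditing their own feed could apply a check like this to see how much of their timeline comes from a small number of high-disparity broadcasters.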


  1. Ostergren, M. (2018, February 1). [Political Cartoon of Echo Chambers]. Retrieved March 12, 2021, from
  2. Echo chamber. (n.d.). Retrieved March 12, 2021, from
  3. Heshmat, S. (2015, April 23). What is confirmation bias? Retrieved March 12, 2021, from
  4. DiFonzo, N. (2011, April 22). The Echo-Chamber Effect. Retrieved March 18, 2021, from
  5. Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber? Psychological Science, 26(10), 1531–1542.
  6. How filter bubbles distort reality: Everything you need to know. (2019, November 14). Retrieved March 12, 2021, from
  7. Nguyen, C. T. (2021, February 17). The problem of living inside echo chambers. Retrieved March 12, 2021, from
  8. What is a filter Bubble? - definition from Techopedia. (n.d.). Retrieved March 12, 2021, from
  9. Exit the Echo Chamber. (2020, June 11). Retrieved April 17, 2021, from
  10. 'Echo-chamber' is just a derogatory term for 'community'. (2013, January 14). Retrieved April 17, 2021, from
  11. 'Echo chamber' surrounds parental decisions about childhood flu vaccine. (2018, November 29). Retrieved April 17, 2021, from
  12. Flu Vaccination Coverage, United States, 2019–20 Influenza Season. (2020, October 1). Retrieved April 17, 2021, from
  13. (2017, February 14). [The feedback loop between users and advertising companies.]. Retrieved March 17, 2021, from
  14. Hosanagar, K. (2016, November 25). Blame the echo chamber on Facebook. but blame yourself, too. Retrieved March 12, 2021, from
  15. Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9). doi:10.1073/pnas.2023301118
  16. Del Vicario, Michela, et al. “The Spreading of Misinformation Online.” PNAS Proceedings of the National Academy of Sciences of the United States of America, vol. 113, no. 3, 19 Jan. 2016, pp. 554–559. EBSCOhost, doi:10.1073/pnas.1517441113.
  17. Ferrer, X., van Nuenen, T., Such, J. M., & Criado, N. (2020). Discovering and categorizing language biases in Reddit. arXiv preprint arXiv:2008.02754.
  18. Wiggers, K. (2020, August 07). Researchers quantify bias in Reddit content sometimes used to train ai. Retrieved March 24, 2021, from
  19. Ulen, Thomas S. Democracy and the Internet: Cass R. Sunstein, Republic.Com. Princeton, NJ: Princeton University Press. Pp. 224. 2001. Available at SSRN:
  20. Burki, T. (2020). The online anti-vaccine movement in the age of covid-19. The Lancet Digital Health, 2(10). doi:10.1016/s2589-7500(20)30227-2
  21. De Francisci Morales, G., Monti, C., & Starnini, M. No echo in the chambers of political interactions on Reddit. Sci Rep 11, 2818 (2021).
  22. Tyler, "Reddit's leftward political bias", Data Insights, 2019.
  23. Nguyen, C. T. (2021, February 17). The problem of living inside echo chambers. Retrieved March 12, 2021, from
  24. Del Vicario, M., et al. (2016). The Spreading of Misinformation Online. PNAS, 113(3), 554–559. doi:10.1073/pnas.1517441113
  25. Jamieson, K. H., & Cappella, J. N. (2010). Echo Chamber Rush Limbaugh and the Conservative Media Establishment. Oxford: Oxford University Press, USA.
  26. Barbaro, M. (Host). (2021, February 22). The Legacy of Rush Limbaugh [Audio podcast]. Retrieved from
  27. Barbaro, M. (Host). (2021, February 22). The Legacy of Rush Limbaugh [Audio podcast]. Retrieved from
  28. Hong, Sounman, and Sun Hyoung Kim. “Political Polarization on Twitter: Implications for the Use of Social Media in Digital Governments.” Government Information Quarterly, vol. 33, no. 4, 2016, pp. 777–782., doi:10.1016/j.giq.2016.04.007
  29. Borgesius, Frederik J. Zuiderveen, et al. “Should We Worry About Filter Bubbles?” Internet Policy Review, 31 Mar. 2016,
  30. Panke, Stefanie, and John Stephens. “Beyond the Echo Chamber: Pedagogical Tools for Civic Engagement Discourse and Reflection.” Journal of Educational Technology & Society, vol. 21, no. 1, Jan. 2018, pp. 248–263. EBSCOhost,
  31. Pariser, Eli. “The Filter Bubble: What the Internet Is Hiding From You.” Penguin Books, 2012, pp. 6–16
  32. Schmidt AL, Zollo F, Scala A, Betsch C, Quattrociocchi W. Polarization of the vaccination debate on Facebook. Vaccine. 2018 Jun 14;36(25):3606-3612. doi: 10.1016/j.vaccine.2018.05.040. PMID: 29773322.
  33. Thompson, D. (2021, January 29). Anti-Vaxxers wage campaigns Against covid-19 shots. Retrieved March 12, 2021, from
  34. Ortiz-Sánchez, E., Velando-Soriano, A., Pradas-Hernández, L., Vargas-Román, K., Gómez-Urquiza, J. L., Cañadas-De la Fuente, G. A., & Albendín-García, L. (2020). Analysis of the Anti-Vaccine Movement in Social Networks: A Systematic Review. International journal of environmental research and public health, 17(15), 5394.
  35. Roth, Y. (2020, May 11). Updating our approach to misleading information. Retrieved March 12, 2021, from
  36. Cathcart, W. (2019, November 07). Continuing our updates to trending. Retrieved March 12, 2021, from
  37. Allred, Kristen. “The Causes and Effects of ‘Filter Bubbles’ and How to Break Free.” Medium, Medium, 13 Apr. 2018,
  38. Cusumano, M. (2021, January 15). Social Media Companies Should Self-Regulate. Now. Retrieved March 19, 2021, from
  39. (2016, July 24). The Reason Your Feed Became An Echo Chamber — And What To Do About It. Retrieved March 12, 2021, from
  40. Seneca, C. (2020, September 17). How to Break Out of Your Social Media Echo Chamber. Retrieved March 18, 2021, from
  41. Garun, N. (2020, March 6). How to switch your Twitter feed to a chronological timeline. Retrieved March 18, 2021, from