Echo Chambers

Political cartoon on Echo Chambers.[1]

An echo chamber in technology and media is an environment in which one encounters mostly opinions and beliefs similar to one’s own and, when outside opinions are heard, chooses not to consider them or actively dismisses them.[2] Those who participate in echo chambers are often unintentionally experiencing confirmation bias. This bias occurs when people believe an idea to be true and seek out only evidence that confirms their beliefs; when they encounter information that presents a rebuttal, they cast doubt on it.[3] At the same time, people are far less likely to question rumors about those holding opposing viewpoints.[4] Echo chambers often occur on social media sites and can lead to political polarization and extremism.[5] As a result, people’s sense of reality can be distorted, alternative perspectives can come to be disbelieved, and real-life consequences can follow.[6]

Similar Phrases

Echo chambers are often associated and confused with several other phrases, two of which are epistemic bubbles and filter bubbles. The difference between an epistemic bubble and an echo chamber is that in an epistemic bubble, insiders of the group are not exposed to alternative perspectives at all, whether intentionally or not, while in an echo chamber, insiders are exposed to alternative perspectives but distrust those outside sources.[7]

Filter bubbles, on the other hand, relate more closely to echo chambers. A filter bubble is a state of isolation that can result from personalized algorithms.[8] Echo chambers refer to the overall condition of being exposed only to information from like-minded individuals, whereas filter bubbles refer specifically to the results of algorithms; a filter bubble can therefore be one mechanism by which an echo chamber forms.

Social Media

When looking at the different types of echo chambers that online communities and social networks can create, there is a clear distinction to be made between two groups. First, there are sites that create echo chambers inherently through how their platforms are designed, such as by targeting content with algorithms. Then there are sites where users create the echo chambers themselves. The perception of these two types of sites, both among users and in the media, varies considerably.

Platform Created Echo Chambers

The feedback loop between users and advertising companies.[9]

Social networking sites like Facebook and Twitter are often blamed for being the biggest drivers of echo chambers due to their personalization algorithms. Many people now get their news through these sites, and these algorithms cater to one’s interests. This curation has led to what is known as a “filter bubble.”[10] These filter bubbles present news and information that the user wants to see. So if a user only subscribes to news from one side of the political spectrum, not only will their newsfeed be primarily filled with this information, but the areas of the site dedicated to discovering new content will be skewed toward their preexisting beliefs as well. Sites that create these filter bubbles tend to receive more criticism because of the positive feedback loop their algorithms can create. This is backed up by studies showing that Facebook and Twitter exhibit higher segregation than sites without such algorithms, such as Reddit.[11]
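The positive feedback loop described above can be illustrated with a toy simulation: a ranker scores posts by a user’s learned topic affinity, the user engages mostly with agreeable content, and that engagement further boosts the same topics in future rankings. This is a minimal sketch under simplified assumptions, not any platform’s actual recommendation algorithm; the topics, engagement probability, and scoring rule are invented for illustration.

    # Toy sketch of an engagement-driven feed ranker reinforcing a filter bubble.
    # Not any platform's real algorithm; all numbers here are arbitrary assumptions.
    from collections import defaultdict
    import random

    def rank_feed(posts, affinity):
        """Order posts by the user's learned affinity for each topic."""
        return sorted(posts, key=lambda post: affinity[post["topic"]], reverse=True)

    def simulate(posts, rounds=50, top_k=3, lean="left"):
        affinity = defaultdict(lambda: 1.0)            # no preference at first
        for _ in range(rounds):
            feed = rank_feed(posts, affinity)[:top_k]  # the user only sees the top of the feed
            for post in feed:
                # Assumption: the user engages mostly with content matching their lean.
                if post["topic"] == lean or random.random() < 0.1:
                    affinity[post["topic"]] += 1.0     # engagement boosts future ranking
        return dict(affinity)

    posts = [{"topic": t} for t in ["left", "left", "right", "right", "neutral", "neutral"]]
    print(simulate(posts))  # affinity drifts heavily toward the user's own lean

After enough rounds, the ranker shows the user almost nothing but posts matching their lean, which is the kind of segregation the studies cited above measure.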

User Created Echo Chambers

The flipside of platform created echo chambers is websites that give users the power to create and moderate communities and content, thus allowing echo chambers to form without the influence of algorithms. Although platform created echo chambers appear to have a greater level of segregation and polarization than user created ones, biases still exist on user-controlled platforms like Reddit. In particular, studies have shown significant bias on these platforms toward certain classifications of users (such as gender, religion, and age).[12] This has drawn criticism of the amount of control these platforms give users over conversations and community membership, which allows communities to turn into highly homogeneous echo chambers.[13] This is true to some extent and has been shown to have real-life effects, such as a decrease in trust in vaccinations during the COVID-19 pandemic.[14][15] Despite this, research indicates that levels of polarization are lower than they may appear.[16] Analysis of interactions on Reddit, for example, has shown high levels of cross-community interaction, contrary to the sentiment of the platform’s users.[16] Beyond this, looking specifically at the possibility of political echo chambers shows that discussion is more balanced than might first be imagined.[17] This supports the finding that, without the overarching influence of algorithms, users are less likely to create echo chambers on their own.

Despite the common belief that Reddit produces extreme echo chambers, analyses suggest that the platform’s overall political leaning is fairly balanced.[17]

Ethical Implications

Echo chambers, while mostly present in online communities, have real-life effects and ethical implications. They can lead their members to distrust those outside the chamber who hold different beliefs, even when those beliefs are true.[18] These chambers often have political and scientific consequences.

Echo Chambers in Politics

Echo chambers are often discussed with regard to political issues due to the clear group identities associated with political parties. One example of this is Rush Limbaugh, a conservative radio host whose radio show was categorized as an echo chamber.[19] In a review by the New York Times, his radio show was described as being aimed at conservatives in the “silent majority” and as reinforcing theories against Democrats, framing them as “evil-doers, [with] the media [as] their evil allies.”[20] The radio show created an echo chamber for its listeners, who were fed information that confirmed their biases. In the same review, the New York Times credits Donald Trump’s win and the success of his style of politics to Rush Limbaugh, showing the real-life implications of this echo chamber.[21]

Political Polarization

It has been argued that online echo chambers contribute to higher levels of political polarization.[22] Empirical evidence supports the claim that Americans who are repeatedly exposed to biased information favoring a political view close to their own will eventually develop more extreme political standpoints and become less tolerant of those with opposing views.[23] Concerns about the transparency of filtering have also been raised, as a person may believe they have an accurate view of an issue or of the political landscape when they are only being shown the side of the argument they already agree with.[24]

Echo Chambers in Medicine

Echo chambers have also had real-life implications for the medical industry. A study of 2.6 million Facebook users over seven years found that consumption of content about vaccinations was dominated by the echo chamber effect and by increasing polarization.[25] More recently, during the COVID-19 pandemic, anti-vaccine groups on social media networks have influenced opinions on the safety of new vaccinations.[26] These groups often reinforce conspiracy theories about vaccinations and encourage their members not to vaccinate themselves or their families. They often do so via alleged stories about side effects from unverified sources, with members being more moved by these stories than by scientific data provided by health agencies.[27]

Prevention of Echo Chambers

Company Driven Prevention

How Twitter determines what action to take with misinformation.[28]

Many companies, particularly social media networks, have been working to combat echo chambers. During the rise of COVID-19 and the 2020 Presidential Election, for example, Twitter began adding disclaimers to Tweets that may contain false information. These labels inform users whether a Tweet contains misleading information, disputed claims, or unverified claims.[28] Information that experts have verified to be false falls under the misleading information category.[28] Information that is contested or simply unknown falls under the disputed claim category.[28] Finally, information that is unconfirmed, meaning it could be true or false, is labeled as an unverified claim.[28] Tweets with one of these labels link to Twitter’s curated page or to trusted third-party sources with verified information. Tweets with a warning label are blocked from viewing but can be viewed if the user acknowledges that the information may not be accurate. Additionally, Twitter’s systems do not amplify tweets carrying warnings or labels.[28] Twitter’s system for identifying misinformation is still in its early stages, and the leaders of this initiative at Twitter are open to adjusting labels and other metrics.
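The three label categories described in Twitter’s announcement can be summarized as a simple decision rule. The sketch below is only an illustration of that policy as described here, not Twitter’s actual moderation code; the severe_harm flag and the specific output fields are assumptions made for the example.

    # Illustrative mapping of the label categories described above to handling.
    # A sketch of the described policy, not Twitter's real implementation.
    from enum import Enum

    class Claim(Enum):
        MISLEADING = "misleading"    # verified false by experts
        DISPUTED = "disputed"        # contested, or accuracy unknown
        UNVERIFIED = "unverified"    # unconfirmed; could be true or false

    def moderation_action(claim: Claim, severe_harm: bool = False) -> dict:
        """Return the handling described in this section for a labeled tweet."""
        action = {
            "label": claim.value,
            "link_to_context": True,   # link to Twitter's curated page or trusted sources
            "amplified": False,        # labeled tweets are not amplified
            "warning_overlay": False,
        }
        # Assumption: a warning overlay (hidden until the user acknowledges it)
        # applies only when a misleading claim is judged likely to cause serious harm.
        if claim is Claim.MISLEADING and severe_harm:
            action["warning_overlay"] = True
        return action

    print(moderation_action(Claim.DISPUTED))
    print(moderation_action(Claim.MISLEADING, severe_harm=True))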

In 2017, Facebook also made changes to its network’s “Trending” page by showing multiple news sources for a topic or event, with the intent of providing users with a variety of viewpoints.[29]

User Driven Prevention

A large issue with echo chambers and filter bubbles is a lack of awareness that they exist; many people are unfamiliar with the concepts and are therefore unaware of the bias in the information brought to them by search engines and social media feeds.[30] It has been suggested that this is because algorithms are complicated and few people understand how they work or the impact they have on information and society. If users are aware that they are in an echo chamber, they can take individual preventive measures to build a more accurate and diverse view of information online.

Though most control resides with social media companies, users can take some steps to avoid being in an echo chamber.[31] The algorithms that categorize users can only do so because they know what users like and do not like, and their whole goal is to increase user engagement.[32] Thus, to keep the algorithm from classifying them, one recommended method is for users to be conservative with their likes on social media.[33] Furthermore, users can be mindful of the number of followers that the people they follow have. Research by Professor Kristina Lerman at the University of Southern California indicates that people with a large disparity between followers and accounts followed have an outsized influence on social media.[33] Lastly, on social media apps like Facebook and Twitter, users can change how their feed delivers information. By default, feeds serve personalized content that users should, in theory, engage with more. Users now have the option to receive information in chronological order instead of a personalized order.[34] This way, users are given the most up-to-date information rather than information that necessarily reinforces their current beliefs.
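As a rough illustration of the follower-to-following disparity mentioned above, the sketch below flags accounts whose follower counts far exceed the number of accounts they follow. The example accounts and the 10x threshold are hypothetical; the cited research does not prescribe a specific cutoff.

    # Illustrative heuristic only: flag accounts with an outsized follower-to-following
    # disparity. The threshold and the accounts are assumptions for the example.
    def follower_disparity(followers: int, following: int) -> float:
        """Ratio of followers to accounts followed (higher = more one-directional reach)."""
        return followers / max(following, 1)

    def flag_outsized_influence(accounts, threshold=10.0):
        """Return the names of accounts whose disparity meets or exceeds the threshold."""
        return [name for name, (followers, following) in accounts.items()
                if follower_disparity(followers, following) >= threshold]

    accounts = {
        "news_pundit": (2_500_000, 300),   # hypothetical account
        "local_friend": (450, 500),        # hypothetical account
    }
    print(flag_outsized_influence(accounts))  # ['news_pundit']

Checking a few such ratios before following an account is one concrete way to notice when a feed is dominated by a handful of highly amplified voices.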

References

  1. Ostergren, M. (2018, February 1). [Political Cartoon of Echo Chambers]. Retrieved March 12, 2021, from https://medium.com/@maryostergren/why-we-must-resist-the-allure-of-political-echo-chambers-404ebf3aaa49
  2. Echo chamber. (n.d.). Retrieved March 12, 2021, from https://www.oxfordlearnersdictionaries.com/us/definition/english/echo-chamber
  3. Heshmat, S. (2015, April 23). What is confirmation bias? Retrieved March 12, 2021, from https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias
  4. DiFonzo, Nicholas. (2011, April 22). The Echo-Chamber Effect. Retrieved March 18, 2021, from https://www.nytimes.com/roomfordebate/2011/04/21/barack-obama-and-the-psychology-of-the-birther-myth/the-echo-chamber-effect
  5. Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber? Psychological Science, 26(10), 1531–1542. https://doi.org/10.1177/0956797615594620
  6. How filter bubbles distort reality: Everything you need to know. (2019, November 14). Retrieved March 12, 2021, from https://fs.blog/2017/07/filter-bubbles/
  7. C. Thi Nguyen Associate Professor of Philosophy. (2021, February 17). The problem of living inside echo chambers. Retrieved March 12, 2021, from https://theconversation.com/the-problem-of-living-inside-echo-chambers-110486
  8. What is a filter Bubble? - definition from Techopedia. (n.d.). Retrieved March 12, 2021, from https://www.techopedia.com/definition/28556/filter-bubble
  9. (2017, February 14). [The feedback loop between users and advertising companies.]. Retrieved March 17, 2021, from https://www.sticky.digital/danger-of-living-in-a-filter-bubble/
  10. Hosanagar, K. (2016, November 25). Blame the echo chamber on Facebook. but blame yourself, too. Retrieved March 12, 2021, from https://www.wired.com/2016/11/facebook-echo-chamber/
  11. Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9). doi:10.1073/pnas.2023301118
  12. Ferrer, X., van Nuenen, T., Such, J. M., & Criado, N. (2020). Discovering and categorising language biases in reddit. arXiv preprint arXiv:2008.02754.
  13. Wiggers, K. (2020, August 07). Researchers quantify bias in reddit content sometimes used to train ai. Retrieved March 24, 2021, from https://venturebeat.com/2020/08/07/researchers-quantify-bias-in-reddit-content-sometimes-used-to-train-ai/
  14. Ulen, Thomas S., Democracy and the Internet: Cass R. Sunstein, Republic.Com. Princeton, Nj. Princeton University Press. Pp. 224. 2001. Available at SSRN: https://ssrn.com/abstract=286293 or http://dx.doi.org/10.2139/ssrn.286293
  15. Burki, T. (2020). The online anti-vaccine movement in the age of covid-19. The Lancet Digital Health, 2(10). doi:10.1016/s2589-7500(20)30227-2
  16. De Francisci Morales, G., Monti, C., & Starnini, M. No echo in the chambers of political interactions on Reddit. Sci Rep 11, 2818 (2021). https://doi.org/10.1038/s41598-021-81531-x
  17. Tyler, "Reddit's leftward political bias", Data Insights, 2019.
  18. C. Thi Nguyen Associate Professor of Philosophy. (2021, February 17). The problem of living inside echo chambers. Retrieved March 12, 2021, from https://theconversation.com/the-problem-of-living-inside-echo-chambers-110486
  19. Jamieson, K. H., & Cappella, J. N. (2010). Echo Chamber: Rush Limbaugh and the Conservative Media Establishment. Oxford: Oxford University Press, USA.
  20. Barbaro, M. (Host). (2021, February 22). The Legacy of Rush Limbaugh [Audio podcast]. Retrieved from https://www.nytimes.com/2021/02/22/podcasts/the-daily/rush-limbaugh-conservatism-donald-trump.html?showTranscript=1.
  21. Barbaro, M. (Host). (2021, February 22). The Legacy of Rush Limbaugh [Audio podcast]. Retrieved from https://www.nytimes.com/2021/02/22/podcasts/the-daily/rush-limbaugh-conservatism-donald-trump.html?showTranscript=1.
  22. Hong, Sounman, and Sun Hyoung Kim. “Political Polarization on Twitter: Implications for the Use of Social Media in Digital Governments.” Government Information Quarterly, vol. 33, no. 4, 2016, pp. 777–782., doi:10.1016/j.giq.2016.04.007
  23. Borgesius, Frederik J. Zuiderveen, et al. “Should We Worry About Filter Bubbles?” Internet Policy Review, 31 Mar. 2016, policyreview.info/articles/analysis/should-we-worry-about-filter-bubbles.
  24. Pariser, Eli. “The Filter Bubble: What the Internet Is Hiding From You.” The Filter Bubble: What the Internet Is Hiding From You, Penguin Books, 2012, pp. 6–16
  25. Schmidt AL, Zollo F, Scala A, Betsch C, Quattrociocchi W. Polarization of the vaccination debate on Facebook. Vaccine. 2018 Jun 14;36(25):3606-3612. doi: 10.1016/j.vaccine.2018.05.040. PMID: 29773322.
  26. Thompson, D. (2021, January 29). Anti-Vaxxers wage campaigns Against covid-19 shots. Retrieved March 12, 2021, from https://www.webmd.com/vaccines/covid-19-vaccine/news/20210129/anti-vaxxers-mounting-internet-campaigns-against-covid-19-shots
  27. Ortiz-Sánchez, E., Velando-Soriano, A., Pradas-Hernández, L., Vargas-Román, K., Gómez-Urquiza, J. L., Cañadas-De la Fuente, G. A., & Albendín-García, L. (2020). Analysis of the Anti-Vaccine Movement in Social Networks: A Systematic Review. International journal of environmental research and public health, 17(15), 5394. https://doi.org/10.3390/ijerph17155394
  28. Roth, Y. (2020, May 11). Updating our approach to misleading information. Retrieved March 12, 2021, from https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html
  29. Cathcart, W. (2017, January). Continuing our updates to trending. Retrieved March 12, 2021, from https://about.fb.com/news/2017/01/continuing-our-updates-to-trending/
  30. Allred, Kristen. “The Causes and Effects of ‘Filter Bubbles’ and How to Break Free.” Medium, Medium, 13 Apr. 2018, medium.com/@10797952/the-causes-and-effects-of-filter-bubbles-and-how-to-break-free-df6c5cbf919f.
  31. Cusumano, Michael. (2021, January 15). Social Media Companies Should Self-Regulate. Now. Retrieved March 19, 2021, from https://hbr.org/2021/01/social-media-companies-should-self-regulate-now
  32. (2016, July 24). The Reason Your Feed Became An Echo Chamber — And What To Do About It. Retrieved March 12, 2021, from https://www.npr.org/sections/alltechconsidered/2016/07/24/486941582/the-reason-your-feed-became-an-echo-chamber-and-what-to-do-about-it
  33. Seneca, Christopher. (2020, September 17). How to Break Out of Your Social Media Echo Chamber. Retrieved March 18, 2021, from https://www.wired.com/story/facebook-twitter-echo-chamber-confirmation-bias/
  34. Garun, Natt. (2020, March 6). How to switch your Twitter feed to a chronological timeline. Retrieved March 18, 2021, from theverge.com/2020/3/6/21167920/twitter-chronological-feed-how-to-ios-android-app-timeline/