Filter Bubble

[Image: Example of a visual filter bubble.]

Filter Bubbles are pockets of information isolated by algorithms. They occur on websites where algorithms sort content by predicted relevance, presenting each user with the content deemed most interesting to them. Filter bubbles are often criticized for preventing users from being exposed to opposing viewpoints and to content deemed irrelevant, and for transforming a news feed, over time, into a reflection of the user's own ethnocentric ideologies[1]. The term first appeared in Eli Pariser's book The Filter Bubble, which discussed the effects of personalization on the internet as well as the dangers of ideological isolation[2].

Abstract

The goal of a relevance algorithm is to learn a user's preferences in order to curate the content on their feed. This allows a feed to be personalized, unique to each user. Microsoft researcher Tarleton Gillespie explains that algorithms "navigate massive databases of information, or the entire web. Recommendation algorithms map our preferences against others, suggesting new or forgotten bits of culture for us to encounter. Algorithms manage our interactions on social networking sites, highlighting the news of one friend while excluding another's. Algorithms designed to calculate what is 'hot' or 'trending' or 'most discussed' skim the cream from the seemingly boundless chatter that's on offer. Together, these algorithms not only help us find information, they provide a means to know what there is to know and how to know it, to participate in social and political discourse, and to familiarize ourselves with the publics in which we participate"[3]. This style of curation fosters user engagement and activity within the site, and it has a snowball effect: more user activity teaches the algorithm to select content more precisely, and over time the cycle isolates the user. While seemingly harmless, when this dynamic is drawn across political ideologies it can hinder a person's ability to see alternative points of view. Stanford University's overview of democracy states that a citizen must "listen to the views of other people" and reminds readers not to be "so convinced of the rightness of your views that you refuse to see any merit in another position. Consider different interests and points of view"[4]. Thus, to facilitate a constructive democracy, it is important to recognize alternative views.
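
Pariser and Gillespie describe this feedback loop in prose rather than code, but the narrowing dynamic is easy to sketch. The toy example below is purely hypothetical: the topic tags, items, and update rule are invented for illustration and are not any platform's real ranking system. It shows how click feedback pulls a ranked feed toward already-preferred content:

    from collections import defaultdict

    # Hypothetical content items tagged by topic; illustrative only.
    ITEMS = [
        {"id": 1, "topics": {"politics-left": 1.0}},
        {"id": 2, "topics": {"politics-right": 1.0}},
        {"id": 3, "topics": {"sports": 0.7, "politics-left": 0.3}},
        {"id": 4, "topics": {"science": 1.0}},
    ]

    def score(profile, item):
        # Predicted relevance = overlap between the user's learned
        # topic weights and the item's topic tags.
        return sum(profile[t] * w for t, w in item["topics"].items())

    def rank_feed(profile, items):
        # Most relevant content first, as a relevance algorithm would.
        return sorted(items, key=lambda it: score(profile, it), reverse=True)

    def register_click(profile, item, lr=0.5):
        # Each click nudges the profile toward the clicked item's topics;
        # this is the feedback loop that narrows the feed over time.
        for t, w in item["topics"].items():
            profile[t] += lr * w

    profile = defaultdict(float)       # a new user: every item ties at 0
    register_click(profile, ITEMS[0])  # one click on a left-leaning story
    print([it["id"] for it in rank_feed(profile, ITEMS)])  # [1, 3, 2, 4]

After a single click the ranking already favors ideologically similar content; repeated clicks only widen the gap.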

Social Media as News Media Platforms

In 2016, the Pew Research Center collected data on how modern Americans use social media. They found that 62% of adults turn to social media sites for news: 18% do so often, 26% sometimes, and 18% hardly ever [5]. In additional research, Facebook was the most common outlet for Millennials to get news, at 61%, with CNN second at 44%. Millennials rely on Facebook for political news almost as much as Baby Boomers rely on local TV (60%). Among Generation X, a slight majority, 51%, turn to Facebook for news [6]. Another major influence on political discourse is the amount of political content on Facebook. With 9 in 10 Millennials on Facebook, along with 77% of Gen Xers and 57% of Baby Boomers, much of the content they see is political: 24% of Millennials say half or more of their Facebook content is political, while 66% say it is more than none but less than half; among Gen Xers, 18% say half or more and 71% say less than half but more than none [7]. Facebook itself has claimed that more than 99% of the content users see is authentic, with the remainder being fake news or clickbait [8]. Social media algorithms are optimized for engagement and allow us to consume news differently than we used to: they present us with information shared by our own network and rarely with opposing viewpoints. By tailoring the feed to the people and pages one follows, this can ultimately isolate people in a stream that confirms their assumptions and suspicions about the world [9].

Discourse

In a Wired article, Mostafa El-Bermawy shared his findings on the 2016 United States presidential election. Using Google Trends, Google AdWords, and other social media analytics software, El-Bermawy compared numbers for Donald Trump and Hillary Clinton. He found that Trump's social media presence heavily outweighed Clinton's, averaging more shares per post and more overall followers. El-Bermawy noted that the second most popular article shared on social media had earned 1.5 million shares, yet he had never seen or heard of it. He asked friends around the New York area, only to find the same unfamiliarity. He attributes this to filter bubbles, which had removed from his feeds a popular article that contested his liberal views [10].

Recent political campaigns have pushed filter bubbles into mainstream discourse. Some argue they were a driving factor behind upset political victories such as Brexit and Trump's campaign [11]. Social media platforms rely on relevance-based algorithms to sort the content they display[12], and a majority of American adults, 62%, now receive news from social media[13]. As media platforms, these sites control the flow of information and political discourse, isolating users in their own cultural or ideological convictions. This became apparent in the Wall Street Journal's feature "Blue Feed, Red Feed," which places liberal and conservative Facebook feeds side by side[14]. The way we consume information has changed: social media curates and limits each news feed to the things that will not make its owner uncomfortable, isolating people in their own filter bubbles while creating profound confusion between the political sides. [15]

Quotes

“Your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.” ― Eli Pariser[16]

Ethical Implications

Online Personalization

Most people do not realize that their Google results are personalized based on their past searches. Users may begin to lose independence in their social media experience as their online identities are shaped by the filter bubbles they create for themselves through their searches. Beyond Google, many media outlets and social networks also personalize content for users based on their search history. [17] This personalization has both positives and negatives. For businesses, it offers increased relevance for consumers, more clicks on links, and ultimately increased revenue. [17] Users benefit because they can more easily find content relevant to them. Personalization may help with information overload, but it also may not expose users to diverse information, which raises the question of what force should determine who sees what, and whether such filtering is ethical in the first place. [17]
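
As a rough illustration of this kind of history-based personalization (a hypothetical sketch, not Google's actual system; the queries and results are invented), a personalizer might boost results whose words overlap the user's past queries:

    # Hypothetical search history and result set, for illustration only.
    PAST_QUERIES = ["climate policy", "renewable energy subsidies"]

    RESULTS = [
        "Ten classic pasta recipes",
        "New climate policy debate heats up",
        "Renewable energy subsidies expanded",
    ]

    def history_terms(queries):
        # Every word the user has previously searched for.
        return {word for q in queries for word in q.lower().split()}

    def personalized_rank(results, queries):
        terms = history_terms(queries)
        # Score each result by how many past-search terms it contains,
        # so pages echoing old interests float to the top.
        return sorted(
            results,
            key=lambda r: len(terms & set(r.lower().split())),
            reverse=True,
        )

    for result in personalized_rank(RESULTS, PAST_QUERIES):
        print(result)

Two users issuing the identical query but carrying different histories would see the same results in different orders, which is precisely the divergence Pariser warns about.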

Security

The information that algorithms collect about users, based on what and whom they search for and interact with in order to build filter bubbles, is not especially private. Prior searches resurface on social media sites as advertisements that anyone looking at the user's screen can see while they browse. This happens because most websites use browser cookies to track their users' behavior. A cookie is a piece of data sent from a server and stored on the user's computer by the user's browser; the browser then returns the cookie to the server the next time that page is requested. [18]
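
Concretely, the exchange happens through two HTTP headers: Set-Cookie (server to browser) and Cookie (browser back to server). The sketch below uses Python's standard library and a made-up tracking identifier to show both directions:

    from http.cookies import SimpleCookie

    # The server's first response carries a Set-Cookie header assigning a
    # tracking ID (the name and value are invented for illustration).
    set_cookie_header = "tracker_id=abc123; Path=/; Max-Age=31536000"

    # The browser parses and stores the cookie...
    jar = SimpleCookie()
    jar.load(set_cookie_header)

    # ...and on every subsequent request to that site it sends the cookie
    # back, letting the server recognize the visitor and link their visits.
    cookie_header = "; ".join(
        f"{name}={morsel.value}" for name, morsel in jar.items()
    )
    print("Cookie:", cookie_header)  # Cookie: tracker_id=abc123

Because the same identifier returns on every visit, an advertising network whose code is embedded on many sites can stitch those visits into a cross-site profile, which is how past searches resurface as advertisements elsewhere.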

Information Bias

Many social media algorithms are designed to display the content that users most likely want to see. For example, the Facebook News Feed algorithm was developed to keep people connected to the people, places, and things they want to be connected to. [19] It decides what content to display based on the people and content a user has previously interacted with. In doing so, the algorithm limits users' autonomy by deciding for them what appears in their news feeds. The displayed content becomes biased because many people surround themselves with like-minded individuals on their social media networks. When people are only exposed to content that reinforces their previously held beliefs, it becomes difficult for them to accept the legitimacy of opposing beliefs, ultimately enclosing them in their own cultural and ideological bubbles. [20]
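
Facebook does not publish its News Feed ranking code, so the sketch below is a deliberate caricature of just the prior-interaction signal described above; the friends, interaction counts, and posts are all invented. Posts from frequently engaged authors rank first, so an interaction-heavy circle of like-minded friends comes to dominate the feed:

    # Hypothetical counts of how often this user engaged with each friend.
    INTERACTIONS = {"alice": 42, "bob": 3, "carol": 17}

    POSTS = [
        {"author": "bob",   "text": "An op-ed from the other side"},
        {"author": "alice", "text": "An article we all agree with"},
        {"author": "carol", "text": "A meme from the group chat"},
    ]

    def rank_by_affinity(posts, interactions):
        # Authors the user engages with most float to the top; authors
        # the user ignores sink, regardless of what they wrote.
        return sorted(
            posts,
            key=lambda p: interactions.get(p["author"], 0),
            reverse=True,
        )

    for post in rank_by_affinity(POSTS, INTERACTIONS):
        print(post["author"], "->", post["text"])

Nothing here inspects the content itself: the dissenting op-ed sinks simply because the user rarely interacts with its author, which is exactly the bias described above.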

See Also

  • Facebook
  • Fake News
  • Social Networking

References

  1. http://www.npr.org/sections/alltechconsidered/2016/07/24/486941582/the-reason-your-feed-became-an-echo-chamber-and-what-to-do-about-it
  2. https://www.youtube.com/watch?v=SG4BA7b6ORo
  3. http://www.tarletongillespie.org/essays/Gillespie%20-%20The%20Relevance%20of%20Algorithms.pdf "The Relevance of Algorithms"
  4. https://web.stanford.edu/~ldiamond/iraq/WhaIsDemocracy012004.htm "What is Democracy?"
  5. http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/
  6. http://www.journalism.org/2015/06/01/facebook-top-source-for-political-news-among-millennials/
  7. http://www.journalism.org/2015/06/01/facebook-top-source-for-political-news-among-millennials/
  8. https://www.facebook.com/zuck/posts/10103253901916271
  9. https://medium.com/@tobiasrose/empathy-to-democracy-b7f04ab57eee
  10. https://www.wired.com/2016/11/filter-bubble-destroying-democracy/
  11. https://www.theguardian.com/media/2017/jan/08/eli-pariser-activist-whose-filter-bubble-warnings-presaged-trump-and-brexit
  12. https://cs.illinois.edu/sites/default/files/files2/sigir11_cwang.pdf
  13. http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/
  14. "Red Feed, Blue Feed"
  15. https://medium.com/@tobiasrose/empathy-to-democracy-b7f04ab57eee
  16. Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin Press, 2011.
  17. Catone, Josh. "Why Web Personalization May Be Damaging Our World View." Mashable, 3 June 2011. Retrieved 22 April 2017.
  18. "What Is an Internet Cookie?" HowStuffWorks, 1 Apr. 2000. Retrieved 16 Apr. 2017. <http://computer.howstuffworks.com/internet/basics/question82.htm>
  19. Backstrom, Lars. "News Feed FYI: Helping Make Sure You Don’t Miss Stories from Friends." Facebook Newsroom, 29 June 2016. Retrieved 16 Apr. 2017. <https://newsroom.fb.com/news/2016/06/news-feed-fyi-helping-make-sure-you-dont-miss-stories-from-friends/>
  20. Resnick, Paul. "Bursting Your (Filter) Bubble: Strategies for Promoting Diverse Exposure." CSCW '13 Companion, February 23-27, 2013, San Antonio, TX. New York: Association for Computing Machinery, 2013.