Difference between revisions of "Filter Bubble"

From SI410
Revision as of 15:26, 5 April 2017

A Filter Bubble is the isolation or dilution of outside information by algorithms. Filter bubbles commonly form on websites whose algorithms sort displayed content based on assumptions about user relevance, in an attempt to present only content estimated to interest the user. A frequently discussed consequence of these algorithms is that they may prevent users from being exposed to opposing viewpoints and content that has been deemed irrelevant, and over time transform a news feed into a reflection of the user's own ethnocentric ideologies[1]. The term was popularized by Eli Pariser's book The Filter Bubble, which discussed the effects of personalization on the internet as well as the dangers of ideological isolation[2].

Abstract

The goal of a relevance algorithm is to learn a user's preferences in order to curate the content on their feed, so that each feed is personalized and unique to its user. Microsoft researcher Tarleton Gillespie explains how algorithms "navigate massive databases of information, or the entire web. Recommendation algorithms map our preferences against others, suggesting new or forgotten bits of culture for us to encounter. Algorithms manage our interactions on social networking sites, highlighting the news of one friend while excluding another's. Algorithms designed to calculate what is 'hot' or 'trending' or 'most discussed' skim the cream from the seemingly boundless chatter that's on offer. Together, these algorithms not only help us find information, they provide a means to know what there is to know and how to know it, to participate in social and political discourse, and to familiarize ourselves with the publics in which we participate"[3]. This style of curation fosters user engagement and activity within the site, and the effect snowballs: more user activity teaches the algorithm to select content more narrowly, and the cycle gradually isolates the user. While this may seem harmless, when it is drawn across political ideologies it can hinder a person's ability to see alternative points of view. Stanford University's overview of democracy holds that a citizen must "listen to the views of other people" and reminds readers not to be "so convinced of the rightness of your views that you refuse to see any merit in another position. Consider different interests and points of view"[4]. Thus, to facilitate a constructive democracy, it is important to recognize alternative views.
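The feedback loop described above can be sketched as a toy simulation. This is hypothetical illustrative code, not any platform's actual algorithm: a two-topic feed that always shows whichever topic it currently scores as more relevant, and nudges that score up or down depending on whether the (stylized, one-sided) user clicks.

```python
def simulate_feed(rounds=50, learning_rate=0.1):
    """Toy relevance loop: the feed shows the higher-scored topic each
    round and reinforces that score on a click. Topic names and the
    update rule are invented for illustration."""
    # Initial belief: the user is equally interested in both topics.
    relevance = {"politics_left": 0.5, "politics_right": 0.5}

    # Stylized user who only ever clicks one side.
    def user_clicks(topic):
        return topic == "politics_left"

    shown = {"politics_left": 0, "politics_right": 0}
    for _ in range(rounds):
        # Show the topic currently estimated to be most relevant.
        topic = max(relevance, key=relevance.get)
        shown[topic] += 1
        if user_clicks(topic):
            # A click reinforces the estimate (clamped to [0, 1]).
            relevance[topic] = min(1.0, relevance[topic] + learning_rate)
        else:
            # Ignored content decays in estimated relevance.
            relevance[topic] = max(0.0, relevance[topic] - learning_rate)
    return shown

# After 50 rounds the feed has shown only one side: the "bubble".
print(simulate_feed())  # → {'politics_left': 50, 'politics_right': 0}
```

Even this crude model shows the snowballing dynamic: a single early click breaks the tie, and from then on the other topic is never shown, so its score can never recover.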

Social Media as News Media Platforms

In 2016, the Pew Research Center collected data on how modern Americans use social media. They found that 62% of adults turn to social media sites for news: 18% do so often, 26% sometimes, and 18% hardly ever[5]. In additional research, Facebook was the most common outlet for Millennials to get news, at 61%, with CNN second at 44%. Millennials rely on Facebook for political news almost as much as Baby Boomers rely on local TV for news (60%). Among Generation X, a majority of 51% turn to Facebook for news[6]. Another major influence on political discourse is the extent of political content on Facebook. With 9-in-10 Millennials on Facebook, along with 77% of Gen Xers and 57% of Baby Boomers, much of the content they see is political: 24% of Millennials say half or more of their Facebook content is political, while 66% say it is more than none but less than half. For Gen Xers, 18% say half or more and 71% say less than half but more than none of their content is political[7].

Discourse

In a Wired article, Mostafa El-Bermawy shared his findings on the 2016 United States presidential election. Using Google Trends, Google AdWords, and other social media analytics software, El-Bermawy compared numbers for Donald Trump and Hillary Clinton. He found that Trump's social media presence heavily outweighed Clinton's, averaging more shares per post and more overall followers. The author pointed out that the second most popular article shared on social media had earned 1.5 million shares, yet he had never seen or heard of it. El-Bermawy asked friends around the New York area, only to find the same unfamiliarity. He believes this is the work of filter bubbles, which had filtered out a popular article that contested his liberal views[8].

Recent political campaigns have pushed filter bubbles into mainstream discourse. Some argue they are a driving factor behind upset political victories such as Brexit and Trump's campaign[9]. Social media sites rely on relevance-based algorithms to sort displayed content[10], and 62% of American adults now receive news from social media[11]. As media platforms, these sites control the flow of information and political discourse, isolating users in their own cultural or ideological convictions. This became apparent in the Wall Street Journal's article "Red Feed, Blue Feed", which placed liberal and conservative Facebook feeds side by side[12].

Quotes

“Your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.” ― Eli Pariser[13]

References

  1. http://www.npr.org/sections/alltechconsidered/2016/07/24/486941582/the-reason-your-feed-became-an-echo-chamber-and-what-to-do-about-it
  2. https://www.youtube.com/watch?v=SG4BA7b6ORo
  3. http://www.tarletongillespie.org/essays/Gillespie%20-%20The%20Relevance%20of%20Algorithms.pdf "The Relevance of Algorithms"
  4. https://web.stanford.edu/~ldiamond/iraq/WhaIsDemocracy012004.htm "What is Democracy?"
  5. http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/
  6. http://www.journalism.org/2015/06/01/facebook-top-source-for-political-news-among-millennials/
  7. http://www.journalism.org/2015/06/01/facebook-top-source-for-political-news-among-millennials/
  8. https://www.wired.com/2016/11/filter-bubble-destroying-democracy/
  9. https://www.theguardian.com/media/2017/jan/08/eli-pariser-activist-whose-filter-bubble-warnings-presaged-trump-and-brexit
  10. https://cs.illinois.edu/sites/default/files/files2/sigir11_cwang.pdf
  11. http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/
  12. "Red Feed, Blue Feed"
  13. The Filter Bubble: What the Internet is Hiding From You

More to come. [Kennedy Kaufman]