Bias in Information


Bias in information occurs when searches for the same information produce varied results, leading to differing interpretations of those results. When users search for information, they are searching for “the resolution of uncertainty” [1]. Searchers may face confusion as they try to make sense of their results, which can produce discrepancies in their knowledge of a topic. Filtering results in a specific way, so that only certain pieces of information are accessible to an observer, can drastically change the value and meaning of the content provided. Online search engines enable the persistence of biased information.[2] The prevalence of bias in information among search engines has led to ethical concerns regarding privacy, the filtering of search results, and the types of bias that may arise among social groups as a result.

Bias in searching for information[3]

Types of Bias

There are different types of bias in information, which can be grouped into four main categories: general bias, research bias, news bias, and prejudices. General bias includes confirmation bias and groupthink/bandwagon bias. Research bias consists of selection bias, anchoring bias, response bias, and non-response bias. The types of news bias include commercial bias, bad news bias, status quo bias, access bias, visual bias, fairness bias, narrative bias, expediency bias, glory bias, and spin. The types of prejudice include classism, racism, and sexism.

General Bias

Confirmation Bias

Confirmation bias occurs when users interpret information as confirmation of their existing beliefs. This is common online, particularly on media sites where users and publishers present only information that proves their points. [4] The concept is closely tied to the filter bubble: an algorithm that guesses what a user is willing to see on social media based on what they already agree with. This isolates other perspectives, leaving the user with little information about opposing views and filtering out what the user would not want to see. [5] Confirmation bias can lead to self-reinforcing modes of thinking that inhibit civil discussion or debate, forming an echo chamber of similar thoughts that feeds a larger phenomenon: groupthink.

Groupthink/Bandwagon Bias

Groupthink, or bandwagon bias, may occur in settings where large groups of people share a common motive for coming together. Out of fear of becoming isolated from the group, participants typically try to maintain a harmonious environment, so they refrain from sharing their honest opinions on controversial decisions. [4] Groupthink bias is sometimes invoked to explain strongly differing points of view, such as conservatism and liberalism, where members of a group conform to others' ideas in order to feel like a member. [6] Pearl Harbor is one of the classic examples of groupthink. While measures were taken to warn officers at Pearl Harbor, the warnings were not taken seriously. There was a broad consensus that the Japanese would never dare to attack Pearl Harbor, and this bias led the United States Army and Navy to overlook the possibility of attack. [7]

Research Bias

Selection Bias

Selection bias is common in research, where researchers decide how many and what kinds of participants to study. This can result in a non-random sample, which makes it nearly impossible to validate the findings of the research. [4] Researchers can tailor how they select participants so as to yield results in line with their own bias. There are several subcategories of selection bias. Sampling bias is the unintentional exclusion or underrepresentation of certain groups in a study; when certain groups are not equally represented, or not represented at all, in a non-random sample, the results of the study are skewed. [8] Time-interval studies also cause selection bias when researchers end a trial once they are satisfied with the results, thereby failing to include later results in the conclusion. Susceptibility bias is another example of selection bias: in studies of disease, links may be drawn between a treatment and an outcome that are actually explained by differences in patients' underlying susceptibility. [9]

Anchoring Bias

Anchoring bias occurs when users or researchers rely on a single piece of information to make subsequent decisions. In reality, researchers should weigh a sufficient range of information when deciding which information is best, in the hope of avoiding bias. Once an anchor is set, users continue to base their actions and decisions on it, and it is particularly difficult to remove once established. [4] Several factors influence whether anchoring occurs, including mood, experience, personality, and cognitive ability. Anchoring bias is often seen in negotiations, such as in business. For example, during real estate negotiations, knowing the asking price influences buyers for the rest of the buying process. [10] Studies have also found that the more specific the asking price is at the beginning of a negotiation, the smaller the increments tend to be during the negotiation process. [11]

Response Bias

Response bias refers to the wide range of tendencies that lead people to answer a survey or questionnaire inaccurately[12]. Response biases can drastically affect the validity of surveys or questionnaires. Social norms, the wording of a particular question, or a participant's desire to answer in a way that would confirm the researchers' hypothesis are just a few of the possible causes of response bias. Characteristics that can cause response bias include unfamiliar content, fatigue, and faulty recall. Similarly, a person's own attitudes, behaviors, and personal traits can skew the results of a study. A misleading question can also result in the misinterpretation of a participant's response. For example, if a survey offers only the options "very likely, likely, and not at all," respondents are left without choices such as "not likely" or anything in between. The bias in this example is that the participant is given more positive response choices than negative ones, so the chances are higher that they will select a positive answer, skewing the results of the survey. [13]
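
To make this concrete, the skew can be simulated. The following sketch (in Python, with an invented 50/50 population and invented response behavior) shows how the unbalanced scale above inflates the apparent positive rate:

```python
import random

# Hypothetical example: respondents with a true 50/50 split of opinion are
# forced onto an unbalanced scale with two positive options and one negative.
def respond(true_positive: bool) -> str:
    """Map a respondent's true opinion onto the unbalanced scale."""
    if true_positive:
        return random.choice(["very likely", "likely"])
    # A mildly negative respondent has no "not likely" middle ground;
    # assume half round up to "likely" rather than the extreme option.
    return random.choice(["likely", "not at all"])

random.seed(0)
answers = [respond(i % 2 == 0) for i in range(10_000)]  # true split: 50% positive
positive = sum(a in ("very likely", "likely") for a in answers) / len(answers)
print(f"Apparent positive rate: {positive:.1%}")  # ~75%, not the true 50%
```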

Non-Response Bias

Non-response bias occurs when the results of surveys, questionnaires, or elections become inaccurate because too many subjects do not participate.[14] The subset of the population that did participate is no longer representative of the target population, because there are too few participants to gather reliable data. The most commonly recommended protection against non-response bias is to reduce the amount of non-response itself[15]. This type of bias is often seen with mailed surveys, which receive very few responses, and this can alter the results of the survey. [16]
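
A similar back-of-the-envelope sketch (all numbers invented for illustration) shows how non-response distorts a mailed survey's estimate when satisfied recipients respond more often than unsatisfied ones:

```python
# Hypothetical population: 60% satisfied (score 1), 40% unsatisfied (score 0).
# Assume satisfied people mail the survey back 50% of the time, unsatisfied 10%.
n_satisfied, n_unsatisfied = 6000, 4000
rate_satisfied, rate_unsatisfied = 0.50, 0.10

responders_satisfied = n_satisfied * rate_satisfied        # 3000 responses
responders_unsatisfied = n_unsatisfied * rate_unsatisfied  # 400 responses

true_mean = n_satisfied / (n_satisfied + n_unsatisfied)
observed_mean = responders_satisfied / (responders_satisfied + responders_unsatisfied)

print(f"True satisfaction:     {true_mean:.1%}")      # 60.0%
print(f"Survey-based estimate: {observed_mean:.1%}")  # 88.2% -- badly inflated
```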

News Media Bias

News bias, or media bias, is the bias of journalists or news organizations within the mainstream media in how events and stories are selected and reported. Bias in reports influences the receivers of the information by favoring the opinions of the journalist presenting the story. Major news networks often align with a specific political party or belief, thereby filtering both their content and their audience.

Commercial Bias: Commercial bias stems from the pressure for news to remain "new." Stories that have already been reported are treated as "old," so outlets rarely double-check their sources in the rush to publish. This creates bias in what is released as "new" content. [4]
Bad News Bias: Bad news bias occurs when news outlets highlight stories that are scary or threatening in order to generate more views. News providers attempt to pique viewers' interest with shocking stories to benefit themselves, creating a bias toward alarming stories while less alarming ones go unshared. [4]
Status Quo Bias: Status quo bias is the preference people have for things to stay the same, which causes news outlets to stick to their typical routine. This bias stems from people fearing the consequences of changing their preferences to something "new."[17] News outlets exhibit status quo bias by reporting on the same types of stories to avoid losing viewers.
Access Bias: Journalists may compromise the transparency of the news in order to gain access to powerful people as story sources. This biases the information outlets report, since they are leveraging the power of well-known public figures.
Visual Bias: Stories with a visual hook are more likely to attract a larger audience, so news outlets focus on stories with visual appeal. [18] This leads to bias because these types of stories receive more coverage than they would otherwise merit.
Fairness Bias: Fairness bias occurs when reporters present opposing viewpoints in order to seem "fair," regardless of their own opinions. This bias is most prevalent in political reporting. [18] It fosters the idea that politicians are always in opposition and can never agree, and can shade coverage against one party or another.
Narrative Bias: News outlets present a story as a narrative with a beginning, middle, and end. However, many real-life news stories are reported before there is a final ending, so viewers get the main events with only a limited conclusion. Journalists try to tackle the problem by inserting a provisional ending, making reports seem more conclusive than they actually are; the focus is on ending the story neatly rather than telling viewers how it actually ends. There is also bias that comes from being employed by an agency with its own political stances on issues. This type of bias pushes reporting toward drama within the storyline, since drama generally makes for more interesting stories and more viewers.[18]
Expediency Bias: Expediency bias occurs when news outlets favor information that can be obtained quickly, easily, and inexpensively.[18] News outlets are extremely competitive and seek to report information that seems attractive and appealing to a large audience, which biases coverage toward whatever is fastest to obtain. Reporters should therefore fact-check their sources and consult additional resources to ensure credibility.
Glory Bias: Glory bias appears when news reporters insert themselves into the story they are reporting. [18] It leads journalists to cultivate an identity as a knowledgeable insider, when they should instead observe and keep track of the details so the story is reported without bias.
Spin: "Spin" involves emphasizing certain aspects of a news story in the hope that other aspects will be ignored.[19] It matters because the way a subject is presented can be taken as "the truth" regardless of the underlying facts.

An example of spin: when house prices fall, coverage frames it as bad for "sellers"; when prices rise, coverage frames it as bad for "buyers." The same market is presented negatively either way, depending on the frame chosen. [20]

Prejudices

Classism

Classism is discrimination on the basis of social class. It is a set of practices and beliefs reinforced by institutions, cultures, and individuals that assign differential values to people according to their socioeconomic class. This type of discrimination often benefits those in the upper class at the expense of those in the lower class.

Racism

Racism is the belief that a particular race is superior or inferior to another race.

Sexism

Sexism is discrimination based on a person's sex or gender. It particularly affects women and girls who are often depicted as weaker or inferior due to their gender.

Search Engine Results

The First 10 results

A search engine can return thousands of results as a ranked list in which the top items are considered the most "relevant" pages for the user's query. If the top results do not provide what the user is searching for, the user will repeatedly refine the query until they find exactly what they are looking for. As a result, the same few links may repeatedly rank highest from query to query, excluding a number of important, potentially opposing pieces of content. One example is Safiya Umoja Noble's research into the misrepresentative results returned for the search term "black girls".[21] In the context of news and media, this can feed a number of self-fulfilling biases and discrimination.

Information Overload

Information overload occurs when a person is overwhelmed by the amount of information they are trying to process. The amount of information readily available to the public has only increased over time.[22] Information overload is exhibited in the thousands of results returned by search engines, which make it practically impossible for an average user to parse all of the available information. Bias in information is relevant here because the average searcher cannot process and organize all of that information themselves; they have to trust the results the search engine provides. The situation parallels libraries and museums: no one person could read every book in an extensive library or fully study every aspect of a museum. Excessive information can prevent a user from understanding a topic and thus from making an informed decision.

Search Engines

A search engine is a software system designed to carry out a web search on a query or phrase provided by a user. Search results can include many different types of media, such as articles, documents, images, videos, and infographics. Search engines provide easy access to information that previously required visiting specific locations such as libraries and museums, and they are the most common way of finding information today. Google, for example, processes 40,000 queries a second [23], which works out to roughly 3.5 billion searches a day and 1.2 trillion searches a year.
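
Those figures are mutually consistent, as a quick arithmetic check confirms:

```python
queries_per_second = 40_000
per_day = queries_per_second * 60 * 60 * 24  # 3,456,000,000 -- about 3.5 billion
per_year = per_day * 365                     # about 1.26 trillion
print(f"{per_day:,} per day, {per_year:,} per year")
```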

How Search Engines Work


A search engine can provide thousands of results in seconds because of the work it does in the background. There are three major steps: web crawling, indexing, and the ranking algorithm the search engine applies[24]. A web crawler searches the World Wide Web to discover documents to add to the search engine's collection; every time a document is updated or a new document is found, the crawler adds a copy of it to the collection, which the search engine keeps in a data center. This collection can then be organized and searched based on what a user is looking for. The algorithm is how the search engine decides to order the documents so as to provide the user with a ranked set of relevant results. Conceptually, a page's ranking is based on how many other pages link to it, weighted by how popular each of those linking pages is; for example, if many pages cite Wikipedia as their source, Wikipedia's rank goes up. This ranking protocol is referred to as PageRank. [25] Before results can be returned, a user must submit a query for the engine to compute results against. Because pages contribute to each other's rank, this network does not inherently correct for circular ranking, and hence tends to promote information silo-ing.
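
The PageRank idea can be illustrated with a minimal power-iteration sketch (the four-page link graph and the standard damping factor are chosen purely for illustration; production systems differ enormously in scale and detail):

```python
# Toy link graph: each key links to every page in its list.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],  # D links to C, raising C's rank; nothing links to D.
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += damping * share  # rank flows along outgoing links
        rank = new
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
# C ranks highest: the most (and best-ranked) pages link to it. Note the
# A <-> C cycle: pages in a loop keep reinforcing each other's rank, which is
# the "circular ranking" the section mentions.
```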

Search Engine Optimization (SEO)

Search engine optimization refers to the process of improving a website or web page's visibility in organic (unpaid) search results. SEO attempts to improve search engine rankings, especially on Google. It is a major internet marketing strategy and involves identifying the most useful keywords and search terms typed into search engines in order to produce a higher-ranked result for a given website.
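
As a toy illustration of the keyword-research step (the query log below is invented, and real SEO tooling is far more sophisticated), one can count which terms recur across the queries a site hopes to rank for:

```python
from collections import Counter

# Hypothetical log of queries a site owner wants to rank for.
queries = [
    "best running shoes",
    "running shoes for flat feet",
    "cheap running shoes",
    "trail running shoes review",
]

terms = Counter(word for q in queries for word in q.split())
print(terms.most_common(3))  # [('running', 4), ('shoes', 4), ('best', 1)]
```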

Popular Search Engines

Google

Google is the world's largest and most popular search engine, dominating the market with over 75% share. Major reasons for its popularity include its ease of use, name recognition, and innovative algorithms that produce personalized search results. Google has begun to face backlash over the perception that its results are now shaped more by paid searches and targeted content than by organic results ranked on quality or trustworthiness [26].

It is difficult for a company like Google to balance the ethical implications of what it shows in its searches against being a successful company trying to make money. Google appears to be transparent about its search process, with details on how its search engine works listed on its website [27]. Users should remember that Google is a for-profit company and is under no obligation to provide results that are totally unbiased. It must balance being a great tool with making money, and leaning too far in either direction risks losing users; for their part, users should keep in mind that whatever sits at the top of a Google search is not necessarily the best source.

Bing

Another popular, large brand among search engines is Bing. Bing was created by Microsoft in 2009 in response to Google's increasing dominance of the search engine market, and as an improvement upon Microsoft's previous search engine, MSN Search. Although Bing has never been as large as Google, many experts consider it similar in quality, and its video search is arguably better. The differences in results between Google and Bing are interesting to examine, since the two are competitors and employ similar approaches to constructing a search engine. Expert comparisons have shown that the results are broadly similar, but each engine's own formula produces different top results [28].

YouTube

YouTube is the largest search engine for video content. With over 1.5 billion users logging in each month, YouTube has enormous influence over which videos people watch and what information spreads. Many people do not think of YouTube first when search engines are discussed, but its enormous database of videos makes it a major channel through which information travels. YouTube matters in a discussion of bias in information because the construction of its algorithms plays a major role in what information its users consume [29]. Like Google, YouTube has faced ethical dilemmas as it tries to balance keeping people watching with keeping what they watch varied. An article in the New York Times examined a recommendation algorithm YouTube employed that pushed users toward increasingly radical videos: the intent was to serve videos that would keep users on YouTube, but the result was users being fed extremely graphic and intense content. [30]

DuckDuckGo

DuckDuckGo is a search engine that, while not as popular as Google, is unique in its approach to fielding search results. Unlike Google, DuckDuckGo does not collect data on its users or track their search history in order to shape results. By utilizing crowdsourcing, DuckDuckGo aims to provide the highest-quality search results rather than ones targeted at or manipulated for the user [31]. It is an example of an alternative approach to the search engine that is working well: in 2018 it eclipsed 9 billion searches, and its usage tripled between 2015 and 2018 [32]. A trajectory like this hints at a possible shift in what people want from their search engines, and at the value users place on unbiased results.

Amazon

As the world's largest online marketplace, Amazon is a major search engine and holds considerable power over what users believe are the best products and the most reliable sellers. With over 2 million sellers, many are left out of searches, and since Amazon is a for-profit company, its search results are likely biased toward its own benefit [33]. Although Amazon may not be malicious in intent, it is always worth considering the background of a search result, especially on e-commerce websites.

Ethical Concerns

Searching for information is part of today's reality, and the process inevitably raises ethical concerns. These concerns stem from the bias involved in search engine design, the filtering of results, and the privacy of the user.

Privacy

Along with the process of finding optimal results, a search engine tracks certain information about a user: the time and date of each query, along with the IP address it came from, are all stored. Although unlikely, pooling queries from the same IP address can produce a list of searches by a specific user[34]. According to David Shoemaker's Self-exposure and exposure of the self, privacy is breached when information associated with a user's identity is used outside of their control. Moreover, it is not necessarily individual pieces of data that form part of one's self-identity; the patterns extracted through data mining often do[35]. Using IP addresses to reconstruct a user's searches is exactly this kind of patterning and a serious breach of privacy. In addition, the IP address shared with the search engine is that of the local router, which provides specific geolocation information. Geolocation can be used against users in specific scenarios; in China, for example, the use of Google is prohibited and different search engines are provided.[36]
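
The patterning concern can be sketched directly (the log format and entries below are hypothetical): even without any names, grouping a query log by IP address reassembles individual search histories.

```python
from collections import defaultdict

# Hypothetical "anonymized" query log: (ip_address, query) pairs.
log = [
    ("203.0.113.7", "knee pain causes"),
    ("198.51.100.2", "cheap flights paris"),
    ("203.0.113.7", "knee surgery cost"),
    ("203.0.113.7", "orthopedic clinics ann arbor"),
]

history = defaultdict(list)
for ip, query in log:
    history[ip].append(query)

# One IP's queries, read together, sketch a health condition and a location --
# a pattern that no single query reveals on its own.
print(history["203.0.113.7"])
```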

Along with the ability to ban specific phrases in certain locations, a search engine uses past searches and previously viewed documents as part of its algorithms. When a document is viewed frequently, its ranking on the list of results increases because users find it relevant. YouTube and Netflix adopted recommender systems that conduct personalized information filtering using search and viewing history or tracking cookies. The methods companies use to gather this data are problematic because users are not informed, and the notifications that ask for users' consent are too vague or too hard for users to comprehend.[37]

Bias

Bias in algorithms[38]

Bias can be introduced at each step of the process because of the nature of search engines and how they produce results. In “Values in technology and disclosive computer ethics”, Brey discusses the idea that technology has “embedded values,” meaning that computers and their software are not “morally neutral” [39]. Computers can favor specific values because of their design and structure.

Brey identifies three types of bias that apply to search engines:

  • Preexisting Bias
  • Technical Bias
  • Emergent Bias

Preexisting bias occurs when values and attitudes exist prior to the development of the software. This kind of bias surfaces in the ordering of documents returned after a search: if the system's algorithm consistently favors certain documents over others, even without interference from outside sponsors, it may surface first the documents that reflect the values of the algorithm's creator.

Technical bias occurs due to the limitations of the software. Because of the nature of search engines and how people use them, it is impossible to display, and for people to view, every result. The documents that have been gathered can themselves have limitations: there may simply be more information available about some topics than others, which leads to bias.

Emergent bias occurs when the system is used in a way not intended by its designers. When a user enters a phrase, the wording matters: different words with the same meaning can carry different connotations and so produce different results.

Social Bias

Studies show that search engines reinforce many social biases and stereotypes. For example, Google Images has been criticized for the lack of diversity in its search results [4]. As another example, a search for the word “doctor” returns a significantly disproportionate male-to-female ratio. Even if there are currently more practicing male doctors than female doctors, Google Images showing very few photos of female doctors can reinforce harmful bias. This type of bias in search results can feed confirmation bias, discouraging minorities from pursuing their interests due to the lack of visible representation.

Another report shows that searching for "three black teenagers" returns a series of mugshot photos, while searching for "three white teenagers" returns images of smiling young adults [4]. Google search results are affected by preexisting bias, technical bias, and emergent bias, and results like these perpetuate societal stereotypes and values. Engineers and users must be mindful of the implications of these biases and work to overcome them in technology and society.

Stereotype and Discrimination Reinforcement

Consequences of social and preexisting bias in information can also implicitly undermine disadvantaged groups by associating them with negative traits and not associating them with positive traits. One research study found that people with African-American sounding names needed to send out more resumes than people with white-sounding names in order to get a callback.[40] This showcases the bias that is rampant in the corporate industry as people are deemed unworthy by the names on their resumes instead of the skills and experiences they possess. In machine learning and facial recognition algorithms created by large technology companies like IBM, Microsoft, and Amazon, women with darker skin were found to be misgendered about 1/3 of the time compared to their lighter-skinned counterparts.[41] This reinforces the detrimental notions that women with darker skin are less feminine.[42] This creates issues because instead of ensuring groups of people get equal treatment based on the merit of their individual characteristics, they get unfair assumptions attached to their identity based on phenotypical features.

Attempts at Inclusion

In 2018, Pinterest rolled out a feature that allows users to select their skin tone so that results can be better catered to them. The most commonly cited example is that when users search for "women hairstyles," the results lack diversity and predominantly show white women [43].

Pinterest engineer Laksh Bhasin wrote on Pinterest's blog that, in order to prevent further bias across the Internet, Pinterest prioritized users' privacy with this feature: to prevent targeted ads and to keep personal information (like ethnicity) from being collected, a user must select their skin tone anew in every search. [44]

User Bias

Bias in autocomplete suggestions for different search engines[45]

User input plays a huge role in the existence of bias in search engines. When users type the same term into the search box, different search engines like Yahoo, Google, and Bing give different autocomplete suggestions. For example, Google might produce positive search suggestions for the term "Hillary Clinton" while Yahoo and Bing might give negative ones. This kind of bias originates in the differing user bases and behaviors of the various search engines.[45] The majority of Bing users are between 55 and 64 years old, while Google's users skew much younger than those of Bing or Yahoo.[46] Beyond age, users' different economic, social, and cultural backgrounds contribute heavily to each engine's accumulated search history and observed search behavior.

Filtering Results

A search engine's algorithm shapes the types of results presented to individuals. To find the most relevant documents, it filters, identifies, and categorizes documents based on the queried subject. Because results arrive as a ranked list, it is impossible for all of them to be shown or viewed at once. Search results can also be influenced by advertisements, or by companies and sites sponsoring their own documents to be prioritized. This leaves certain documents with less attention, and thus less awareness of the information they contain.
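
A toy re-ranking sketch (the scores and the boost weight are entirely invented) shows how a sponsorship signal can push a less relevant document above an organically higher-ranked one:

```python
# Hypothetical (name, relevance, sponsored) triples for four documents.
docs = [
    ("organic-guide", 0.92, False),
    ("sponsored-ad", 0.70, True),
    ("forum-thread", 0.65, False),
    ("sponsored-review", 0.60, True),
]

SPONSOR_BOOST = 0.30  # invented weight, for illustration only

def score(doc):
    _, relevance, sponsored = doc
    return relevance + (SPONSOR_BOOST if sponsored else 0.0)

for doc in sorted(docs, key=score, reverse=True):
    print(f"{doc[0]:17s} {score(doc):.2f}")
# "sponsored-ad" (1.00) now outranks "organic-guide" (0.92),
# even though the organic page was judged more relevant.
```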

Filtering used to customize information raises privacy concerns and limits the information users are exposed to. For instance, Netflix's recommender system prioritizes information similar to what users have already watched or searched for. As Xavier Amatriain, the Director of Algorithms Engineering at Netflix, says, "over 75% of what people watch comes from our recommendations."[47] It is very likely that the similarity among the recommended information traps users in a loop that isolates them from the choice of accessing new information. [37]
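
A minimal sketch of this kind of similarity-based filtering (the genre vectors are toy values; real recommenders learn such representations from behavior at scale) shows how items closest to what a user already watched crowd out everything else:

```python
import math

# Hypothetical catalog: items described by (action, romance, documentary) weights.
catalog = {
    "Heist Night":  (0.9, 0.1, 0.0),
    "Car Chase II": (0.8, 0.0, 0.1),
    "Slow Letters": (0.1, 0.9, 0.0),
    "Ocean Planet": (0.0, 0.1, 0.9),
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    def norm(v):
        return math.sqrt(sum(x * x for x in v))
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (norm(a) * norm(b))

watched = catalog["Heist Night"]
recs = sorted(
    (title for title in catalog if title != "Heist Night"),
    key=lambda t: cosine(watched, catalog[t]),
    reverse=True,
)
print(recs)  # ['Car Chase II', 'Slow Letters', 'Ocean Planet'] -- more action first
```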

Filter Bubbles

Search engines and other services that use filtering algorithms to tailor results toward users' interests create "filter bubbles." Eli Pariser describes these search engines as "creating a unique universe of information for each of us." [48] Pariser coined the term filter bubble to describe this unique universe of information, and he argues that filter bubbles introduce three new dynamics to personalization:[48]

  1. Each user is alone in their filter bubble.
  2. The filter bubble is invisible.
  3. Each user does not choose to enter their respective filter bubble.

The implication is that every user of the filtering mechanisms of search engines or other services is unknowingly trapped in their own filter bubble of personalization. These users do not choose to enter the filter bubbles, yet search algorithms create the bubbles for them based on their search history.[48]

Filter bubbles provide the benefit of a personalized universe of information for each user, but they also carry negative consequences. As the filter bubble forms, it shows users only information similar to their previous interests, or suggests content it assumes the user will enjoy or engage with.[48] This can confine users' interests and restrict them from searching for anything new or different, leaving them unable to branch out and explore other content.[48]

Filter bubbles grow smaller and more precise over time. As filtering and search algorithms gather more data on their users, they are able to provide more accurate recommendations which in turn leads to more restricted search results. As the filter bubble grows smaller, the amount of unique content which each user sees diminishes as well.[48]
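
This contraction can be simulated with a toy model (the update rule and tightening rate are invented for illustration): each round, the filter shows only items near the user's profile, the profile drifts toward what was shown, and the visible slice of content shrinks.

```python
# Toy model: content items live on a 0..1 "viewpoint" axis. The filter shows
# only items within a window around the user's profile, and the window
# tightens as the system grows more "confident" in its personalization.
items = [i / 100 for i in range(101)]  # 101 items spread across the axis

profile, window = 0.5, 0.5
for step in range(5):
    shown = [x for x in items if abs(x - profile) <= window]
    profile = sum(shown) / len(shown)   # profile drifts toward what was shown
    window *= 0.6                       # invented tightening rate
    print(f"step {step}: {len(shown)} of {len(items)} items visible")
# The visible slice shrinks each step (101, 61, 37, ...) -- the bubble contracts.
```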


References

  1. “Information.” Wikipedia, Wikimedia Foundation, 9 Apr. 2019, en.wikipedia.org/wiki/Information.
  2. Snow, Jackie, MIT Technology Review, Bias already exists in search engine results, and it’s only going to get worse, https://www.technologyreview.com/s/610275/meet-the-woman-who-searches-out-search-engines-bias-against-women-and-minorities/, Feb 26 2018
  3. “Should the Google Search Engine Be Answerable To Competition Regulation Authorities?” Economic and Political Weekly, 7 Sept. 2018, www.epw.in/engage/article/should-google-search-engine-be.
  4. Ching, Teo Choong. “Types of Cognitive Biases You Need to Be Aware of as a Researcher.” UX Collective, 27 Sept. 2016, uxdesign.cc/cognitive-biases-you-need-to-be-familiar-with-as-a-researcher-c482c9ee1d49.
  5. “What Is a Filter Bubble? - Definition from Techopedia.” Techopedia.com, www.techopedia.com/definition/28556/filter-bubble.
  6. “Does Liberal Truly Mean Open-Minded?” Psychology Today, Sussex Publishers, www.psychologytoday.com/us/blog/real-men-dont-write-blogs/201103/does-liberal-truly-mean-open-minded.
  7. Janis, I. L. (1982). Groupthink: Psychological Studies of Policy Decisions and Fiascoes. Boston: Houghton Mifflin. ISBN 0-395-31704-5.
  8. “Sampling Bias.” Sampling Bias - Medical Definition from MediLexicon, www.medilexicon.com/dictionary/10087.
  9. Horwitz, R I, et al. “The Role of Susceptibility Bias in Epidemiologic Research.” Archives of Internal Medicine, U.S. National Library of Medicine, May 1985, www.ncbi.nlm.nih.gov/pubmed/3994467.
  10. Northcraft, Gregory B; Neale, Margaret A (1987). "Experts, amateurs, and real estate: An anchoring-and-adjustment perspective on property pricing decisions". Organizational Behavior and Human Decision Processes. 39 (1): 84–97. doi:10.1016/0749-5978(87)90046-X
  11. Janiszewski, Chris; Uy, Dan (2008). "Precision of the Anchor Influences the Amount of Adjustment". Psychological Science. 19 (2): 121–127. doi:10.1111/j.1467-9280.2008.02057.x. PMID 18271859
  12. "Response Bias". Wikipedia, Wikimedia Foundation, 18 April 2019. en.m.wikipedia.org
  13. “Response Bias: Definition and Examples.” Statistics How To, 12 Oct. 2017, www.statisticshowto.datasciencecentral.com/response-bias/.
  14. "Participation Bias". Wikipedia, Wikimedia Foundation. 19 April 2019.
  15. "Estimating Nonresponse Bias in Mail Surveys". Armstrong, J. Scott. Journal of Marketing Research Vol14. No. 3. Special Issue. 19, April 2019
  16. “Stat Trek.” Nonresponse Bias: Definition, stattrek.com/statistics/dictionary.aspx?definition=nonresponse_bias.
  17. “Status Quo Bias.” Behavioraleconomics.com | The BE Hub, www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/status-quo-bias/.
  18. “Media / Political Bias.” Rhetorica, rhetorica.net/bias.htm.
  19. " Allen, Dr. Steven J. “Deception and Misdirection - Media Bias: 8 Types [a Classic, Kinda].” Capital Research Center: America’s Investigative Think Tank, 24 Nov. 2015, capitalresearch.org."
  20. " Allen, Dr. Steven J. “Deception and Misdirection - Media Bias: 8 Types [a Classic, Kinda].” Capital Research Center: America’s Investigative Think Tank, 24 Nov. 2015, capitalresearch.org."
  21. Noble, Safiya Umoja. “Critical Surveillance Literacy in Social Media: Interrogating Black Death and Dying Online.” Black Camera, vol. 9, no. 2, 2018, p. 147., doi:10.2979/blackcamera.9.2.10.
  22. “Information Overload.” Wikipedia, Wikimedia Foundation, 28 Mar. 2019, en.wikipedia.org/wiki/Information_overload.
  23. “Google Search Statistics.” Google Search Statistics - Internet Live Stats, www.internetlivestats.com/google-search-statistics/.
  24. “How Do Search Engines Work? - BBC Bitesize.” BBC News, BBC, 23 Oct. 2018, www.bbc.com/bitesize/articles/ztbjq6f.
  25. “PageRank.” Wikipedia, Wikimedia Foundation, 10 Apr. 2019, en.wikipedia.org/wiki/PageRank.
  26. Davies, Dave. “The 7 Most Popular Search Engines in the World.” Search Engine Journal, 7 Jan. 2018, www.searchenginejournal.com/.
  27. https://www.google.com/search/howsearchworks/
  28. Gordon, Whitson. “Search Engine Showdown: Google vs. Bing.” Lifehacker, Lifehacker, 1 Nov. 2015, lifehacker.com/.
  29. Davies, Dave. “The 7 Most Popular Search Engines in the World.” Search Engine Journal, 7 Jan. 2018, www.searchenginejournal.com/.
  30. Tufekci, Zeynep. “Youtube, the Great Radicalizer.” The New York Times, The New York Times, 10 Mar. 2018, www.nytimes.com/.
  31. Burgess, Matt, and Victoria Woollaston. “DuckDuckGo: What Is It and How Does It Work?” WIRED, 1 Feb. 2017, www.wired.co.uk/.
  32. Schwartz, Barry. “DuckDuckGo Broke 9 Billion Searches in 2018, and It’s Growing.” Search Engine Land, 4 Jan. 2019, searchengineland.com/.
  33. “Number of Sellers on Amazon Marketplace.” Marketplace Pulse, www.marketplacepulse.com/.
  34. Weissman, Cale Guthrie. “What Is an IP Address and What Can It Reveal about You?” Business Insider, Business Insider, 18 May 2015, www.businessinsider.com/ip-address-what-they-can-reveal-about-you-2015-5.
  35. Shoemaker, David. Self-exposure and exposure of the self: informational privacy and the presentation of identity, 2009.
  36. Replacement of Google with Alternative Search Systems in China - Documentation and Screen Shots, cyber.harvard.edu/filtering/china/google-replacements/.
  37. Fröding, Barbro, and Martin Peterson. “Why Virtual Friendship Is No Genuine Friendship.” SpringerLink, Springer Netherlands, 6 Jan. 2012, link.springer.com/article/10.1007/s10676-011-9284-4.
  38. Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.
  39. Brey, Philip. “Values in Technology and Disclosive Computer Ethics (Chapter 3) - The Cambridge Handbook of Information and Computer Ethics.” Cambridge Core, Cambridge University Press, www.cambridge.org/core/books/cambridge-handbook-of-information-and-computer-ethics/values-in-technology-and-disclosive-computer-ethics/4732B8AD60561EC8C171984E2F590C49.
  40. Bertrand, M. & Mullainathan, S. (2004). Are Emily and Greg More Employable than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination. American Economic Review. https://www.nber.org/papers/w9873
  41. Buolamwini, Joy. “Artificial Intelligence Has a Racial and Gender Bias Problem.” Time, Time, 7 Feb. 2019, time.com/5520558/artificial-intelligence-racial-gender-bias/.
  42. Fahs, Breanne. "The dreaded body: disgust and the production of “appropriate” femininity." Journal of Gender Studies 26.2 (2017): 184-196.
  43. Gershgorn, Dave. “Pinterest Is Redesigning Biased Algorithms to Make Its Search More Inclusive.” Quartz, Quartz, 16 May 2018, qz.com/1278772/pinterest-introduces-filter-by-skin-tone-to-make-its-search-more-inclusive/.
  44. Pinterest Engineering. “Building a More Inclusive Way to Search.” Medium, Medium, 26 Apr. 2018, medium.com/@Pinterest_Engineering/building-a-more-inclusive-way-to-search-789f4c92fd73.
  45. ipullrank. “Dr. Epstein, You Don't Understand How Search Engines Work.” IPullRank, 14 Sept. 2016, ipullrank.com/dr-epstein-you-dont-understand-how-search-engines-work/.
  46. Sentance, Rebecca. “What Are the Differences in How Age Demographics Search the Internet?” UserZoom, 11 Dec. 2018, www.userzoom.com/blog/what-are-the-differences-in-how-age-demographics-search-the-internet/.
  47. Amatriain, Xavier. “Machine Learning & Recommender Systems at Netflix Scale.” InfoQ, 16 Jan. 2014, www.infoq.com/presentations/machine-learning-netflix.
  48. Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. The Penguin Press, New York, 2011.