Data Mining and Manipulation

[[File:3903662-3x2-340x227.jpg|300px|thumbnail|right|Algorithms are silently influencing the way people interact online]]
Advancements in computer algorithms have given society the tools to process and analyze data efficiently.  Unfortunately, algorithms have also presented a myriad of ethical and moral questions.  While the implementation of computer algorithms in software products provides real-time feedback that helps build better user interfaces and experiences, algorithms can also inflict harm, intended or not, on millions of users, as in the 2012 case in which Facebook manipulated its users' emotions.  This monumental Facebook case opens a discussion of the ethical issues of transparency and informed consent embedded in computer algorithms.
  
 
== Algorithm Development ==
Algorithms are mathematical and logical processes that serve as instructions for processing data or performing calculations.<ref> The algorithm, Antony Funnell, March 25, 2012, http://www.abc.net.au/radionational/programs/futuretense/the-algorithm/3901466 </ref> Algorithms have experienced major breakthroughs that have allowed them to replace traditional data mining technologies.  Data mining has historically relied on statistical modeling software such as [[wikipedia:Vertica|HP's Vertica Analytics Platform]] and powerful computers to process large amounts of data.  These traditional tools offered strong analysis, but at great cost: Vertica licenses ran nearly $100,000-$150,000 per terabyte, so even a modest 10-terabyte deployment could exceed a million dollars.<ref> Ex-Vertica CEO: Hadoop is pulling the rug from under the database industry, Derrick Harris, November 2, 2013, https://gigaom.com/2013/11/02/ex-vertica-ceo-hadoop-is-pulling-the-rug-from-under-the-database-industry/ </ref>  As a result, algorithms provide a resource-effective alternative that cuts back on cost, physical space, and time.
  
 
As the [[wikipedia:Internet of Things|"Internet of Things"]] multiplied drastically over the years, traditional data mining software and computers became inefficient.  As such, there was a growing need in commercial markets for new technologies that could process and analyze data quickly and cheaply.  [[wikipedia:Algorithms|Algorithms]], within the context of [[wikipedia:Mathematics|Mathematics]] and [[wikipedia:Computer Science|Computer Science]], were refined and found to alleviate some of these logistical problems in data mining.  Even though algorithms have existed since antiquity, their development and application on technological platforms is recent.  Their benefit is that they filter data automatically and in real time: developers can build them directly into an application or website and incorporate their analysis as the data arrives.
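As a concrete illustration, the following is a minimal sketch, in Python, of the kind of embedded real-time filter described above.  The record format and relevance threshold are hypothetical stand-ins chosen for this example, not any particular product's schema:

<syntaxhighlight lang="python">
# Minimal sketch of an embedded real-time filter. The algorithm runs inside
# the application and handles each event as it arrives, instead of
# batch-loading terabytes into a dedicated analytics platform.
# The record format and threshold below are hypothetical, for illustration.

def filter_stream(events, threshold=0.5):
    """Yield only the events whose relevance score passes the threshold."""
    for event in events:                      # `events` may be any live iterator
        if event.get("score", 0.0) >= threshold:
            yield event                       # forwarded downstream immediately

# Stand-in for a live feed:
sample = [{"user": "a", "score": 0.9}, {"user": "b", "score": 0.2}]
print(list(filter_stream(iter(sample))))      # -> [{'user': 'a', 'score': 0.9}]
</syntaxhighlight>

Because nothing is stored beyond the events that pass the filter, this approach trades the up-front cost of a warehouse platform for cheap incremental computation.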
 
== Applications ==

[[File:Edgerank-socialbakers.png|200px|thumbnail|left|Facebook News Feed Algorithm]]
Algorithms are widely used in consumer industries.  Financial companies have long used mathematical algorithms to determine portfolio investments, predict stock movements, and inform clients of possible future decisions.  Over the past decade, however, the social media site [[wikipedia:Facebook|Facebook]] has pioneered the data mining industry with its implementation of a variety of algorithms for collecting and processing data from 1.23 billion active users.<ref> Facebook passes 1.23 billion monthly active users, 945 million mobile users, and 757 million daily users, Emil Protalinski http://www.anderson.ucla.edu/faculty/jason.frand/teacher/technologies/palace/index.htm</ref>  It has especially pioneered [[wikipedia:machine learning|machine learning algorithms]] that analyze users' online behavior through status updates, comments, likes, and groups.  The most famous Facebook algorithm is the News Feed algorithm, which received the spotlight in the 2012 case.  By drawing patterns from the sustained usage of millions of users, Facebook designers and developers get feedback on design changes and functionality, which in turn helps Facebook create a more user-friendly social networking site that attracts and retains users.
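The News Feed ranking pictured here has been publicly described as "EdgeRank", usually summarized as scoring each story by user affinity, interaction weight, and time decay.  The sketch below is a simplified, hypothetical reconstruction of that published description, not Facebook's actual code; all weights and the decay rate are illustrative guesses:

<syntaxhighlight lang="python">
import math

# Hypothetical EdgeRank-style scorer: score = affinity * edge weight * time decay.
# The weights and decay rate are illustrative guesses, not Facebook's values.
EDGE_WEIGHTS = {"comment": 4.0, "like": 1.0, "share": 6.0}

def edge_score(affinity, edge_type, age_hours, decay_rate=0.05):
    """Score one interaction ('edge') between a viewer and a story."""
    weight = EDGE_WEIGHTS.get(edge_type, 1.0)
    decay = math.exp(-decay_rate * age_hours)   # older interactions count less
    return affinity * weight * decay

def rank_story(edges):
    """A story's rank is the sum of the scores of its edges."""
    return sum(edge_score(a, t, age) for a, t, age in edges)

# A recent comment from a close friend outranks a stale like from an acquaintance:
print(rank_story([(0.9, "comment", 2.0)]))   # high affinity, recent -> high score
print(rank_story([(0.2, "like", 48.0)]))     # low affinity, stale   -> low score
</syntaxhighlight>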
 
== 2012 Case of Data Manipulation ==

[[File:Facebook-eye-e1403978392750.jpg|200px|thumbnail|right|Monumental Facebook Ethics Case]]

=== Overview ===
In 2012, Facebook conducted an experiment involving its News Feed algorithm that would embroil the company in controversy.  A team of Facebook [[wikipedia:data scientist|data scientists]], led by [[wikipedia:Adam Kramer|Adam Kramer]], sought to determine whether skewing the "negative" or "positive" statuses, pictures, news, and comments a user saw would alter that user's own emotional behavior.<ref name=forbes> Facebook Manipulated 689,003 Users' Emotions for Science, Kashmir Hill, June 29, 2014, http://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science/#3705a9d0704d </ref> Would users flooded with inspirational quotes, ideas, pictures, and upbeat statuses be more likely to conform to the mood of their environment and even adopt the same mindset?  This team of data scientists set out on a research project, later published in [[wikipedia:PNAS|PNAS]], that altered the News Feeds of 689,003 users.  Facebook, without seeking consent from these "guinea pigs", had developers tweak the algorithm for these users.  Facebook reasoned that it did not need consent because the users had agreed to the terms and conditions outlined in Facebook's data use policy when they first registered on the site.<ref name=forbes /> The policy explicitly states that user information will be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement".<ref> Data Policy, January 30th, 2015 https://www.facebook.com/policy.php </ref>
 
The experiment involved the Facebook algorithm that generates what users see in the News Feed: developers modified its architecture to support the researchers' study.
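A minimal sketch of what such an experimental condition might look like is below.  The toy sentiment lexicon and omission rate are hypothetical stand-ins, not Facebook's code (the published study omitted a percentage of emotional posts from feeds rather than filtering deterministically):

<syntaxhighlight lang="python">
import random

# Hypothetical sketch of one experimental feed condition: posts classified as
# carrying the targeted emotion are omitted with some probability before the
# feed is rendered. The lexicon and omission rate are illustrative stand-ins.
POSITIVE_WORDS = {"great", "happy", "love"}
NEGATIVE_WORDS = {"sad", "awful", "hate"}

def sentiment(post):
    """Classify a post with a toy word-list lookup."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def experimental_feed(posts, suppress="positive", omit_rate=0.5, rng=random):
    """Return the feed with a fraction of `suppress`-sentiment posts omitted."""
    return [p for p in posts
            if sentiment(p) != suppress or rng.random() >= omit_rate]

feed = ["I love this!", "Such an awful day", "Meeting at noon"]
print(experimental_feed(feed, suppress="positive", omit_rate=1.0))
# -> ['Such an awful day', 'Meeting at noon']
</syntaxhighlight>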
The experiment was conducted between January 11th and 18th, 2012.<ref name=forbes />  Over the course of the week, the algorithm filtered these users' feeds to forward either strictly "positive" or strictly "negative" content.  The result, according to Adam Kramer, demonstrated emotional contagion: "When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred."<ref name=forbes />  Facebook concluded that the emotions exhibited by a user's community directly influence his or her own behavior on the social networking site, as the published PNAS paper states: "We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues."<ref> Experimental evidence of massive-scale emotional contagion through social networks, Adam Kramer, October 23, 2015, http://www.pnas.org/content/111/24/8788.full </ref>
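The study's outcome measure was the percentage of positive and negative words in participants' subsequent posts, scored against the LIWC word lists; the tiny lexicon in this sketch is a placeholder for those lists:

<syntaxhighlight lang="python">
# Sketch of the outcome measure: the percentage of a user's words that are
# positive or negative. The real study used the LIWC2007 lexicons; these
# tiny word sets are placeholders for illustration.
POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def emotion_rates(posts):
    """Return (percent positive words, percent negative words) across posts."""
    words = [w for p in posts for w in p.lower().split()]
    if not words:
        return 0.0, 0.0
    pos = 100.0 * sum(w in POSITIVE for w in words) / len(words)
    neg = 100.0 * sum(w in NEGATIVE for w in words) / len(words)
    return pos, neg

print(emotion_rates(["so happy today", "love this"]))   # -> (40.0, 0.0)
</syntaxhighlight>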
  
 
=== Public Perception ===
Public reaction to the study was initially one of widespread anger and alarm.  News media and journalists were quick to jump to conclusions, and most headlines framed the story as Facebook manipulating its users.  To this day, that framing defines the case's reputation.
  
 
=== Facebook's Response ===
Facebook responded to the initial criticism by focusing on privacy and data use rather than on the ethical dilemmas of emotional manipulation.<ref name=forbes />  Facebook maintained that the research used no data linked to a specific user's Facebook account.<ref name=forbes />  This statement is technically correct: Facebook did not pull a specific user's data, but rather altered the composite News Feed algorithm.  In short, Facebook has continued to maintain that research into how people react to emotional content is important for improving the social media site and making it as relevant and engaging as possible.
  
== Ethical Concerns ==

=== Informed Consent ===

Facebook conducted this study without consent because researchers found a loophole in its data use policy that allowed the team to proceed.  Many critics have been vocal in arguing that Facebook makes its own rules to follow; the fact that it manipulated its users' emotions has even been compared to mass human experimentation by the Nazis.  According to University of Maryland technology and law professor James Grimmelmann, a research study that subjects people to psychological changes is experimentation that undoubtedly requires informed consent.<ref name=slate> Facebook's Unethical Experiment, Katy Waldman, http://www.slate.com/articles/health_and_science/science/2014/06/facebook_unethical_experiment_it_made_news_feeds_happier_or_sadder_to_manipulate.html </ref>
  
In its research paper, Facebook seemingly justifies its Data Use Policy as consistent with the definition of informed consent.<ref name=slate />  Many critics counter that Facebook should have a consent process that actively notifies study participants rather than passively presuming their participation.
  
Sarah Tanksalvala of Thomson Reuters further argues that participants must be "given completely neutral circumstances in which they’re comfortable to leave before the study starts or at any time during it."<ref name="reuters">Facebook's study illustrates the need for informed consent, Sarah Tanksalvala, February 9, 2015, http://endnote.com/blog/facebooks-study-illustrates-need-informed-consent </ref> Essentially, participants must not feel pressured at any point during the study; if they do, Tanksalvala advocates that they be free to pull out of the research project.
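As a concrete illustration of the active, revocable consent critics call for, here is a minimal hypothetical sketch; the names and design are this article's invention, not any real platform's API:

<syntaxhighlight lang="python">
from dataclasses import dataclass, field

# Hypothetical active-consent registry: participants must explicitly opt in,
# and may withdraw at any time; eligibility is checked before every treatment.
@dataclass
class ConsentRegistry:
    consented: set = field(default_factory=set)

    def opt_in(self, user_id: str) -> None:
        self.consented.add(user_id)        # explicit, affirmative action

    def withdraw(self, user_id: str) -> None:
        self.consented.discard(user_id)    # free to leave at any point

    def eligible(self, user_id: str) -> bool:
        return user_id in self.consented   # no consent, no experiment

registry = ConsentRegistry()
registry.opt_in("user42")
print(registry.eligible("user42"))   # True: may be included in the study
registry.withdraw("user42")
print(registry.eligible("user42"))   # False: excluded from further treatment
</syntaxhighlight>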
 
=== Transparency ===
Facebook's behavior and decisions in this case raise major ethical concerns about transparency.  Facebook avoided obtaining informed consent by relying on a clause within its data use policy.  But even if Facebook could legally conduct this research, the way it went about doing so was not transparent.  In fact, Gregory McNeal of Forbes noted that Facebook added the word "research" to its data use policy months after it conducted the study.<ref> Facebook Manipulated User News Feeds To Create Emotional Responses, Gregory McNeal, June 28, 2014, http://www.forbes.com/sites/gregorymcneal/2014/06/28/facebook-manipulated-user-news-feeds-to-create-emotional-contagion/#33453f145fd8 </ref>  In effect, Facebook "cheated" the system and should, at minimum, have sent a notice to the users affected by its study.
  
=== Emotional Manipulation ===
The study touches on a very sensitive question: does a technology platform have the right to influence and manipulate users' emotions in the name of science?  Regardless of the answer, [[Luciano Floridi]], a well-known philosopher of technology, states in "The Ethics of Information Transparency" that there "may be a real need to disclose the ethical details of any process of information management" (Floridi, 2007, p. 111).
  
 
== See Also ==
 
<references/>
 
}}
[[Category: Information Ethics]]
[[Category: Privacy]]
[[Category: Concepts]]
