Cambridge Analytica

Cambridge Analytica (CA) was a political consulting firm specializing in strategic data analysis techniques used to influence the behavior of targeted audiences. The company was founded in 2013 and closed in 2018 following the Facebook–Cambridge Analytica data scandal[1]. By combining extensive personal consumer data records with human behavioral patterns and social influence methods, CA offered valuable insights to prospective clients, including businesses and political campaigns looking to use customized microtargeting.[2] 'Microtargeting' is a marketing technique that leverages a consumer's demographic and behavioral data to drive strategy.

Cambridge Analytica is most notably known for its association with President Trump's 2016 campaign and for its invasion of personal privacy through the exploitation of Facebook's platform. Cambridge Analytica used online personality quizzes that users self-selected to take; from a relatively small pool of quiz takers, it was able to collect not only information from those users' Facebook accounts but also data from their Facebook friends' accounts, creating a huge pool of data with insights into social and political ties. Nearly 87 million Facebook users had their personal data exposed and sold to Cambridge Analytica while the firm was contracted to work for the Trump campaign.[3] Since March 2018, when news of CA's dealings and practices first broke, attention to online data privacy and security has increased dramatically, and questions about how our personal data is accessed and used to influence our behavior without our consent have gained greater ethical urgency.

Origin & Timeline

[Image: Cambridge Analytica logo]

Connection to SCL

In 2005, Nigel Oakes founded the Strategic Communication Laboratories Group (SCL). Oakes, drawing on his significant background in human behavioral studies, sought to create a means of manipulating personal behavior in order to influence the results of political campaigns. SCL Group served as the parent company of Cambridge Analytica, performing data mining and analysis in order to generate communications targeting key audience groups according to client-specific goals. The company described itself as a "global election management agency." After success in the advertising space, SCL expanded into the military and political arenas, with the intent of approximating the "thinking" of its "target audience." SCL influenced elections in many countries and fomented coups around the world, while claiming that its methodologies were approved by the UK and US governments. Cambridge Analytica was formed specifically to influence the election process in the United States, and the group participated in 44 U.S. congressional, U.S. Senate, and state-level races in the 2014 midterm elections. In addition, CA's support of Ted Cruz and, later, Donald Trump, funded by hedge-fund billionaire Robert Mercer, is widely documented.

Cambridge Analytica Founding

In mid-2013, Christopher Wylie, who would become one of CA's co-founders, was investigating the British Liberal Democrats and their candidates' lack of election success. After some research, Wylie decided to look into using people's personality traits as tools for influencing their voting behavior. Soon after, Wylie was introduced to Alexander Nix, then CEO of SCL Elections, which morphed into Cambridge Analytica. Later that year, after hearing about SCL's (and CA's) powerful methods, Steve Bannon, who would go on to become a senior strategist to Trump, partnered with Robert Mercer, an elite Republican donor, and Mercer's daughter Rebekah to contact Nix and Wylie. Bannon and the Mercers sought to hire CA to aid Trump's campaign efforts by exploiting personal data obtained through Facebook applications. That summer, Aleksandr Kogan, the researcher who developed the applications used to obtain Facebook users' personal data, entered into a deal with SCL Group. Seeing the potential for success, Bannon and the Mercers invested millions. Later that year, Wylie left Cambridge Analytica. In early 2016, Nix reported that CA was responsible for Ted Cruz's victory over Trump in the Iowa caucuses. That summer, the Trump campaign hired CA.[4]

In March 2018, Wylie became a whistleblower against his former company, speaking out against what he admitted was an unethical and wrongful invasion of privacy and misuse of personal data by CA to help the Trump presidential campaign. The result was a public and media backlash against CA and Facebook, and renewed debate about personal privacy and data ethics.[3]

Although Cambridge Analytica shut down after news of its illegal actions broke, Trump's campaign does not appear to have lost touch with its people. An Associated Press article from June 2018 broke the news that a new "data and behavioral science company" was teaming up with Trump's 2020 re-election campaign. That new company, Data Propria, is run by former employees of Cambridge Analytica.[4]

Facebook–Cambridge Analytica data scandal

"how did Kogan use Facebook to harvest up to 87 million user profiles?"[5]

In March 2018, the name Cambridge Analytica entered the mainstream media after news of a major political scandal erupted. Nearly 87 million Facebook users were affected after Cambridge Analytica harvested personal information from their Facebook profiles without consent, and the data was then used for Trump's 2016 presidential campaign. The data was collected by Aleksandr Kogan, a Russian-American researcher who worked at the University of Cambridge, through a Facebook app. Facebook apps connect to Facebook's application programming interface (API), which at the time gave third-party developers access to a user's Facebook information once that user granted permission. Kogan was able to pull information from Facebook's databases and relay it to Cambridge Analytica. Facebook uses a Web API, which gives third-party developers convenient access to its data while restricting their ability to write to or alter a user's data in its databases[6].
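As a rough illustration of this read-only access pattern, here is a minimal sketch of what a Graph-API-style request from a third-party app might have looked like. The host, endpoint, and field names below are placeholder assumptions, not Facebook's actual v1.0-era API.

```python
import requests

# Placeholder host; the real Graph API lived at graph.facebook.com,
# but the exact endpoint and field names here are illustrative assumptions.
GRAPH_URL = "https://graph.example.com/v1.0"

def fetch_profile(user_token: str) -> dict:
    """Read a user's profile fields. The API is read-only for third
    parties: it returns data but exposes no way to alter it."""
    response = requests.get(
        f"{GRAPH_URL}/me",
        params={
            "fields": "birthday,hometown,likes",  # scopes the user approved
            "access_token": user_token,           # issued when permission is granted
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```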

Kogan’s application, This Is Your Digital Life, was a quiz-style app that requested permissions from users, as almost all Facebook and mobile apps do. Granting permission enabled Kogan to acquire not only the data of the user who approved the app but also that of all of that user's friends on the platform[6]. The problem was that those friends never granted permission of their own, even though every user is supposed to consent before their information becomes accessible. Kogan's app acquired 270,000 unique users who granted permission for the use of their information, which in turn enabled access to 87 million users by way of their Facebook friends. After collecting the data, Kogan sold it to Cambridge Analytica, directly violating Facebook's terms and conditions, since the data remained the property of Facebook[6]. The information obtained included, but was not limited to, users' birthdays, page likes, hometown locations, personal messages, and timelines. CA combined all of these traits and data points to curate a unique profile for each user and target them with specific advertisements designed to influence their opinions and voting behavior.
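The scale of the breach follows directly from this friends-of-friends access. A hypothetical sketch (toy names and numbers, invented for illustration) of how a small set of consenting users yields a far larger de-duplicated pool:

```python
# Each consenting user contributes their own profile plus all of their
# friends' profiles; de-duplication still leaves a pool that dwarfs the
# number of people who actually granted permission.

def harvest(consenting_users, friends_of):
    """consenting_users: ids of users who approved the app.
    friends_of: mapping from user id to that user's friend ids."""
    harvested = set()
    for user in consenting_users:
        harvested.add(user)                 # data the user agreed to share
        harvested.update(friends_of[user])  # data their friends never consented to
    return harvested

friends = {"alice": ["bob", "carol"], "dave": ["carol", "erin"]}
print(len(harvest(["alice", "dave"], friends)))  # 5 profiles from 2 consents
```

At platform scale, 270,000 consenting users each averaging a few hundred friends plausibly reaches a de-duplicated pool on the order of the reported 87 million.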

On April 25, 2019, roughly a year after the scandal, Facebook decided to restrict personality quizzes. A spokesperson for Facebook said that quizzes would not be banned outright but would be subjected to heightened scrutiny[7]. The spokesperson did not elaborate on the new criteria or how they would be enforced. Personality quizzes, however, were not the problem in themselves; the problem was that Facebook allowed developers to use the data[7], and for years there was no policy protecting the users who provided that information. A 90-day access limit that Facebook had proposed around the time of the scandal was also put into force that April. It protects users who have been inactive on an app for 90 days or more by cutting off developers' access to their information.[8]
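A minimal sketch of how such an inactivity cutoff could be enforced, assuming a stored last-activity timestamp per user (this is our own illustration; Facebook's internal enforcement logic is not public):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

INACTIVITY_LIMIT = timedelta(days=90)  # the cutoff described in the policy

def developer_may_access(last_active: datetime,
                         now: Optional[datetime] = None) -> bool:
    """Deny third-party data access once a user has been inactive 90+ days."""
    now = now or datetime.now(timezone.utc)
    return (now - last_active) < INACTIVITY_LIMIT

# A user last active 120 days ago is off-limits to developers.
print(developer_may_access(datetime.now(timezone.utc) - timedelta(days=120)))  # False
```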


Also on April 25, 2019, Facebook began facing more legal trouble in Canada. Daniel Therrien, Canada's privacy commissioner, hopes to take Facebook to court and force the company to change its privacy policies. Therrien and Michael McEvoy, the privacy commissioner of British Columbia, have said that the charges relate to "unauthorized access; lack of meaningful consent from 'friends of friends'; no proper oversight over privacy practices of apps" and an "overall lack of responsibility for personal information." That same month, Facebook made a deal with the EU to comply with the GDPR and set aside $5 billion to cover anticipated fines from the Federal Trade Commission. [9]

[Image: Facebook founder Mark Zuckerberg speaks on Facebook's Cambridge Analytica scandal]

Ramifications

The Federal Trade Commission (FTC), the primary agency for enforcing consumer data privacy regulations, is investigating Facebook's and Cambridge Analytica's improper collection and use of Facebook users' personal information. In 2011, Facebook entered a consent decree with the FTC, agreeing to implement better data security policies around its data-sharing activity with third-party applications and digital advertisers. After the Cambridge Analytica scandal, the FTC began assessing whether Facebook violated that 2011 agreement, which could bring increased fines and large penalties, and could also spur state and federal lawmakers to pass data privacy laws burdening data-controlling social media platforms. The FTC is additionally assessing whether Facebook's and Cambridge Analytica's use of online consumer data violates the agency's prohibition on "unfair or deceptive practices." Although penalties are still being determined, the FTC could in principle fine Facebook up to $40,000 per violation for each of the 87 million compromised consumer accounts; some portray such fines as astronomically high and unlikely. [10]
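The arithmetic makes clear why that theoretical ceiling is described as astronomical:

```python
# Theoretical maximum fine implied by the figures above.
per_violation = 40_000    # dollars per violation
accounts = 87_000_000     # compromised accounts
print(f"${per_violation * accounts:,}")  # $3,480,000,000,000 -- about $3.5 trillion
```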

Importance & Implications

Cambridge Analytica’s use of Kogan’s application and its exploitation of personal privacy through Facebook stimulated a conversation about how our data is accessed, revealed various consequences that may ensue, and brought to light the unethical practices used to manipulate public opinion and influence human behavior. The event pushed the broader discussion of information privacy into the public sphere rather than leaving it to circulate within academia. Social media platforms, consulting firms, and now politicians are being criticized, and taken more seriously, for ethical standards they may not have met and data collection methods about which they may not have been transparent enough.

The primary concern among the public involved how and why their data was being obtained, and the need for more transparency from the platforms granted access to such data. Everyone has a right to personal privacy and protection online, yet personal data was being used in a plethora of ways unbeknownst to its source. Not only was the collection far less transparent than it should have been, but the data was also being fed to bots and algorithms as a means to retarget users and influence their personal voting behavior. The episode also raises the question of what companies should do with all of the data once it has been collected.

Ethical Dilemmas & Consequences

As technology and the internet grow more pervasive in modern society, users are providing, most of the time without even knowing it, enormous amounts of private, personal detail. Online advertising and retargeting have proven to be some of the most effective means of marketing to a desired demographic subset. The CA scandal exposed the reality that political campaigns are changing with the times and now take advantage of very similar techniques and strategies to influence users' behavior and the actions they take.

Cheating

New technologies like these are extremely vulnerable to manipulation. Mia Consalvo's analysis of cheating in multiplayer video games found that people will exploit many points of vulnerability: social engineering, creating hacks, exploiting bugs in the implementation, or building bots[11]. Her work gives a sense of the lengths people will go to in order to manipulate a system. If one extends this idea to much broader and more essential technologies, like the data held by Facebook and Cambridge Analytica, the consequences become far more serious. Nearly everyone who engages online has a curated profile comprising, at minimum, their personal interests, hobbies, and basic personal data. Politicians have evidently realized that election campaigns can benefit greatly from modernizing and using technological resources to customize and optimize their efforts, just as online companies have done for years through digital marketing tactics. Cambridge's manipulation therefore gives politicians and lawmakers a much more pressing role in designing regulations for technologies that control such vital data; it is when this type of manipulation expands from video games to national elections that the practical consequences of cheating are seen.

Transparency

This scandal has awoken the public to the lack of transparency around how easily so much of their information is exposed to, and accessed by, the everyday services they interact with. Turilli and Floridi's "The Ethics of Information Transparency" offers several perspectives on information transparency as an ethically enabling or impairing factor. First, the article states that transparency depends on the availability and accessibility of information in pragmatically supporting users' decision-making. In reality, however, companies such as Cambridge Analytica and Facebook often embed their user data policies within long and convoluted bodies of text, diminishing users' ability to make informed decisions. The article also argues that information transparency is not limited to disclosing information but extends to how such information was produced. Thus, Cambridge Analytica's failure to present its internal policy accessibly and to reveal how it acquired its data impaired ethical values such as accountability, privacy, and welfare.[12] In many cases, this data is also sold to firms, organizations, and politicians in an effort to manipulate subjects' behavior and influence their personal actions. The case has opened ethical questions about how companies should disclose what data they collect and how it will be used.

Privacy

Data mining and data laundering present serious threats to personal privacy. The patterns produced by mining aggregated personal data are part of someone's personal identity, and therefore become a real threat to their private data and personal privacy. Under a framework that ties privacy to users' self-identity, such patterning creates a large problem by revealing intimate details. Breaches of informational privacy take away a key element of self-determination: the ability to expose details of one's identity as one sees fit.[13] This applies directly to the Cambridge Analytica scandal and shows that we have a meaningful degree of personal privacy only when we have individual control over such data. Data mining in this case was used with clearly malicious intent: to influence the outcome of a presidential election.

Privacy in Public

Numerous questions have arisen following the Cambridge Analytica affair about how individuals' publicly available information should be treated. In general, we think of privacy violations as infringements on people's personal lives, but the ethics of how the information we choose to post to websites such as Facebook should be made available remains unclear.[14] Many proponents of personal data distribution argue that by making one's information publicly available, one forfeits ownership of that data into the hands of the platforms and corporations one has chosen to provide it to.[15] The debate over who may take ownership of this data, and how it can be shared among different parties, continues.

Behavioral Microtargeting

Cambridge Analytica claimed to use "behavioral microtargeting": curating psychographic profiles in an attempt to influence large groups of people in ways that resonated most with their personalities and values.[16] This method of analyzing user data through machine learning and human behavior mapping has several consequences, many of them unknown. It has been hypothesized that this type of advertising can manipulate and suppress human ideas while pushing ideological polarization toward extreme views and away from more centrist ones. Some academics condemn this type of targeting and argue that it should be regulated.[17]
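A minimal sketch of the microtargeting idea, under the simplest possible assumptions: score a user on one personality trait from their page likes, then serve the ad variant predicted to resonate. All weights, page names, and ad copy below are invented for illustration; CA's actual models are not public.

```python
# Hypothetical per-like weights for a single trait (e.g., neuroticism
# from the OCEAN model); a real system would learn these from data.
LIKE_WEIGHTS = {"late-night news": 0.4, "home security": 0.6, "skydiving": -0.5}

ADS = {
    "high": "Ad framed around safety and protection",
    "low":  "Ad framed around opportunity and independence",
}

def pick_ad(page_likes):
    """Score the user from their likes, then pick the matching ad frame."""
    score = sum(LIKE_WEIGHTS.get(like, 0.0) for like in page_likes)
    return ADS["high"] if score > 0 else ADS["low"]

print(pick_ad(["late-night news", "home security"]))  # safety-framed ad
```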

Anonymity

The Cambridge Analytica scandal called into question people’s capacity to be anonymous online. In hopes of finding useful information about people's personalities for voting purposes, the firm acquired a significant amount of data on many Facebook users, storing information about their identities, activity on the site, locations, and networks[18]. This information was used to target specific users with advertisements persuading them to vote a certain way in the election. Cambridge Analytica's ability to gather a significant amount of information and coordinate it to find patterns greatly limits a user's ability to remain anonymous or keep a low profile. Taken separately, the data points collected on a user do not provide much identifying information, but looked at together they can easily reveal a person's identity[19]. Nor are identities the only thing at risk: examining a user's data over time can reveal personal facts about their life and interests. Yet it is important that users retain the right to be anonymous, or to maintain a low profile, if they so desire.
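A small sketch of why coordinated data defeats anonymity: individually weak attributes, joined together, often narrow an "anonymous" record to a single person. The records below are invented for illustration.

```python
# An "anonymous" activity record with no name attached.
activity = {"hometown": "Ann Arbor", "birthday": "1990-05-14"}

# A public dataset (e.g., a voter roll) containing names.
voter_roll = [
    {"name": "J. Smith", "hometown": "Ann Arbor", "birthday": "1990-05-14"},
    {"name": "R. Jones", "hometown": "Detroit",   "birthday": "1990-05-14"},
]

# Neither hometown nor birthday alone identifies anyone, but their
# intersection frequently does.
matches = [p for p in voter_roll
           if p["hometown"] == activity["hometown"]
           and p["birthday"] == activity["birthday"]]
print(matches)  # [{'name': 'J. Smith', ...}] -> identity recovered
```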

Information Ethics and Designer Responsibility

When Aleksandr Kogan developed the app that harvested 87 million users' data on Facebook, data that was then utilized by Cambridge Analytica, he became what Luciano Floridi would describe as a moral agent. Floridi describes a moral agent as an interactive, autonomous, and adaptable transition system that can perform morally qualifiable actions.[20] Kogan claims that he was misled by Cambridge Analytica and that he, along with Facebook, has been made a scapegoat for the fallout of the scandal, but Grodzinsky would counter that as the designer of the app he is responsible.[21] Grodzinsky goes a step further than Floridi by asserting that the designer bears responsibility for artificial agents. Artificial agents, as she describes them, exhibit learning and intentionality, criteria the personality app meets because it obtained participants' data and was built with the intention of harvesting it. Kogan himself admits that he should have questioned the exercise, but a failure to be inquisitive about the ethics is hardly a reasonable excuse for involvement in the breaching of millions of private citizens' lives.[22]

References

  1. Solon, Olivia, and Oliver Laughland. "Cambridge Analytica Closing after Facebook Data Harvesting Scandal." The Guardian, 2 May 2018, www.theguardian.com/uk-news/2018/may/02/cambridge-analytica-closing-down-after-facebook-row-reports-say.
  2. "Cambridge Analytica." Wikipedia, retrieved 13 March 2019, https://en.wikipedia.org/wiki/Cambridge_Analytica.
  3. Chang, Alvin. "The Facebook and Cambridge Analytica Scandal, Explained with a Simple Diagram." Vox, 2 May 2018, https://www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram.
  4. Lewis, Paul, and Paul Hilder. "Leaked: Cambridge Analytica's Blueprint for Trump Victory." The Guardian, 23 March 2018, https://www.theguardian.com/uk-news/2018/mar/23/leaked-cambridge-analyticas-blueprint-for-trump-victory.
  5. Chang, Alvin. "The Facebook and Cambridge Analytica Scandal, Explained with a Simple Diagram." Vox, 2 May 2018, www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram.
  6. Kang, Cecilia, and Sheera Frenkel. "Facebook Says Cambridge Analytica Harvested Data of Up to 87 Million Users." The New York Times, 4 April 2018, https://www.nytimes.com/2018/04/04/technology/mark-zuckerberg-testify-congress.html.
  7. Yurieff, Kaya. "Facebook Is Cracking Down on Personality Quizzes." CNN, 25 April 2019, www.cnn.com/2019/04/25/tech/facebook-personality-quizzes/index.html.
  8. Kastrenakes, Jacob. "Facebook Bans Personality Quizzes after Cambridge Analytica Scandal." The Verge, 25 April 2019, www.theverge.com/2019/4/25/18516608/facebook-personality-quiz-ban-cambridge-analytica.
  9. Holt, Kris. "Canada Says Facebook Broke Privacy Laws in Cambridge Analytica Scandal." Engadget, 25 April 2019, www.engadget.com/2019/04/25/canada-facebook-privacy-laws-cambridge-analytica/.
  10. Reardon, Marguerite. "Facebook's FTC Consent Decree Deal: What You Need to Know." CNET, 14 April 2018, https://www.cnet.com/news/facebooks-ftc-consent-decree-deal-what-you-need-to-know/.
  11. Consalvo, Mia. "Cheaters." Cheating: Gaining Advantage in Videogames, MIT Press, 2009, pp. 107–128.
  12. Turilli, Matteo, and Luciano Floridi. "The Ethics of Information Transparency." Ethics and Information Technology, Springer, 10 March 2009, link.springer.com/article/10.1007/s10676-009-9187-9.
  13. Conway, P. "Cambridge Analytica and Personal Data Aggregation: A Case Study." Week 5b lecture, University of Michigan, Ann Arbor, 14 February 2019.
  14. Patton, Jason W. "Protecting Privacy in Public? Surveillance Technologies and the Value of Public Places." Ethics and Information Technology 2.3 (2000): 181–187.
  15. Floridi, Luciano. The Fourth Revolution: How the Infosphere Is Reshaping Human Reality. Oxford University Press, ch. 5, "Privacy."
  16. Wilson, Dennis G. "The Ethics of Automated Behavioral Microtargeting." AI Matters 3.3 (2017): 56–64.
  17. Bay, Morten. "Social Media Ethics: A Rawlsian Approach to Hypertargeting and Psychometrics in Political and Commercial Campaigns." ACM Transactions on Social Computing 1.4 (2018): 16.
  18. Granville, Kevin. "Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens." The New York Times, 19 March 2018, www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html.
  19. Floridi, Luciano. "Privacy: Informational Friction." The 4th Revolution: How the Infosphere Is Reshaping Human Reality, Oxford University Press, 2016.
  20. Floridi, Luciano. "Information Ethics." Cambridge University Press, 2010.
  21. Grodzinsky, Frances S., et al. "The Ethics of Designing Artificial Agents." Ethics and Information Technology, Springer, 2008.
  22. Weaver, Matthew. "Facebook Scandal: I Am Being Used as Scapegoat – Academic Who Mined Data." The Guardian, 21 March 2018.