
Surveillance Technology

Surveillance Technology is the use of computer equipment to monitor people's behaviors and actions. It is often implemented by companies and organizations with the goal of increasing safety, efficiency, or cost-effectiveness. These technologies are becoming more universal and are changing rapidly in the 21st century. This presents issues because policy regarding surveillance technology is unable to keep up with the technology's growth, due to Moor's law.[1] Subjects of surveillance technologies are often unaware and unconsenting, and the technologies themselves are subject to misuses such as targeting, voyeurism, and violation of privacy.

Types of Surveillance Technologies

There are many types of surveillance technologies in existence, and they are often used in conjunction with one another to conduct comprehensive monitoring of an area, either in the real world or in the infosphere. These technologies can be used to monitor people in public places, employees in the workplace, criminal suspects, or persons of interest. Each technology presents the user with its own set of ethical issues.

Hidden Subway Camera[3]

Video Surveillance

Video surveillance is one of the most commonly used and well-known surveillance technologies. Its uses in society range from monitoring high-risk areas (e.g. government buildings) to monitoring personal property and other environments where oversight of activity is wanted. Video surveillance keeps a watchful eye over an area to keep its inhabitants and possessions safe from potential threats. An ethical concern arises when the general public does not know where this footage goes or who it technically belongs to. Video cameras allow many areas to be monitored at once by the same person and eliminate physical barriers that often obscure the view of human security guards. Problems arise when public spaces become "blanketed" with video cameras.[4] Many public spaces, such as subways in NYC, have hidden cameras,[5] which can be problematic because individuals are not aware they are being watched and their privacy is violated without their knowledge. Video surveillance technology is easily misused as a tool for voyeurism, especially because it has become so accessible in recent years.[6] When public spaces are covered with video surveillance technology, individuals lose their ability to go about their lives anonymously and are subject to biased perceptions of their identity when recorded out of context.

International Usage

Closed-circuit television, or CCTV, has an international presence, with the majority of surveillance taking place in Asia and the United Kingdom. In China, there were reportedly over 170 million surveillance cameras, all of which are capable of facial recognition.[7] The United Kingdom, one of the most surveilled nations in the world, has at least one surveillance camera for every 11 people.[8] With this much footage, however, UK authorities struggle to filter through everything they have obtained when it is needed.[9] CCTV is not limited to China and the United Kingdom; it is also present across Europe, Asia, South Africa, Latin America, and North America.

Body Cameras

Police body cameras are a form of video surveillance and an example of sousveillance. They are used to oversee police actions and have been implemented in recent years in order to combat police bias and violence. Similar to security cameras, they raise the issue of people being unaware that their actions are being recorded. A study published in October 2017 by David Yokum et al.,[10] conducted on 2,600 officers of the Washington D.C. Metropolitan Police Department, found no statistically significant effect of body cameras on the amount of force officers use in law enforcement. The study found that body cameras may inhibit officers, when calm and aware of the camera attached to their vests, from using profanity or making crude comments. In situations involving potential violence or physical combat, however, such inhibitions are rendered useless, and an officer's fight-or-flight instincts produce behavior similar to that of an officer without a body camera. Commentators are nevertheless unanimous that the public knowledge body cameras create, and the opportunity they give police to examine the implications of their colleagues' actions across the nation, are crucial factors in creating a well-equipped police force.

Big Data

Corporations collect big data containing individuals' online activity patterns and public data dumps. The data they collect gives them the power to construct profiles about individuals (see data brokers). These profiles, assembled from big data, are packaged together and sold to other businesses interested in consumers' habits, which causes many privacy issues. When individuals' online actions are being monitored, the ability to be anonymous in the infosphere is limited. Data mining is a violation of privacy, even when the information is public, because there is no context for a subject's actions.[11] This violation can lead to biased targeting of vulnerable populations by large corporations and others who obtain such data.
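
The profile-building that data brokers perform can be pictured as merging records about the same person from many different sources. The Python sketch below is purely illustrative: the sources, field names, and the email address used as a matching key are all hypothetical, and real brokers operate at far larger scale with more sophisticated record linkage.

  from collections import defaultdict

  # Hypothetical records bought or scraped from different sources,
  # each keyed by an email address used here as a simple matching key.
  purchase_history = [{"email": "a@example.com", "bought": "running shoes"}]
  location_pings   = [{"email": "a@example.com", "city": "Ann Arbor"}]
  public_records   = [{"email": "a@example.com", "homeowner": True}]

  def build_profiles(*sources):
      """Merge records that share the same key into one profile per person."""
      profiles = defaultdict(dict)
      for source in sources:
          for record in source:
              key = record["email"]
              profiles[key].update({k: v for k, v in record.items() if k != "email"})
      return dict(profiles)

  profiles = build_profiles(purchase_history, location_pings, public_records)
  print(profiles["a@example.com"])
  # {'bought': 'running shoes', 'city': 'Ann Arbor', 'homeowner': True}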

Data Mining in Post-9/11 America

With heightened public concern for national security following the 9/11 attacks, the Bush administration responded by launching multiple programs aimed at preventing future terrorism. In effect, these programs utilized data mining practices to increase surveillance.

MATRIX, or the Multistate Anti-Terrorism Information Exchange Program, was a federally funded data mining system run by Seisint Inc. and Florida law enforcement intended to link criminal history with public records in order to identify potential terrorist threats. However, MATRIX was shut down in 2005 in response to public backlash regarding privacy concerns and the possibility of false positives, or incorrectly labeling innocent people as terrorists.[12]

TIA, or Total Information Awareness, was a large-scale database created in 2003 by the Defense Advanced Research Projects Agency (DARPA), a branch of the U.S. Department of Defense. During the War on Terror, the program was intended to consolidate government intelligence to best predict, identify, and thus prevent terrorist threats. Criticism and public unease surrounding this widespread surveillance of American citizens caused TIA to be defunded in late 2003; however, efforts similar to TIA continued beyond 2003 through a more cryptic program named "Basketball."[13][14]

CAPPS II, or Computer Assisted Passenger Prescreening System II, was a U.S. Department of Homeland Security-run program that replaced the 1990s-era CAPPS I after the 9/11 attacks with the intent of intensifying airport screening. The TSA utilized CAPPS II's prescreening risk assessment and identity verification system to detect airport travelers who posed a potential terrorist threat. After being cancelled in 2004, CAPPS II was replaced by the "Secure Flight" airline passenger screening program.[15]

ADVISE, or Analysis, Dissemination, Visualization, Insight, and Semantic Enhancement, began in 2003 as a broad, all-encompassing Homeland Security-run data mining system directed towards associating individuals with organizations through the analysis of patterns within data. Due to a lack of required privacy safeguards, the short-lived program ended in 2004.[16]

Overall, these reactionary Bush-era data mining surveillance systems, used to promote national security, raised public concern about the limits of American privacy and about the misidentification of innocent citizens as terrorists.
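
The false-positive concern raised about MATRIX and the other programs above can be made concrete with a rough base-rate calculation. All of the numbers below are illustrative assumptions, not figures reported for any actual program.

  # Illustrative base-rate calculation (every number here is hypothetical).
  population   = 300_000_000   # people screened
  terrorists   = 3_000         # actual threats in that population (assumed)
  sensitivity  = 0.99          # chance a real threat is flagged
  false_pos    = 0.001         # chance an innocent person is flagged

  flagged_real     = terrorists * sensitivity
  flagged_innocent = (population - terrorists) * false_pos
  precision        = flagged_real / (flagged_real + flagged_innocent)

  print(f"Innocent people flagged: {flagged_innocent:,.0f}")
  print(f"Share of flagged people who are real threats: {precision:.1%}")
  # Even with a 0.1% false-positive rate, roughly 300,000 innocent people are
  # flagged and only about 1% of flagged people are actual threats.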

Biometrics

Fingerprint ID [17]

Biometrics are ways to identify someone based on physical characteristics such as fingerprints, DNA, eyes, voice, and face. Many technologies have been developed to recognize these characteristics. The FBI has a "Next Generation Identification" (NGI) system,[18] a bank of biometric information about individuals that allows the agency to identify people from their physical characteristics. For example, if someone is caught on a surveillance camera, facial recognition technology can be used alongside the NGI database to identify the subject. Social media companies[19] and other technology companies[20] have also been developing and implementing these tools to increase the legitimacy of authentication. The consequences of these new technologies cannot be predicted, but they increase the ability to identify and track individuals based on their physical identity, eliminating anonymity. Because aspects of one's physical identity (e.g. a fingerprint) do not change the way other identifying characteristics (e.g. a phone number) do, this locks people into their identity permanently.[21] This is a violation of people's autonomy and therefore of their privacy.
Face Recognition Technology is a common example of a biometric surveillance technology. While generally used today for conveniences such as unlocking phones or logging into applications, the technology can also be used for broad surveillance purposes. For example, state motor vehicle departments hold high-resolution photographs of their residents that can be combined with public surveillance footage to track and identify people.[22] Facial recognition technologies are currently known to be less effective and reliable than Iris Recognition.
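
At a high level, identification systems like those described above compare a numeric "template" extracted from a new image against templates stored in a database and report the closest match. The sketch below illustrates the idea with cosine similarity over made-up embedding vectors; it is not the NGI system's or any vendor's actual algorithm.

  import math

  # Hypothetical enrolled templates: name -> face embedding vector.
  database = {
      "person_a": [0.12, 0.80, 0.35, 0.44],
      "person_b": [0.90, 0.10, 0.05, 0.30],
  }

  def cosine_similarity(u, v):
      dot = sum(a * b for a, b in zip(u, v))
      norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
      return dot / norm

  def identify(probe, threshold=0.95):
      """Return the best-matching enrolled identity, or None if no match is close enough."""
      best_name, best_score = None, -1.0
      for name, template in database.items():
          score = cosine_similarity(probe, template)
          if score > best_score:
              best_name, best_score = name, score
      return best_name if best_score >= threshold else None

  probe_embedding = [0.11, 0.79, 0.36, 0.45]   # extracted from a camera frame (assumed)
  print(identify(probe_embedding))             # -> "person_a"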

Domestic Drones

Domestic drones are used in combination with other technologies to surveil people without their knowledge, eliminating their ability to be anonymous. Drones can be equipped with microphones, GPS tracking, infrared detection, and facial recognition to detect and track people in many different circumstances with great discretion, usually going unnoticed by the subject. This is a violation of contextual privacy because even if someone is acting a certain way in a public space, there is no context of their life included in the surveillance.

SunPass contactless toll collection[23]

Radio Frequency Identity (RFID) Chips

RFID chips are placed in or on physical objects to allow for contactless authentication. Examples include electronic toll collection passes, passports, and contactless entry keys, all of which allow users to be identified and tracked without their knowledge. Because subjects are not in control of the collection of this information, tracking them this way violates their privacy and takes away their ability to move about the world anonymously.
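
As an illustration of how individual contactless reads can add up to a movement history, the toy sketch below reconstructs a route from hypothetical toll-gantry reads of a tag ID. The tag IDs, locations, and log format are invented for the example.

  from datetime import datetime

  # Hypothetical reads collected by roadside RFID readers: (tag_id, gantry, timestamp).
  reads = [
      ("TAG-1042", "I-94 East MM 180", "2021-02-11 08:02"),
      ("TAG-1042", "I-94 East MM 195", "2021-02-11 08:15"),
      ("TAG-7310", "US-23 North MM 40", "2021-02-11 08:20"),
      ("TAG-1042", "M-14 West MM 10",  "2021-02-11 17:45"),
  ]

  def route_for(tag_id, reads):
      """Return the time-ordered list of locations where one tag was seen."""
      hits = [(datetime.strptime(ts, "%Y-%m-%d %H:%M"), place)
              for tag, place, ts in reads if tag == tag_id]
      return [place for _, place in sorted(hits)]

  print(route_for("TAG-1042", reads))
  # ['I-94 East MM 180', 'I-94 East MM 195', 'M-14 West MM 10']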

Stingray Tracking Devices

Stingray tracking devices are cell phone surveillance devices that act like cell towers, sending out signals to cell phones in order to obtain their location and identifying information. They are often used by law enforcement to geolocate people when solving crimes. When identifying information is collected about a subject without their knowledge, their privacy and opportunity for anonymity are violated.

Advantages

Surveillance technologies aid in solving crimes and can even deter criminals. They are more effective than traditional surveillance methods because they are not disrupted by distance or physical barriers. They are less labor intensive and often less expensive than traditional surveillance methods. Data generated by surveillance technology can be shared easily, and a reduced informational friction[24] allows for this information to be used effectively by multiple people in a time of need.

Safety and Security

Among many others, the security camera, military, and defense industries have been positively impacted by the rapid growth of Artificial Intelligence and Computer Vision technologies. These technologies have progressed faster than many had predicted, and with their growing integration into security camera systems, it is believed that the expansion of Artificial Intelligence into national security and defense is unpreventable.[25]
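
As a concrete, simplified example of computer vision inside a security-camera pipeline, the sketch below uses OpenCV's stock HOG person detector to flag frames containing people. The camera source and the follow-up action are placeholders; deployed systems typically rely on far more capable neural-network detectors.

  import cv2

  # OpenCV's built-in HOG + linear SVM pedestrian detector.
  hog = cv2.HOGDescriptor()
  hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

  capture = cv2.VideoCapture(0)            # placeholder: webcam instead of a real CCTV feed
  for _ in range(100):                     # examine a bounded number of frames for the demo
      ok, frame = capture.read()
      if not ok:
          break
      boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
      if len(boxes) > 0:
          # A real system might store the clip, raise an alert, or run face matching here.
          print(f"Detected {len(boxes)} person(s) in this frame")
  capture.release()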

Workplace Surveillance

When an individual accepts employment under another entity, they are often agreeing to a set of rules and standards of conduct that are enforced through workplace surveillance. That is to say, an employee is subject to monitoring by their employer while in the workplace. There are a variety of purposes for surveilling employees: performance tracking, protecting the firm from legal liability, preventing the spread of trade secrets, and general security. Workplace surveillance can be done through methods such as software monitoring, telephone tapping, location monitoring, and email monitoring. Location monitoring is also a way of incorporating big data; firms with a high ceiling for logistical excellence, such as Amazon, gain a strategic advantage by using such activity monitoring to streamline operations.[26] Laws governing employee monitoring differ drastically from state to state and internationally, and there are many ethical concerns beyond legality, which are addressed in a broader scope below.[27]
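
A very simple form of the software monitoring mentioned above is summarizing activity logs per employee, such as total idle time per day. The log format and the policy threshold below are hypothetical.

  from collections import defaultdict

  # Hypothetical activity log entries: (employee_id, date, idle_minutes).
  activity_log = [
      ("emp-001", "2021-02-11", 12),
      ("emp-001", "2021-02-11", 25),
      ("emp-002", "2021-02-11", 5),
  ]

  IDLE_THRESHOLD_MINUTES = 30   # assumed policy threshold

  def flag_idle_employees(log, threshold=IDLE_THRESHOLD_MINUTES):
      """Return employees whose total idle time on any day exceeds the threshold."""
      totals = defaultdict(int)
      for employee, date, idle in log:
          totals[(employee, date)] += idle
      return sorted({emp for (emp, _), total in totals.items() if total > threshold})

  print(flag_idle_employees(activity_log))   # -> ['emp-001']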

Ethics

Privacy

The most prominent ethical issue arising from surveillance technology is the violation of privacy. Humans control how they present themselves based on the context they are in and what they know about their surroundings and social circles. Constant surveillance of individuals without their knowledge violates the subjects' privacy because it takes away their control over their public portrayal.[28] The limitation theory of privacy states that privacy is the area that others cannot get to,[29] but if society creates environments in which individuals are surveilled without their awareness, the ability to shape a public persona disappears and privacy begins to disappear.

Domestic Abuse & Surveillance Technologies

"Smart abuse" has become a more prevalent issue with constant advancement of surveillance and "smart home" technology. Surveillance technologies are being utilized to monitor and track a partner's whereabouts in domestic abuse relationships. These technologies were created to help keep intruders out and provide an added sense of protection, but they are being misused to torment partners in abusive relationships. Abusers will use these technologies to frighten and keep their partners from trying to leave the relationship in fear of getting caught.[30] Abusers also utilize other technologies like remote control locks and other "smart" house technologies to harass and manipulate their partners when they aren't even home. New apps and other resources for educating women on how to maintain online privacy and best handle these situations are becoming more prominent because of the alarming increase in this problem.

Voyeurism

Surveillance technology has created opportunities for voyeurism, or surveillance for the pleasure of the person watching. There have been incidents of security cameras being used as a tool for voyeurism, a misuse of technology designed to protect people. Many parties who use surveillance technology face legal issues, often because there is no up-to-date legislation to guide their use. When subjects being monitored are completely unaware they are being tracked and watched, they have not consented to it. Parties using these technologies should be open about their uses, both to protect the rights of the people they are surveilling[31] and to prevent the "perfect voyeur".[32]

Power Imbalance

Social theorists and philosophers argue that with the ubiquity of surveillance and security technologies today, an environment of virtually constant surveillance has been created. As 'smart devices' become more prevalent, more information is constantly being made public, with and without the user's awareness. Michel Foucault,[33] philosopher and historian, argues that the current structure of power resembles that of late 18th century Europe. While the Ancien Régime ruled with the constant threat of violence and the spectacle of public torture, the systems in place today rule through the overbearing threat of punishment or judgement, via a panoptic surveillance system that is always at work.[34] Such a system does not require constant surveillance to govern the behavior of its people; the mere existence of the system maintains the power dynamics between the observer and the observed. Luciano Floridi's re-ontologization of the user's role in an information environment, from prime beneficiary to information endpoint, conveys similar concerns with regard to the ethics of such surveillance.

Bias

Surveillance technologies are never neutral because they are designed by humans who have their own biases. These biases, or "embedded values,"[35] affect the way technology surveils. For example, Amazon's new facial recognition technology, Rekognition, has been reported to falsely match photos of people of color to mugshot databases at a higher rate than it falsely matches photos of white people.[36] The software has learned this racial bias from its creators and the digital world it interacts with. While it would seem that technical analysis would be less biased than human analysis, the implementation of such technology into society would amplify existing racial prejudice instead of eliminating it.
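
One way to check for the kind of disparity reported for Rekognition is to compare false match rates across demographic groups on a labeled test set. The sketch below is illustrative; the records and group labels are invented.

  from collections import defaultdict

  # Hypothetical test results: (group, predicted_match, is_true_match).
  results = [
      ("group_a", True,  False),
      ("group_a", False, False),
      ("group_a", True,  True),
      ("group_b", True,  False),
      ("group_b", True,  False),
      ("group_b", False, False),
  ]

  def false_match_rate_by_group(results):
      """False match rate = wrong matches / all non-matching pairs, per group."""
      false_matches = defaultdict(int)
      non_matches = defaultdict(int)
      for group, predicted, actual in results:
          if not actual:
              non_matches[group] += 1
              if predicted:
                  false_matches[group] += 1
      return {g: false_matches[g] / non_matches[g] for g in non_matches}

  print(false_match_rate_by_group(results))
  # {'group_a': 0.5, 'group_b': 0.6666666666666666}  -> a disparity worth auditing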

Anonymity

Surveillance technology decreases and can even eliminate people's anonymity online and in the physical world. As people's online interactions are tracked, turned into data, and sold, they are no longer anonymous online. As public spaces are increasingly blanketed with video monitors and contactless scanning devices, people can be unknowingly tracked as they move through their daily lives. These aspects of surveillance technology and its integration into society don't allow anyone to slip by unnoticed, thereby eradicating anonymity in the 21st century.

References

  1. The Information Society "The Information Society and Its Philosophy: Introduction to the Special Issue on The Philosophy of Information, Its Nature, and Future Developments" Luciano Floridi 2009
  2. Chin, Monica. "Nvidia is creating surveillance cameras with built-in face recognition. Uh, great?" 15 Feb 2018. Mashable.
  3. Buiso, Gary. "The hidden cameras in NYC's subways." 18 Jan 2015. New York Post.
  4. "Surveillance Technologies". ACLU.
  5. New York Post "The Hidden Cameras in NYC's Subways" Gary Buiso 2015
  6. NBC News "Tiny Cameras, Big Crimes: Peeping Toms Go High-Tech" 2014
  7. “Chinese Man Caught by Facial Recognition at Pop Concert.” BBC News, BBC, 13 Apr. 2018, www.bbc.com/news/world-asia-china-43751276.
  8. Barrett, David. “One Surveillance Camera for Every 11 People in Britain, Says CCTV Survey.” The Telegraph, Telegraph Media Group, 10 July 2013, www.telegraph.co.uk/technology/10172298/One-surveillance-camera-for-every-11-people-in-Britain-says-CCTV-survey.html.
  9. Temperton, James. “One Nation under CCTV: the Future of Automated Surveillance.” WIRED, WIRED UK, 4 Oct. 2017, www.wired.co.uk/article/one-nation-under-cctv.
  10. Yokum et al (2017), The Lab DC, Evaluating the Effects of Police Body-Worn Cameras:A Randomized Controlled Trial https://bwc.thelab.dc.gov/TheLabDC_MPD_BWC_Working_Paper_10.20.17.pdf
  11. Nissenbaum, Helen. Privacy in Context. Stanford University Press, 2009.
  12. "Ramasastry, Anita. “Why We Should Fear the Matrix.” CNN, Cable News Network, 6 Nov. 2003, www.cnn.com/2003/LAW/11/06/findlaw.analysis.ramasastry.matrix/."
  13. "“Q&A On the Pentagon's ‘Total Information Awareness’ Program.” American Civil Liberties Union, ACLU, www.aclu.org/other/qa-pentagons-total-information-awareness-program."
  14. "“TIA Lives On.” Shane Harris, National Journal, 23 Feb. 2006, shaneharris.com/magazinestories/tia-lives-on/."
  15. "Pike, John. “CAPPSII.” Global Security, www.globalsecurity.org/security/systems/cappsii.htm."
  16. "“DHS Halts Anti-Terror Data-Mining Program.” NBCNews.com, NBCUniversal News Group, 5 Sept. 2007, www.nbcnews.com/id/20604775/ns/us_news-security/t/dhs-halts-anti-terror-data-mining-program/#.XMXWEJNKjOQ."
  17. "How Biometrics on Smartphones is Changing our Lives." M2SYS Blog, Second Image. www.m2sys.com/blog/biometric-resources/biometrics-on-smartphones/
  18. FBI "Next Generation Identification (NGI)"
  19. USA Today "Facebook wants to save your face. Should you say yes to facial recognition?" Jessica Guynn 2018
  20. Apple Inc. "About Face ID advanced technology" 2018
  21. opusresearch "Biometrics – the Good, the Bad and the Reality" Ravin Sanjith 2018
  22. "FACE RECOGNITION TECHNOLOGY" American Civil Liberties Union
  23. News Service of Florida. "SunPass continues to catch up on toll transactions". 10 July 2018. Flapol.
  24. Oxford University Press The 4th Revolution Luciano Floridi 2014
  25. Allen, Greg. "Artificial Intelligence and National Security" July 2017. Harvard Kennedy School: Belfer Center.
  26. Yeginsu, C. (2018, February 01). If Workers Slack Off, the Wristband Will Know. (And Amazon Has a Patent for It.). Retrieved from https://www.nytimes.com/2018/02/01/technology/amazon-wristband-tracking-privacy.html
  27. What is Employee Monitoring? (n.d.). Retrieved April 28, 2019, from https://hubstaff.com/employee_monitoring
  28. Information Technology and Moral Philosophy "Plural Selves and Relational Identity" Dean Cocking 2008
  29. Ethics and Information Technology "Self-exposure and exposure of the self: informational privacy and the presentation of identity" David Shoemaker 2010
  30. Reilly, C. 2018. Cameras, surveillance and the sinister tech behind domestic abuse. CNET. https://www.cnet.com/news/cameras-surveillance-and-the-sinister-tech-behind-domestic-abuse/
  31. "Surveillance Technologies". ACLU
  32. Doyle, Tony. "Privacy and perfect voyeurism". 27 May 2009. Springer.
  33. Gutting, Gary and Oksala, Johanna (2019), "Michel Foucault", The Stanford Encyclopedia of Philosophy, https://plato.stanford.edu/entries/foucault/
  34. Bidet, Jacques. "The Marx/Foucault Difference". 28 September 2016.
  35. Cambridge University Press "Values in technology and disclosive computer ethics" Philip Brey 2010
  36. Snow, Jacob. "Amazon's Face Recognition Falsely Matched 28 Members of Congress with Mugshots". 26 July 2018. ACLU.