Disclosive Ethics

From SI410
Revision as of 17:13, 15 March 2019

Disclosive ethics refers to an exploratory subfield of information ethics founded by Philip Brey in 2000[1]. Disclosive ethics approaches ethical issues arising from practices in information-driven environments by analyzing technological environments that are perceived as normative. A key facet of disclosive thinking is the ability to approach these perceived normative environments with a degree of scepticism in order to expose the unperceived biases of the designers of a specific information system. Designer bias, whether knowingly or unknowingly embedded during the creation of an information system, can shape the way users interact with that system and, in turn, affect societal and cultural belief systems. This social implication is one of the fundamental reasons behind Brey's proposal to create a subfield of computer ethics exclusively focused on broadening the scope of moral analysis within computer practices to issues that have not yet been exposed. Disclosive ethics deviates from the current standard model of applied ethics, which does not account for morally opaque ethical issues in perceived normative information systems, and acts as a complementary approach that addresses the limitations of the standard model.

Foundational Principles

Morally Opaque Practices

Morally opaque practices are computational practices and values embedded within the design of a system that users of the system may or may not be aware of in an online information atmosphere. This creates an environment of perceived moral neutrality on the front end of an online platform, even though this may not be the case. The existence of a morally opaque practice can theoretically be traced back to two probable causes within the initial design of a system[1]. The principle that morally opaque practices exist within the design of systems is one of the foundational beliefs of disclosive ethics.

Implied knowledge

The extent of moral opacity in a computer-related practice depends on the implied or unknown knowledge a front-end user has about the design of a platform. A morally opaque practice can act as a hidden practice even when the user has a slight awareness of it: the user remains unaware of the logistics of how the practice actually operates.

Undetectable Embedded Practices

A practice is built into the design of a system, through algorithms or a set of rules and conditions, so that the user appears to be using the system in a normal fashion. That normality is compromised, however, when the user acts outside the bounds of virtues embedded in the design process that the designers perceive as protecting a societal value. In this fashion, the system appears value-neutral, and the user can detect no apparent or implied sign of the practice.
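
The idea of an undetectable embedded practice can be sketched in a few lines of toy code. This is a hypothetical illustration, not any real system's implementation: a search function silently drops results matching a rule chosen at design time, and nothing in the output signals that filtering occurred.

```python
# Hypothetical sketch of an "undetectable embedded practice": the system
# silently drops results matching a designer-chosen rule. All names and
# data here are illustrative, not taken from any real system.

EMBEDDED_RULE = {"restricted_topic"}  # a value judgment baked in at design time


def search(query, corpus):
    """Return documents matching the query, silently excluding restricted ones."""
    hits = [doc for doc in corpus if query in doc["text"]]
    # The user receives only the filtered list and gets no signal that
    # anything was removed -- the practice is morally opaque.
    return [doc for doc in hits if not (doc["tags"] & EMBEDDED_RULE)]


corpus = [
    {"text": "apple pie recipe", "tags": set()},
    {"text": "apple essay on a restricted topic", "tags": {"restricted_topic"}},
]
print([d["text"] for d in search("apple", corpus)])  # only the unrestricted doc
```

From the user's perspective the output looks like an ordinary, complete result list, which is precisely what makes the embedded rule undetectable.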

Ethical Implications

Search engines

A user of a search engine may have implied knowledge that search engine criteria embed the values of popular opinion and proactive search suggestions. The built-in trend-capturing algorithms implemented to accommodate these values have a tendency to "perpetuate the status quo" through data analysis driven by online traffic.[2]
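
The status-quo-perpetuating tendency can be illustrated with a deliberately simplified feedback loop, assuming a ranking driven purely by accumulated clicks (real engines use far more signals; all names here are hypothetical):

```python
# Toy illustration of traffic-driven ranking: whichever page is popular
# first keeps attracting clicks and stays on top. Not any real engine's
# algorithm; page names and click counts are invented for the sketch.

def rank(clicks):
    """Order pages by accumulated click count, most-clicked first."""
    return sorted(clicks, key=lambda page: clicks[page], reverse=True)


clicks = {"popular_view.example": 100, "alternative_view.example": 5}

for _ in range(1000):
    top = rank(clicks)[0]  # users mostly click the top result...
    clicks[top] += 1       # ...which further entrenches its rank

print(rank(clicks))  # the initially popular page never loses the top spot
```

Even a small initial popularity gap compounds over time, which is one simple way the "status quo" ends up reinforced by data analysis of online traffic.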

Algorithm Bias

These algorithms, which tend to deafen the voices of alternative viewpoints in mass media through this data analysis, are created with the intent of bringing more popular topics and websites to the top of the search results. Most of the data these systems rely on comes from humans with "biased data" learned from a specific society[3][4]. Users assume they have knowledge of the values built into these systems, but this assumption acts as a moral vice by sustaining an appearance of moral neutrality on a given search engine. The morally opaque practice of ranking by popular opinion thereby becomes the new norm, shielded by the user's seeming knowledge of how the practice works.

Algorithm Protection

Given the growing concern in society over individuals' access to certain information, such as material pertaining to self-harm, the companies that create these search engine criteria have made an effort to curb access to such results without restricting the public's ability to search for keywords. Search engines have an inherent set of conditions that prevent users from obtaining such information and instructions: flagging these keywords in the system as harmful restricts the user's ability to obtain content related to the keyword.[5]
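
The keyword-flagging approach described above can be sketched as a small query handler. This is a hedged, hypothetical illustration (the flagged terms, messages, and function names are invented, not any engine's actual rules): the query itself is still accepted, but flagged terms yield support resources instead of content.

```python
# Hypothetical sketch of keyword flagging: searches are never blocked
# outright, but queries matching flagged terms return a supportive notice
# instead of content. Terms and messages are illustrative only.

FLAGGED_KEYWORDS = {"self harm", "self-harm"}


def handle_query(query):
    """Accept any search, but intercept queries containing flagged keywords."""
    if any(term in query.lower() for term in FLAGGED_KEYWORDS):
        # Withhold content related to the flagged keyword and surface help.
        return {"results": [], "notice": "Support resources are available."}
    return {"results": ["ordinary results for: " + query], "notice": None}


print(handle_query("self harm instructions"))  # intercepted, no content
print(handle_query("gardening tips"))          # handled normally
```

Note that the restriction lives in the handler's conditions, not in the search box: the user can still type the keyword, which matches the design goal of curbing results without restricting what may be searched.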

Online Social Movements

Online social movements work to address social concerns that have gone unaddressed in the tangible world through the power of collective validity built on online communal storytelling. These social movements work to expose honesty in the face of hidden realities by highlighting morally opaque practices in society that have been overlooked. Movements such as #MeToo and campaigns against "fake news" aim to create environments where falsehoods can be outed through the power of online communities.

Trust in Intent

Online social movements capitalize on the need for an inherent social value within an information system in the infosphere, which can create environments in which the user trusts the movement. The intent embedded in the design of the systems used to develop the movement can itself become morally opaque through the establishment of that trust. Disclosive ethics aims to evaluate the components of an online social movement in order to create more transparency within information systems.




  1. Brey, Philip. “Disclosive Computer Ethics.” ACM SIGCAS Computers and Society, vol. 30, no. 4, 2000, p. 10. doi:10.1145/572260.572264.
  2. Trevisan, Filippo. “Social Engines and Social Science: A Revolution in the Making.” SSRN Electronic Journal, 2013. doi:10.2139/ssrn.2265348.
  3. Snow, Jackie. “Bias Already Exists in Search Engine Results, and It's Only Going to Get Worse.” MIT Technology Review, 1 Mar. 2018, www.technologyreview.com/s/610275/meet-the-woman-who-searches-out-search-engines-bias-against-women-and-minorities/.
  4. O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Broadway Books, 2016, pp. 22–30.
  5. Goldman, E. “Search Engine Bias and the Demise of Search Engine Utopianism.” In Web Search: Multidisciplinary Perspectives, A. Spink and M. Zimmer (eds.), Berlin: Springer-Verlag, 2008, pp. 42–43.