Disclosive Ethics

Disclosive ethics refers to an exploratory subfield of information ethics founded by Philip Brey in 2000[1]. The methodology of disclosive ethics approaches ethical issues in information-driven environments by analyzing technologies that are perceived as normative or value-neutral. One key facet of disclosive thinking is the ability to approach these perceived neutral environments with a level of scepticism in order to expose the unperceived biases of the designers of a specific information system. Designer biases, whether knowingly or unknowingly embedded in the creation of an information system, can shape the way a user interacts with that system and thereby affect societal and cultural belief systems. This social implication is one of the fundamental reasons behind Brey's proposal to create a subfield of computer ethics focused exclusively on broadening the scope of moral analysis within computer practices to issues that have not yet been exposed. Disclosive ethics deviates from the current standard model of applied ethics, which does not account for morally opaque ethical issues in information systems perceived as neutral, and acts as a complementary approach that addresses the standard model's limitations.

Foundational Principles

Morally Opaque Practices

Morally opaque practices are computational practices and values embedded in the design of a system that users of the system may or may not be aware of in an online information atmosphere. They create an appearance of moral neutrality on the front end of an online platform even when this is not the case. The existence of a morally opaque practice can theoretically be traced back to two probable causes within the initial design of a system[1]: implied knowledge and undetectable embedded practices. The existence of morally opaque practices within the design of a system is one of the foundational beliefs of disclosive ethics.

Implied Knowledge

The extent of moral opacity in a computer-related practice depends on the implied or unknown knowledge that a front-end user has about the design of a platform. The practice can remain hidden even when the user is vaguely aware that some such practice exists; the user is left unaware of how the practice actually operates.

Undetectable Embedded Practices

A practice is built into the design of a system, through algorithms or a set of rules and conditions, such that the user appears to be using the system in a normal fashion while the output is quietly shaped by values the designers embedded during the design process, often in the belief that they are protecting a societal virtue. In this case the system appears value-neutral, and no apparent or implied sign of the practice can be detected by the user.
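For concreteness, the following is a minimal hypothetical sketch of such an undetectable embedded practice, written in Python. Every name in it (the FLAGGED_TERMS set, the search function, the sample documents) is invented for illustration and does not describe any real system: the interface looks like a neutral keyword match, but a hidden rule silently reorders the output.

    # Hypothetical sketch of an undetectable embedded practice: the interface
    # appears value-neutral, but a hidden rule quietly reorders the results.
    # All names and data here are invented for illustration.

    FLAGGED_TERMS = {"example-disfavored-topic"}  # values chosen by the designers

    def search(documents, query):
        """Return documents matching the query, apparently by keyword match alone."""
        matches = [d for d in documents if query.lower() in d.lower()]
        # Embedded condition: silently push flagged content to the bottom.
        # Nothing in the output reveals that this rule ever ran.
        return sorted(matches, key=lambda d: any(t in d.lower() for t in FLAGGED_TERMS))

    docs = ["article on example-disfavored-topic and policy", "article on policy reform"]
    print(search(docs, "policy"))  # the flagged article is listed last, with no disclosure

From the user's side, the function behaves like an ordinary search; only inspection of the design itself would disclose the embedded value.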

Ethical Implications

Search Engines

A user of a search engine may have implied knowledge that search engine criteria embed the values of popular opinion and proactive search suggestion. The built-in trend-capturing algorithms implemented to accommodate these values tend to "perpetuate the status quo" through data analysis driven by online traffic.[2]
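A minimal sketch of what such a trend-capturing criterion could look like appears below. The ranking rule, site names, and click counts are all assumptions made for illustration, not a description of any actual search engine.

    # Hypothetical illustration of a traffic-driven ranking rule: pages are
    # ordered by accumulated clicks, so popular opinion is the embedded value.
    # Data and names are invented for the example.

    click_counts = {"popular-site.example": 9000, "niche-site.example": 40}

    def rank_by_traffic(pages):
        # Higher past traffic -> higher position: the status quo is reinforced.
        return sorted(pages, key=lambda p: click_counts.get(p, 0), reverse=True)

    print(rank_by_traffic(["niche-site.example", "popular-site.example"]))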

Algorithm Bias

These algorithms, which tend to drown out alternative viewpoints in mass media through this data analysis, are created with the intent of bringing more popular topics and websites to the top of the search results. Most of the data these systems rely on comes from humans with "biased data" learned in a specific society[3]. Users assume they know what values are built into these systems, and this assumption sustains an appearance of moral neutrality on a given search engine. The morally opaque practice thus becomes the new norm, passing as seemingly complete knowledge of the online practice of popular opinion.
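The self-reinforcing character of this norm can be illustrated with a toy simulation, sketched below under invented numbers: when the top-ranked result attracts most new clicks, and clicks feed back into rank, an initially small popularity gap widens regardless of the merit of the alternative viewpoint.

    # Toy simulation of the feedback loop described above. The labels and
    # numbers are invented; the point is the rich-get-richer dynamic.

    clicks = {"mainstream": 100, "alternative": 90}

    for _ in range(10):
        ranking = sorted(clicks, key=clicks.get, reverse=True)
        clicks[ranking[0]] += 10   # most users click the first result
        clicks[ranking[1]] += 1    # few scroll further down

    print(clicks)  # the initial gap of 10 clicks has widened to 100

After ten rounds the gap has grown tenfold, even though nothing about the content of either result changed; the ranking criterion alone entrenched the more popular viewpoint.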



  1. Brey, Philip. “Disclosive Computer Ethics.” ACM SIGCAS Computers and Society, vol. 30, no. 4, 2000, p. 10., doi:10.1145/572260.572264.
  2. Trevisan, Filippo. “Social Engines and Social Science: A Revolution in the Making.” SSRN Electronic Journal, 2013, doi:10.2139/ssrn.2265348.
  3. Snow, Jackie. “Bias Already Exists in Search Engine Results, and It's Only Going to Get Worse.” MIT Technology Review, MIT Technology Review, 1 Mar. 2018, www.technologyreview.com/s/610275/meet-the-woman-who-searches-out-search-engines-bias-against-women-and-minorities/.