Difference between revisions of "Bias in Everyday Appliances"

From SI410
== Bias in Everyday Appliances ==
Bias is generally defined as a natural inclination toward or against something or someone. Whether or not one is conscious of the biases one holds, it is crucial to be wary of them because of the rash decisions and harm they can bring.<ref>Sussex Publishers. (n.d.). Bias. Psychology Today. Retrieved January 27, 2023, from https://www.psychologytoday.com/us/basics/bias</ref> Biases in everyday appliances occur when the builders and engineers of these appliances fail either to use diverse sets of training data or to think thoroughly about the social consequences of their designs.<ref>Wallis, C. (2021, June 1). Fixing Medical Devices That Are Biased against Race or Gender. Scientific American. Retrieved January 27, 2023, from https://www.scientificamerican.com/article/fixing-medical-devices-that-are-biased-against-race-or-gender/</ref><ref>Goethe, T. S. (2019, March 2). Bigotry Encoded: Racial Bias in Technology. Reporter. Retrieved January 27, 2023, from https://reporter.rit.edu/tech/bigotry-encoded-racial-bias-technology</ref> Frequently used appliance domains include facial recognition, speech recognition, soap dispensers, and medical devices. Many types of bias can arise in these domains; racial bias and gender bias are among the most prevalent. To combat these biases, engineers and researchers are seeking to incorporate greater representation in training data, ensuring that the data used is accurate and informative, and to modify existing systems to be more inclusive of underrepresented ethnic groups and genders.<ref>Turner, A., & Stainwright, S. (2022, May 3). Racist soap dispensers and other tales of Health Measurement Bias. Actuaries Digital. Retrieved January 27, 2023, from https://www.actuaries.digital/2022/04/28/racist-soap-dispensers-and-other-tales-of-health-measurement-bias/</ref><ref>Wallis, C. (2021, June 1). Fixing Medical Devices That Are Biased against Race or Gender. Scientific American. Retrieved January 27, 2023, from https://www.scientificamerican.com/article/fixing-medical-devices-that-are-biased-against-race-or-gender/</ref>

== Medical Equipment ==
Bias has occurred in the use of pulse oximeters, devices that measure the percentage of oxygen in the blood. These oximeters work by passing infrared light through the skin to determine the blood oxygen level. Most oximeters distributed to the general public were initially calibrated for people with white or light skin, a bias that produced errors more frequently for people with darker skin.
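The oximeter calibration problem described above can be illustrated with a toy simulation. The sketch below is not a real oximeter model: the linear sensor responses, the constant melanin-related offset, and every number are invented assumptions, used purely to show how a device calibrated on one subgroup produces systematic error for another.

```python
# Illustrative sketch (not a real oximeter model): a device calibrated
# on one subgroup systematically misreads another. All numbers invented.

def fit_linear(xs, ys):
    """Least-squares fit of y = a*x + b (plain Python, no libraries)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical raw sensor readings vs. true oxygen saturation (%).
# For the light-skin group the response is assumed perfectly linear; for
# the dark-skin group, higher light absorption is assumed to add an offset.
true_spo2 = [88, 90, 92, 94, 96, 98]
raw_light = [s * 1.00 for s in true_spo2]        # assumed response
raw_dark = [s * 1.00 + 3.0 for s in true_spo2]   # assumed offset

# Calibrate only on the light-skin group, as the article describes.
a, b = fit_linear(raw_light, true_spo2)

# Apply that calibration to both groups and compare the mean error.
err_light = [abs((a * r + b) - s) for r, s in zip(raw_light, true_spo2)]
err_dark = [abs((a * r + b) - s) for r, s in zip(raw_dark, true_spo2)]

print(f"mean error, light-skin group: {sum(err_light) / len(err_light):.2f}")
print(f"mean error, dark-skin group:  {sum(err_dark) / len(err_dark):.2f}")
```

In this toy setup, recalibrating with data from both groups would remove the systematic offset, mirroring the remedy discussed above: more representative calibration data.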
  
== Types of Bias ==
 
=== Racial Bias ===
Racial bias is a form of implicit bias, defined as a personal and potentially illogical judgment made solely on the basis of an individual’s race.<ref>Williams, S. A. S. (1970, January 1). Bias, Race. SpringerLink. Retrieved January 27, 2023, from https://link.springer.com/referenceworkentry/10.1007/978-0-387-79061-9_329</ref>

==== Speech Recognition ====
Speech recognition is a tool that converts human speech into on-screen text. Speech recognition programs such as Siri and Alexa have had trouble identifying the speech patterns of people from minority groups. Claudia Lopes Lloreda cited research showing bias against speakers who identify as Black: “On average, all five programs from leading technology companies like Apple and Microsoft showed significant race disparities; they were twice as likely to incorrectly transcribe audio from Black speakers as opposed to white speakers”.
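The disparity quoted above is usually reported as word error rate (WER): the word-level edit distance between what was said and what the system transcribed, divided by the number of words spoken. A minimal sketch, with invented transcripts standing in for real speaker audio:

```python
# Minimal word-error-rate (WER) sketch: Levenshtein distance between
# reference and hypothesis word sequences, divided by reference length.
# The transcripts below are invented purely to illustrate the metric.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Edit distance over words via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

# Invented example: one reference sentence, two hypothetical transcriptions.
reference = "turn the kitchen lights off"
hyp_speaker_a = "turn the kitchen lights off"   # transcribed correctly
hyp_speaker_b = "turn the kitten light of"      # three word substitutions

print(f"WER, speaker A: {wer(reference, hyp_speaker_a):.2f}")
print(f"WER, speaker B: {wer(reference, hyp_speaker_b):.2f}")
```

The "twice as likely to incorrectly transcribe" finding corresponds to one group's average WER being roughly double the other's across many such utterances.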
=== Gender Bias ===
 
Gender bias is another form of implicit bias, defined as the tendency to prefer one gender over another.<ref>Reiners, B. (2022, September 29). What Is Gender Bias in the Workplace? Built In. Retrieved January 27, 2023, from https://builtin.com/diversity-inclusion/gender-bias-in-the-workplace</ref>
=== Physical Bias ===
Physical bias is the first of three ways that, according to Achuta Kadambi, an assistant professor at the University of California, Los Angeles' Samueli School of Engineering, racial and gender bias can negatively affect medical devices. It refers to bias inherent in the physical mechanics of the device itself, such as an optical sensor whose readings vary with skin tone.
=== Computational Bias ===
The second way racial and gender bias can enter medical devices is computational bias, which refers to bias present in the software or in the data sets used to develop the devices.
=== Interpretation Bias ===
According to Kadambi, the third and final way racial and gender bias can affect medical devices is interpretation bias, which refers to bias introduced by the user of the device, specifically when clinicians apply race-based standards to the outputs of medical devices and tests.
=== Selection or Sample Bias ===
Selection (or sample) bias occurs when a dataset does not reflect the realities of the environment in which the system will be deployed.
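As an illustration, suppose a hypothetical system is more accurate for one group than another. If the evaluation sample over-represents the better-served group relative to the deployment population, the headline accuracy overstates real-world performance. All figures below are invented for demonstration:

```python
# Illustrative sketch of selection bias: a model's accuracy differs by
# group, so evaluating on a sample whose group mix does not match the
# deployment population yields a misleading overall number.

# Assumed per-group accuracy of a hypothetical system.
accuracy = {"group_a": 0.95, "group_b": 0.70}

# Group mix in the (skewed) evaluation sample vs. the real population.
sample_mix = {"group_a": 0.90, "group_b": 0.10}
population_mix = {"group_a": 0.50, "group_b": 0.50}

def overall(acc, mix):
    """Accuracy averaged over groups, weighted by the group mix."""
    return sum(acc[g] * mix[g] for g in acc)

print(f"accuracy on the skewed sample:  {overall(accuracy, sample_mix):.3f}")
print(f"accuracy in real deployment:    {overall(accuracy, population_mix):.3f}")
```

The gap between the two numbers is invisible if the system is only ever evaluated on the skewed sample, which is how such bias goes unnoticed.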
=== Measurement Bias ===
Measurement bias occurs when the data collected for training does not accurately represent the real world, or when faulty measurements result in ill-fitted data.

Revision as of 10:04, 27 January 2023
