Bias in Everyday Appliances

Bias is generally defined as a natural inclination toward or against something or someone. Whether or not one is conscious of the biases one holds, it is important to be wary of them because of the rash decisions and harm they can bring [1]. Bias in everyday appliances arises when the builders and engineers of these appliances fail either to use diverse sets of training data or to think through the social consequences of their designs [2] [3]. Frequently used appliance domains include facial recognition, color film (photography), speech recognition, automatic soap dispensers, and medical devices. Although these tools were made with the intention of making life easier, many types of bias have arisen within these domains, with racial bias and gender bias being the most prevalent. To combat these biases, engineers and researchers are seeking ways to incorporate greater representation in training data, ensuring that the data used is accurate and informative, and to modify existing systems to be more inclusive across ethnic groups and genders [2][4].

Types of Biases Referenced Below

Racial Bias

Racial bias is a common form of implicit bias and is defined as a personal and potentially illogical judgment based solely on an individual’s race [5]. This type of bias can be found in almost all domains of everyday appliances, as elaborated in the sections below.

Gender Bias

Gender bias is another form of implicit bias and is defined as the tendency to prefer one gender over another [6]. Gender bias may arise in everyday appliances if developers test an appliance only within their own team, since the diversity of the test results is then limited by the genders of the developers.

Physical Bias

Physical bias is the first of three ways in which, according to Achuta Kadambi, an assistant professor at the University of California, Los Angeles' Samueli School of Engineering, racial and gender bias can negatively affect medical devices. It refers to bias inherent in the mechanics of the medical device itself.

Computational Bias

The second way racial and gender bias can enter the realm of medical devices is through computational bias, which refers to bias present in the software or data sets used to develop the devices.

Interpretation Bias

According to Kadambi, the third and final way racial and gender bias can affect medical devices is by interpretation bias, which refers to the bias present in the user of the device—specifically when clinicians apply race-based standards to the outputs of medical devices and tests [2].

Selection or Sample Bias

This bias occurs when a dataset does not reflect the realities of the environment the system will be used in [4].

Measurement Bias

Measurement bias refers to the bias that occurs when the data collected for training does not represent the real world accurately, or when faulty measurements result in ill-fitted data [4].

Facial Recognition

Facial recognition is widely deployed software used to match faces in images, from both photos and video stills, against an existing database of facial identities [7]. The technology is commonly used in phones (e.g., to unlock lock screens or to verify app purchases) and, since the early 2000s, in law enforcement (e.g., to identify criminals) [8]. Despite much improvement since it was first used, facial recognition software is still far from perfect. The iPhone’s facial recognition feature was unable to correctly distinguish between two colleagues in China, and Google Photos mistakenly tagged two Black friends in a selfie as gorillas [3]. Moreover, in law enforcement, facial recognition systems remain more prone to misidentifying African Americans than people of other races, errors that could lead to innocent people being incarcerated or named as potential suspects. Authors Claire Garvie and Jonathan Frankle propose that the racial makeup of the development team and the types of photos available in the databases used can heavily influence the accuracy of facial recognition systems. This is supported by the finding that algorithms developed in East Asian countries recognized East Asian faces far more readily than Caucasian faces, and, similarly, algorithms developed in Western and European countries recognized Caucasian facial characteristics with greater accuracy than those of people of East Asian descent [8].
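
To make the database-matching step concrete, the sketch below shows, in simplified form, how a probe face could be compared against stored face embeddings using cosine similarity. It assumes an external face-encoding model has already turned each face into a fixed-length vector; the function names, threshold, and toy data are hypothetical and not taken from any specific commercial system.

```python
# A simplified sketch of database matching, assuming face embeddings
# (fixed-length vectors) have already been produced by some external
# face-encoding model. Names, threshold, and data are hypothetical.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, database, threshold=0.8):
    """Return the identity whose stored embedding is most similar to the
    probe, or None if the best similarity does not clear the threshold."""
    best_id, best_score = None, -1.0
    for identity, stored in database.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# Toy usage: random vectors stand in for real embeddings.
rng = np.random.default_rng(0)
db = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = db["person_a"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(match_face(probe, db))  # expected: person_a
```

An accuracy audit of such a system would compare match rates per demographic group, not just overall, which is how the disparities described above are surfaced.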

Color Film (Photography)

Kodak is a global technology company providing products and services for commercial print, packaging, publishing, manufacturing, and entertainment [9]. Its initial color film exhibited racial bias against darker-skinned people: while White people were depicted as they looked in everyday life, Black people were often rendered as ink blots missing facial features. The cause of this biased portrayal was that the film was neither designed nor tested to handle a wide range of exposure; in other words, Kodak's initial film was optimized for lighter skin tones. Although these disparities were first noticed by parents of Black children, they were only brought to public attention when large corporations complained that Kodak's film failed to capture objects of dark hues. In the late 1970s, Kodak developed a more inclusive color film, Kodak Gold film. To show that the new film was capable of capturing pictures of darker-skinned people while downplaying the earlier film's racial bias, the company announced, somewhat poetically, that the new film could take a picture of a "dark horse in low light" [10]. Further advances came in the 1990s, when developers at Philips in Breda, Holland designed a camera system that uses two different computer chips to balance lighter and darker skin tones individually [11].

Despite these advancements, issues persist today. Author Ainissa Ramirez notes that the silicon pixels in today's digital photography are still not optimized to register dark skin as well as light skin, and that some web cameras, which follow instructions from algorithms, have greater difficulty recognizing and following a darker-skinned person [10]. Relating to the facial recognition section above, a widely known mishap occurred in 2009 when an HP MediaSmart webcam designed to track head and face movements failed to track a Black user's movements [11]. Overall, Ramirez emphasizes that such devices shed light on biases that still need to be accounted for, and that developers should be mindful of tacitly treating one group as the default standard when building these technologies [10].

Speech Recognition/Voice Assistants

Alexa (left) and Siri (right)

Speech recognition is a type of intelligent software that enables programs to convert human speech into written text and respond to voice commands; voice assistants such as Apple's Siri and Amazon's Alexa are built on it [12] [13]. These technologies and programs can have difficulty interpreting the vernacular and speech patterns of people whose first language is not English or who come from underrepresented groups. Author Claudia Lopez Lloreda highlights one study which found that programs from leading technology companies displayed significant racial disparities: audio from Black speakers was twice as likely to be incorrectly transcribed as audio from White speakers [14].
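
Disparities like the one in that study are usually quantified with word error rate (WER): the word-level edit distance between a human reference transcript and the system's output, averaged per speaker group. The sketch below illustrates the idea with made-up transcripts and hypothetical group labels; it is not the cited study's actual methodology or data.

```python
# Sketch of a per-group transcription audit using word error rate (WER).
# The transcripts and group labels below are hypothetical.
def word_error_rate(reference, hypothesis):
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# Hypothetical audit: average WER per self-reported speaker group.
samples = [
    ("group_a", "turn on the kitchen lights", "turn on the kitchen lights"),
    ("group_b", "turn on the kitchen lights", "turn on the chicken flights"),
]
by_group = {}
for group, ref, hyp in samples:
    by_group.setdefault(group, []).append(word_error_rate(ref, hyp))
for group, rates in by_group.items():
    print(group, sum(rates) / len(rates))
```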

Speech recognition software can perpetuate not only racial bias but also gender bias and stereotypes. Although voice assistants avoid explicit gender adherence (for example, they do not claim to identify as a certain gender), female-sounding assistants remain dominant, indicating that voice assistants are still gendered. These settings matter because multiple studies suggest that gendered voices can shape people's perceptions of a person or situation and promote gender-stereotypic behavior. For instance, a 2019 UNESCO report indicated that the dominance of female-sounding voice assistants encourages stereotypes of women as submissive and compliant, creating the impression, especially for children, that people gendered female must respond on demand. Similar to the study Lloreda describes above, authors Caitlin Chin and Mishaela Robison emphasize that it is important to account for these racial and gender biases because voice assistants are a common way people obtain information, and users should be provided with the most reliable resources possible regardless of their demographic characteristics. To combat gender bias specifically, Chin and Robison propose several recommendations for companies and governments: develop industry-wide standards for the humanization of AI and the portrayal of gender; encourage companies to collect and publish product and team data relating to gender and diversity; reduce barriers for students wishing to pursue a STEM education, especially barriers that disproportionately affect women, transgender, or non-binary individuals; and address gender disparities by adopting policies that benefit underrepresented populations throughout society [13].

Automatic Soap Dispensers

Automatic soap dispenser failing to work for a darker-skinned user

Automatic soap dispensers use near-infrared technology to detect hand motions: invisible light reflected back from the skin triggers the sensor to pump out soap. If the user's skin is of a darker tone, however, not enough light is reflected back to activate the sensor, leaving darker-skinned users soapless and, sometimes, waterless. Examples of automatic soap dispensers displaying implicit racial bias include an automatic soap dispenser at a Marriott hotel that was unable to detect a Black customer's hand, and an instance in which Kadambi had to ask a lighter-skinned individual to activate the soap dispenser (and faucet) for him. Results like these can arise when companies fail to test their products on a wide range of skin tones [2][3].
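
The failure mode described above can be illustrated with a toy model of a fixed-threshold reflectance sensor: soap is dispensed only if enough of the emitted infrared light bounces back, so lower-reflectance skin may never clear the threshold. The reflectance values and threshold below are purely illustrative, not measurements from any real dispenser.

```python
# Toy model of the fixed-threshold reflectance behaviour described above.
# All numbers are illustrative assumptions, not real device parameters.
def sensor_triggers(skin_reflectance, emitted_intensity=1.0, trigger_threshold=0.35):
    reflected = emitted_intensity * skin_reflectance  # light bounced back to sensor
    return reflected >= trigger_threshold

# Higher reflectance (lighter skin) clears the fixed threshold; lower does not.
for reflectance in (0.7, 0.5, 0.3):
    print(reflectance, sensor_triggers(reflectance))
```

Testing the sensor across the full range of skin reflectance values, and calibrating the threshold accordingly, is precisely the step the sources say was skipped.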

Medical Devices

Pulse Oximeters

Pulse oximeters use technology similar to that of automatic soap dispensers: light is transmitted through the skin and tissue to measure the oxygen level in a person's blood. The racial bias inherent in these devices is evident in a study that found low oxygen levels were three times more likely to be missed in Black patients than in White patients [2][4]. This shows how non-White patients may be prevented from receiving fair and necessary treatment.
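
Audits of the kind behind that finding compare device readings (SpO2) with arterial blood-gas measurements (SaO2) and count, per patient group, the cases of "occult hypoxemia" in which the device reads as acceptable while the blood gas shows low oxygen. The sketch below illustrates the bookkeeping with made-up readings; the cutoffs are assumptions modeled on commonly used values, not figures from the cited study.

```python
# Sketch of an occult-hypoxemia audit by patient group.
# Readings and thresholds are hypothetical illustrations.
def occult_hypoxemia_rate(records, spo2_ok=92, sao2_low=88):
    """Fraction of (SpO2, SaO2) pairs where the device looked fine
    but the blood gas showed low oxygen."""
    missed = sum(1 for spo2, sao2 in records if spo2 >= spo2_ok and sao2 < sao2_low)
    return missed / len(records)

readings = {
    "group_a": [(96, 95), (93, 87), (94, 90)],
    "group_b": [(95, 94), (97, 96), (93, 91)],
}
for group, records in readings.items():
    print(group, round(occult_hypoxemia_rate(records), 2))
```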

Wearables/Smart Watches

Wearable measuring an individual's heart rate.

Wearables, or smart watches, are non-invasive products that measure blood oxygen levels and can also provide sleep, exercise, and other health measurements. Well-known examples include Fitbit and the Apple Watch. Data from these devices are often factored into medical decisions and are even used as measurements in some clinical trials. Because it is cheaper to use and more resistant to motion error, many smart watches rely solely on green light to monitor users' measurements. Green light, however, has a shorter wavelength, which makes it harder to take accurate readings of people with darker skin tones. The skin tone scale used in these devices is also outdated: most devices rely on the Fitzpatrick Scale, which was developed in the 1970s. Not only is this scale insufficiently objective, these devices are often calibrated on whatever sample population the developers have access to, which tends to consist of a high proportion of White and Asian males. The resulting racial biases lead to the collection of erroneous data, disadvantaging individuals who seek proper access to healthcare. For instance, fitness trackers were found to detect a heart murmur and hypoxia more successfully for White people than for Black people [4].
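
One way to see why an unrepresentative calibration sample matters is to compare a device's overall error with its per-group error: when most of the sample comes from one skin-tone group, the aggregate number can look acceptable while hiding a much larger error elsewhere. The group labels and error values below are hypothetical.

```python
# Illustrative comparison of overall vs. per-group measurement error
# when the evaluation sample is skewed. All values are made up.
samples = (
    [("fitzpatrick_1_3", 2.0)] * 90    # well-represented group, small error (bpm)
    + [("fitzpatrick_5_6", 8.0)] * 10  # underrepresented group, larger error
)
overall = sum(err for _, err in samples) / len(samples)
by_group = {}
for group, err in samples:
    by_group.setdefault(group, []).append(err)
print("overall mean error:", overall)        # looks acceptable in aggregate
for group, errs in by_group.items():
    print(group, sum(errs) / len(errs))       # reveals the disparity
```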

Remote Plethysmography

Remote plethysmography is a recently developed technology that measures heart rate by analyzing live or recorded video; however, it works less well for people of color when programmed to pick up blush-like changes in the skin. To make this technology more inclusive, a group of researchers at the Massachusetts Institute of Technology devised a remote plethysmograph that instead reads the tiny changes in head motion that occur with each heartbeat. Kadambi's research team is also aiming to combat the physical, computational, and interpretation biases referenced earlier by analyzing video images with thermal wavelengths rather than visible light [2].
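
A minimal sketch of the motion-based idea is shown below: treat the head-motion trace extracted from video as a time series and take the dominant frequency within a plausible cardiac band as the heart rate. The signal here is synthetic and the band limits are assumptions; the actual MIT and UCLA systems are considerably more sophisticated.

```python
# Minimal sketch: estimate heart rate as the dominant frequency of a
# head-motion signal within an assumed cardiac band. Synthetic data.
import numpy as np

def heart_rate_bpm(signal, fps, low_hz=0.75, high_hz=4.0):
    signal = signal - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

fps = 30
t = np.arange(0, 20, 1.0 / fps)                      # 20 s of video at 30 fps
motion = 0.02 * np.sin(2 * np.pi * 1.2 * t)          # ~72 bpm cardiac motion
motion += 0.005 * np.random.default_rng(0).normal(size=t.size)  # camera noise
print(round(heart_rate_bpm(motion, fps)))            # ~72
```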

Spirometers

Spirometers are medical devices used to diagnose asthma, chronic obstructive pulmonary disease (COPD), and other breathing or lung conditions [15]. Although potentially inaccurate, there is an assumption built into their interpretation that ethnic minorities have a lower lung capacity than White people. Because of this lower assumed baseline, when a non-White patient and a White patient display the same lung capacity, the White patient is prioritized to receive treatment [4].
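
The effect of such a race-based adjustment can be illustrated with a small calculation: if a measured value is judged as a percentage of a "predicted" value, and that predicted value is scaled down for some groups, the same measurement looks closer to normal for those groups and is less likely to trigger treatment. The 0.85 factor below is a hypothetical stand-in for such correction factors, not a value from the cited source.

```python
# Illustrative effect of a race-based "correction" on spirometry interpretation.
# The correction factor is a hypothetical stand-in.
def percent_predicted(measured_litres, predicted_litres, correction_factor=1.0):
    return 100.0 * measured_litres / (predicted_litres * correction_factor)

measured, predicted = 3.0, 4.0
print(round(percent_predicted(measured, predicted), 1))        # 75.0: flagged as reduced
print(round(percent_predicted(measured, predicted, 0.85), 1))  # 88.2: looks closer to normal
```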

Medical-Grade Respirator Masks

Medical-grade respirator masks are designed to achieve a close facial fit and protect the wearer from particles or liquids that could contaminate the face [16]. These masks work well when they fit properly, but because the one-size-fits-all design is calibrated mainly to White males, women and certain ethnic minorities have had difficulty receiving the protection they need [4].

Cardiovascular Treatment

A gender bias has also been noted in the cardiovascular treatment and management of women. A study published in the Medical Journal of Australia showed that, six months after hospital discharge, women treated for ST-Elevation Myocardial Infarction (STEMI) were more likely than men to have died or suffered serious complications. Although the study's authors were not certain of the reason for these findings, they proposed that it may be due to poor awareness that women with STEMI are at higher risk, or to subjective assessments of who is considered "high risk" when reliable, objective risk prediction tools should be applied instead [4].

Chest X-rays

An artificial intelligence system built to identify and analyze chest X-rays for 14 different lung and chest diseases worked less well for women because it was trained mainly on male chest scans, allowing gender bias to infiltrate the system. When the system was later trained on a more gender-balanced sample, the model produced the best overall results. As shown by the modifications made to this system, Kadambi suggests that adding more variety to the training data can help improve the performance of such medical devices [2].
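
The retraining step described above amounts to balancing the training set by group before fitting the model. The sketch below shows one simple way to do that by down-sampling the over-represented group; the record structure and labels are hypothetical, and real pipelines might instead up-sample, re-weight, or stratify.

```python
# Sketch: balance a training set so each group contributes equally.
# Record fields and group labels are hypothetical.
import random

def balance_by_group(records, key="sex", seed=0):
    rng = random.Random(seed)
    groups = {}
    for record in records:
        groups.setdefault(record[key], []).append(record)
    target = min(len(members) for members in groups.values())  # smallest group size
    balanced = []
    for members in groups.values():
        balanced.extend(rng.sample(members, target))            # down-sample others
    rng.shuffle(balanced)
    return balanced

records = [{"sex": "male", "image": f"m{i}.png"} for i in range(90)] + \
          [{"sex": "female", "image": f"f{i}.png"} for i in range(30)]
print(len(balance_by_group(records)))  # 60: 30 records from each group
```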

References

  1. Sussex Publishers. (n.d.). Bias. Psychology Today. Retrieved January 27, 2023, from https://www.psychologytoday.com/us/basics/bias
  2. Wallis, C. (2021, June 1). Fixing Medical Devices That Are Biased against Race or Gender. Scientific American. Retrieved January 27, 2023, from https://www.scientificamerican.com/article/fixing-medical-devices-that-are-biased-against-race-or-gender/
  3. Goethe, T. S. (2019, March 2). Bigotry Encoded: Racial Bias in Technology. Reporter. Retrieved January 27, 2023, from https://reporter.rit.edu/tech/bigotry-encoded-racial-bias-technology
  4. Turner, A., & Stainwright, S. (2022, May 3). Racist Soap Dispensers and Other Tales of Health Measurement Bias. Actuaries Digital. Retrieved January 27, 2023, from https://www.actuaries.digital/2022/04/28/racist-soap-dispensers-and-other-tales-of-health-measurement-bias/
  5. Williams, S. A. S. (1970, January 1). Bias, Race. SpringerLink. Retrieved January 27, 2023, from https://link.springer.com/referenceworkentry/10.1007/978-0-387-79061-9_329
  6. Reiners, B. (2022, September 29). What Is Gender Bias in the Workplace? Built In. Retrieved January 27, 2023, from https://builtin.com/diversity-inclusion/gender-bias-in-the-workplace
  7. Mohanakrishnan, R. (2021, September 2). Top 11 Facial Recognition Software in 2021. Spiceworks. Retrieved January 27, 2023, from https://www.spiceworks.com/it-security/identity-access-management/articles/facial-recognition-software/
  8. Frankle, J., & Garvie, C. (2016, April 7). Facial-Recognition Software Might Have a Racial Bias Problem. The Atlantic. Retrieved January 27, 2023, from https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/
  9. Eastman Kodak Company. (2023, January 25). Kodak. Retrieved February 10, 2023, from https://www.kodak.com/en/
  10. Ramirez, A. (2020, July 24). How Camera Film Captured the Bias in American Culture. Time. Retrieved February 10, 2023, from https://time.com/5871502/film-race-history/
  11. Caswell, E. (2015, September 18). Color Film Was Built for White People. Here's What It Did to Dark Skin. Vox. Retrieved February 10, 2023, from https://www.vox.com/2015/9/18/9348821/photography-race-bias
  12. IBM. (n.d.). What Is Speech Recognition? Retrieved January 27, 2023, from https://www.ibm.com/topics/speech-recognition
  13. Chin, C., & Robison, M. (2022, March 9). How AI Bots and Voice Assistants Reinforce Gender Bias. Brookings. Retrieved February 10, 2023, from https://www.brookings.edu/research/how-ai-bots-and-voice-assistants-reinforce-gender-bias/
  14. Lloreda, C. L. (2020, July 5). Speech Recognition Tech Is Yet Another Example of Bias. Scientific American. Retrieved January 27, 2023, from https://www.scientificamerican.com/article/speech-recognition-tech-is-yet-another-example-of-bias/
  15. Mayo Foundation for Medical Education and Research. (2017, August 17). Spirometry. Mayo Clinic. Retrieved February 10, 2023, from https://www.mayoclinic.org/tests-procedures/spirometry/about/pac-20385201#:~:text=Spirometer-,Spirometer,a%20machine%20called%20a%20spirometer.
  16. Center for Devices and Radiological Health. (n.d.). N95 Respirators, Surgical Masks, Face Masks, & Barrier Face Coverings. U.S. Food and Drug Administration. Retrieved February 10, 2023, from https://www.fda.gov/medical-devices/personal-protective-equipment-infection-control/n95-respirators-surgical-masks-face-masks-and-barrier-face-coverings