
Racial Algorithmic Bias

Racial algorithmic bias refers to errors in algorithms that skew results and create unfair advantages or disadvantages for particular racial groups. Machines learn through algorithms, which are sets of computer-implementable instructions used to solve a problem. Racial algorithmic bias can stem from errors in the design of a system or from sampling errors in the data used for machine learning. Machines are subject to the minds of their creators, human beings, and they learn from real-world data. Because human minds are subjective, and racial bias remains a problem society has faced for decades, many algorithms that dictate important decisions for millions of people embed racial biases.
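The sampling pathway can be made concrete with a small simulation. The sketch below is a minimal illustration using entirely synthetic data and a toy model; the reoffense rate, the per-group detection probabilities, and the group labels are all invented for the example. When one group’s offenses are recorded more often than another’s, a model that learns from the recorded labels concludes that the group is riskier even though the true underlying rates are identical.

<syntaxhighlight lang="python">
import random

random.seed(42)

TRUE_RATE = 0.20                  # both groups truly reoffend at the same rate
DETECTION = {"A": 0.5, "B": 1.0}  # hypothetical sampling error: group B's
                                  # offenses are recorded twice as often

def draw_labeled_example():
    """Return (group, recorded_label) for one synthetic person."""
    group = random.choice(["A", "B"])
    truly_reoffended = random.random() < TRUE_RATE
    # The training label is 1 only if the offense is actually recorded,
    # and recording happens at different rates for the two groups.
    label = truly_reoffended and random.random() < DETECTION[group]
    return group, int(label)

training = [draw_labeled_example() for _ in range(100_000)]

# Toy "model": the predicted risk for a group is its observed rate.
for g in ("A", "B"):
    labels = [y for grp, y in training if grp == g]
    print(f"learned risk for group {g}: {sum(labels) / len(labels):.3f}")

# Prints roughly 0.10 for group A and 0.20 for group B: the model
# concludes group B is twice as risky even though the true rates are
# identical, purely because of how the data was sampled.
</syntaxhighlight>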

As machines become increasingly omnipresent, discussions of algorithmic bias and computer ethics have become more and more important for preventing partial outcomes in high-stakes industries such as healthcare and criminal justice, which drive much of the economy and drastically affect lives. As these industries turn to computer-generated decisions in search of a seemingly more objective selection, racial algorithmic bias can impose unfair disadvantages on certain racial groups. There are many instances of the dire consequences of racial bias in algorithms.

=== Cases of Racial Algorithmic Bias in Industries ===

For example, a physician and researcher at the UC Berkeley School of Public Health published a paper revealing that a major medical center’s algorithm for identifying which patients needed extra medical care was racially biased. “The algorithm screened patients for enrollment in an intensive care management program, which gave them access to a dedicated hotline for a nurse practitioner, help refilling prescriptions, and so forth. The screening was meant to identify those patients who would most benefit from the program.”

Another widely cited case comes from criminal sentencing. Vernon Prater, a white man, was charged with multiple armed and attempted armed robberies, as well as petty theft for shoplifting around $85 worth of supplies from Home Depot. Compare him to Brisha Borden, a Black woman charged with burglary and petty theft for picking up an unattended bike and scooter (combined value of $80), which she quickly dropped once the owner came running. When Prater and Borden were booked into jail, a computer algorithm predicted which of the two was more likely to commit another crime. It rated Borden as much more likely to reoffend, and it was miserably wrong: after both were released, Borden committed no new crimes, while Prater broke into a warehouse and stole thousands of dollars worth of supplies.<ref>[https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Machine Bias, ProPublica]</ref> The explanation? The algorithm’s racial bias.
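ProPublica quantified this kind of error by comparing false positive rates across racial groups: among defendants who never reoffended, Black defendants were far more likely than white defendants to have been labeled high risk. The sketch below shows that comparison on a handful of invented records; none of this is real COMPAS data, and the numbers exist only to make the calculation concrete.

<syntaxhighlight lang="python">
# Each record is (group, flagged_high_risk, reoffended). All invented.
records = [
    ("white", False, True),   # low score, later reoffended (a Prater-like case)
    ("white", False, False),
    ("white", True,  True),
    ("white", False, True),
    ("black", True,  False),  # high score, never reoffended (a Borden-like case)
    ("black", True,  False),
    ("black", False, False),
    ("black", True,  True),
]

def false_positive_rate(group):
    """Among people in `group` who did not reoffend, the share flagged high risk."""
    flags = [flagged for g, flagged, reoffended in records
             if g == group and not reoffended]
    return sum(flags) / len(flags)

for group in ("white", "black"):
    print(f"{group}: false positive rate = {false_positive_rate(group):.2f}")

# With these invented records, non-reoffending white defendants are never
# flagged (0.00) while non-reoffending black defendants usually are (0.67),
# the same asymmetry ProPublica reported in the real data.
</syntaxhighlight>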