Gender bias in the Online Job Search

Figure: There are many stages to the hiring process, and algorithms are involved in almost all of them. [1]

Gender bias refers to the “unfair difference” in the way that men and women are treated [2]. In the context of the online job search, it refers to the advantage male job seekers have over female job seekers. Advances in technology and big data have led companies to recruit and hire potential employees using big data and artificial intelligence [3]. The use of algorithms in the job search process can perpetuate existing biases, raising ethical concerns about women being blocked from opportunities and about systemic gender roles disadvantaging female job seekers.

Evidence of Bias

Job Recruitment Algorithms

Job recruitment algorithms have been found to reinforce and perpetuate unconscious human gender bias [4]. Because these algorithms are trained on real-world data – and the real world is biased – they amplify that bias at a larger scale. Decision-making algorithms are “designed to mimic how a human would…choose a potential employee,” and without careful consideration they can intensify bias in recruiting [5].

Training algorithms on real-world data can lead them to produce biased outcomes. Algorithms make predictions by analyzing past data [3]; if that data encodes biased judgments, the predictions will be biased as well. Facebook’s “lookalike audience” tool allows advertisers – in this case, employers – to supply a “source audience” that dictates whom Facebook advertises jobs to, based on each user’s similarity to that source audience [3]. The tool is meant to help employers reach the users most likely to apply for their jobs. However, if an employer provides the lookalike tool with a source audience that includes few women, Facebook will show the job to few women. Employers might use the tool to deliberately exclude certain groups, but they may also simply be unaware that their source audience is biased [3].
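
To make the mechanism concrete, the following is a minimal sketch of how a lookalike-style audience could be selected, assuming users are represented as numeric feature vectors and ranked by similarity to the average of the source audience. The function name and data are hypothetical illustrations, not Facebook’s actual system.

    import numpy as np

    def lookalike_audience(source_audience, all_users, top_k):
        """Rank users by cosine similarity to the centroid of the source audience."""
        centroid = source_audience.mean(axis=0)
        # Cosine similarity of every user to the source-audience centroid.
        sims = all_users @ centroid / (
            np.linalg.norm(all_users, axis=1) * np.linalg.norm(centroid) + 1e-9
        )
        return np.argsort(-sims)[:top_k]  # indices of the most similar users

    # Hypothetical data: rows are users, columns are behavioral features.
    # A skewed source audience yields a skewed ad audience: dissimilar
    # users are never shown the job at all.
    rng = np.random.default_rng(0)
    source = rng.normal(loc=[1.0, 0.0], size=(50, 2))
    users = rng.normal(loc=[0.0, 0.0], size=(1000, 2))
    print(lookalike_audience(source, users, top_k=10))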

Job Recommendation Algorithms

Job recommendation algorithms within online platforms are built to find and reproduce patterns in user behavior, updating their predictions or decisions as job seekers and employers interact [6]. If the platform recognizes that an employer interacts mostly with men, the algorithm will look for those characteristics in potential job applicants and replicate the pattern [6]. The algorithm can pick up this pattern without any specific instruction from the employer.
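
A minimal sketch of that feedback loop, assuming candidates are feature vectors and the recommender nudges a learned preference vector toward whichever candidates the employer clicks on; all names and data here are hypothetical:

    import numpy as np

    def recommend(weights, candidates, top_k=5):
        scores = candidates @ weights            # higher score = stronger match
        return np.argsort(-scores)[:top_k]

    def update_preferences(weights, clicked, lr=0.1):
        # Nudge the preference vector toward the candidate the employer engaged
        # with. If early clicks skew toward one group, the weights drift that
        # way and future recommendations replicate the pattern.
        return weights + lr * (clicked - weights)

    rng = np.random.default_rng(1)
    candidates = rng.normal(size=(200, 4))       # hypothetical candidate features
    weights = np.zeros(4)                        # no preference to start
    for _ in range(20):                          # simulate 20 employer clicks
        top = recommend(weights, candidates, top_k=1)[0]
        weights = update_preferences(weights, candidates[top])
    print(recommend(weights, candidates))        # now skewed toward early clicks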

Algorithms Extending Human Bias

Personal, human bias bleeds into algorithmic bias. One study found that people’s own underlying biases were a bigger determinant of their likelihood to apply to a job than any gendered language in the posting [7]. This underlying human bias must be addressed in order to work towards gender neutrality in the job market [7]. Unconscious human bias can also make algorithms biased, as human biases are built into the algorithms themselves.

Ethical Implications

Algorithms Blocking Opportunities

Figure: Amazon's hiring algorithm was modeled on real-world data, in which men outnumber women in the tech field [8].

Algorithms in the online job search do not have to outright reject job seekers to cause harm; they can simply never show certain groups of job seekers opportunities they are qualified for. As legal scholar Pauline Kim stated, “not informing people of a job opportunity is a highly effective barrier” to job seekers [6]. Qualified candidates cannot apply for a job they have never been shown.

Amazon’s algorithmic recruiting tool was trained on 10 years’ worth of resumes that had been sent to Amazon; because technology is a male-dominated field, most of those resumes came from male applicants, leading the algorithm to downgrade women [8]. This training taught the algorithm that male candidates were preferable, and it penalized any resume that included the word “women’s” – for example, an activity listed as “women’s team captain” [8].
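
The failure mode is easy to reproduce in miniature. Below is a toy reconstruction (not Amazon’s actual system, and assuming scikit-learn is available): a bag-of-words classifier fit on historically skewed hire/no-hire labels learns a negative weight for the token “women”, so any resume containing it is scored down.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical historical data: past hires skew male, so "women's"
    # appears only on resumes that were not hired.
    resumes = [
        "software engineer chess club captain",    # hired
        "developer robotics team lead",            # hired
        "software engineer women's team captain",  # not hired
        "developer women's coding society lead",   # not hired
    ]
    hired = [1, 1, 0, 0]

    vec = CountVectorizer()
    X = vec.fit_transform(resumes)
    model = LogisticRegression().fit(X, hired)

    # The learned coefficient for "women" comes out negative: the model has
    # encoded the historical bias as if it were a real signal of quality.
    idx = vec.vocabulary_["women"]
    print(model.coef_[0][idx])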

Traditional Gender Roles Affect Outcomes

Research has shown that a woman can have a better keyword match on her resume yet not be selected because a man has more years of experience [5]. Hiring algorithms built and trained by humans do not take into account the time women take off work to have and care for children [5]. The author of a study done by the University of Melbourne recounts that “women have less experience because they take time [off work] for caregiving, and that algorithm is going to bump men up and women down based on experience” [4]. Because women are more likely to experience a disruption in their career due to children, the algorithm will view them as lesser candidates, even when they have more relevant experience than a male candidate [5]. By failing to account for these gender roles, hiring algorithms replicate and reinforce gender bias in the hiring process [5].
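
A minimal sketch of how that happens, with hypothetical weights: when uninterrupted years of experience are weighted heavily, a candidate who took time off for caregiving ranks below one who did not, even with a stronger keyword match.

    def score(candidate, w_keywords=1.0, w_experience=2.0):
        # Hypothetical linear scoring: experience dominates keyword relevance.
        return (w_keywords * candidate["keyword_match"]
                + w_experience * candidate["years_experience"])

    applicant_a = {"keyword_match": 0.9, "years_experience": 6}   # caregiving gap
    applicant_b = {"keyword_match": 0.7, "years_experience": 10}  # no career gap

    # applicant_b outranks applicant_a (20.7 vs. 12.9) despite the weaker match.
    print(score(applicant_a), score(applicant_b))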

Looking Forward

Rethinking How Algorithms are Built

Vendors building recruitment algorithms that target specific job seekers need to think beyond minimum compliance requirements and consider whether the algorithms they build actually lead to fairer hiring outcomes [6]. Likewise, those who claim their algorithms reduce bias in the hiring process have to build and test them with that goal in mind, or else the technology will continue to undermine the online job search [6]. Rethinking how these algorithms are designed and evaluated is a first step toward reducing bias in the job search, as many factors need to be considered.
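
One concrete test a vendor could run beyond minimum compliance is comparing selection rates between groups, in the spirit of the “four-fifths” rule of thumb used in U.S. employment-discrimination guidance. The numbers below are illustrative, not from any real system.

    def selection_rate(selected, applied):
        return selected / applied

    def four_fifths_check(rate_a, rate_b, threshold=0.8):
        # Flags potential adverse impact when the lower selection rate is
        # less than 80% of the higher one.
        low, high = sorted([rate_a, rate_b])
        return low / high >= threshold

    women = selection_rate(selected=30, applied=200)  # 15% selected
    men = selection_rate(selected=50, applied=200)    # 25% selected
    print(four_fifths_check(women, men))  # False -> worth investigating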

Balancing Humans and Algorithms

Implementing a balance between predictive algorithms and human insight is a promising solution for employers who want to use algorithms in their hiring process while reducing bias [9]. Artificial intelligence and algorithms work well for parsing large volumes of data and applicants. Balancing that algorithmic processing with the "human ability to recognize more intangible realities of what that data might mean" is the second step in limiting algorithmic bias [9]. For a partnership between humans and algorithms to succeed, companies need to consciously implement practices that encourage it. Both the algorithms and the humans still need to be held accountable for reducing bias, but working together offers a good short-term remedy for gender bias in algorithms [9].
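
A minimal sketch of that division of labor, with hypothetical function names and data: the algorithm handles the high-volume screen, and every shortlisted candidate still passes through a human reviewer before any decision.

    def algorithmic_screen(applicants, score_fn, shortlist_size=10):
        # Stage 1: the model does the high-volume processing.
        return sorted(applicants, key=score_fn, reverse=True)[:shortlist_size]

    def human_review(shortlist, reviewer_fn):
        # Stage 2: a person weighs the intangible context behind the scores
        # and remains accountable for the final call.
        return [a for a in shortlist if reviewer_fn(a)]

    applicants = [{"name": f"cand_{i}", "score": i % 7} for i in range(100)]
    shortlist = algorithmic_screen(applicants, score_fn=lambda a: a["score"])
    hires = human_review(shortlist, reviewer_fn=lambda a: a["score"] >= 5)
    print([a["name"] for a in hires])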

References

  1. Bogen, M. & Rieke, A. (2018). Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias. Upturn. https://www.upturn.org/static/reports/2018/hiring-algorithms/files/Upturn%20--%20Help%20Wanted%20-%20An%20Exploration%20of%20Hiring%20Algorithms,%20Equity%20and%20Bias.pdf
  2. Cambridge Dictionary. (n.d.). Gender bias. In Cambridge English Dictionary. https://dictionary.cambridge.org/us/dictionary/english/gender-bias
  3. Kim, P. T. (2019). Big data and artificial intelligence: New challenges for workplace equality. University of Louisville Law Review, 57(2), 313-328.
  4. Hanrahan, C. (2020, December 2). Job recruitment algorithms can amplify unconscious bias favouring men, new research finds. ABC News. https://www.abc.net.au/news/2020-12-02/job-recruitment-algorithms-can-have-bias-against-women/12938870
  5. Cheong, M., et al. (n.d.). Ethical Implications of AI Bias as a Result of Workforce Gender Imbalance. The University of Melbourne. https://about.unimelb.edu.au/__data/assets/pdf_file/0024/186252/NEW-RESEARCH-REPORT-Ethical-Implications-of-AI-Bias-as-a-Result-of-Workforce-Gender-Imbalance-UniMelb,-UniBank.pdf
  6. Bogen, M. (2019, May 6). All the Ways Hiring Algorithms Can Introduce Bias. Harvard Business Review. https://hbr.org/2019/05/all-the-ways-hiring-algorithms-can-introduce-bias
  7. Tang, S., et al. (2017). Gender Bias in the Job Market: A Longitudinal Analysis. Proceedings of the ACM on Human-Computer Interaction. https://dl.acm.org/doi/epdf/10.1145/3134734
  8. Dastin, J. (2018, October 10). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
  9. Raub, M. (2018). Bots, Bias and Big Data: Artificial Intelligence, Algorithmic Bias and Disparate Impact Liability in Hiring Practices. Arkansas Law Review, 71(2), 529-570.