Automated Resume Screening

Steps in the hiring process using Automated Resume Screening. [1]

Automated Resume Screening refers to the use of machine learning algorithms and artificial intelligence (AI) to parse and extract information from applicant resumes. This technology can save companies and recruiters time and other valuable resources while identifying the most suitable hires from large pools of candidates. Algorithms identify relevant skills and experience in an applicant's resume using keywords drawn from the job description and requirements, and they can be varied to match a company's hiring criteria: some companies use simple word matching between the job post and the resume, while others place weights on certain past experiences and backgrounds. The use of automated resume screening can, however, reinforce bias against underprivileged applicants with less experience and fewer relevant job backgrounds, resulting in gender bias, racial bias, and circumstantial bias.

Algorithms for Resume Screening

With developments in machine learning, companies can customize, modify, and personalize algorithms to follow the specific criteria and qualifications they are looking for in potential hires.

Different uses of AI algorithms at different stages of recruitment. [2]

Some companies use algorithms that look for specific technical and interpersonal skills in applicants' resumes. The algorithms search new resumes for certain keywords. For example, a software engineering company looking for developers with knowledge of C++ searches for "C++" as a keyword; a resume containing that word receives a higher score for fitting the role and the company. Similarly, if a company is hiring for a leadership position, the algorithms look for words like "lead", "led", and "president" and score resumes accordingly.
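A minimal sketch of how such keyword scoring might work is shown below; the keyword list, weights, and tokenization are illustrative assumptions, not any vendor's actual implementation.

```python
# Hypothetical keyword-based resume scoring; keywords and weights are invented.
import re

KEYWORDS = {"c++": 3.0, "python": 2.0, "lead": 1.5, "led": 1.5, "president": 1.5}

def score_resume(resume_text: str) -> float:
    """Return a weighted count of job-relevant keywords found in the resume."""
    tokens = re.findall(r"[a-z+#]+", resume_text.lower())
    return sum(KEYWORDS.get(token, 0.0) for token in tokens)

print(score_resume("Led a team of five C++ developers"))  # 4.5
```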

Companies looking for candidates with both skills and past experience can use algorithms that extract the applicants' work history. Such algorithms parse the dates and years on the resume to calculate total work experience. This helps companies screen out applicants who do not meet the eligibility criteria for the job posting, reducing the applicant pool to only those of interest to the company.
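A simplified sketch of how experience might be computed from date ranges follows; the "YYYY-YYYY" format and the parsing rules are simplifying assumptions for illustration.

```python
# Hypothetical extraction of total years of experience from date ranges.
import re

def years_of_experience(resume_text: str) -> int:
    """Sum the lengths of 'YYYY-YYYY' employment ranges found in the text."""
    total = 0
    for start, end in re.findall(r"(\d{4})\s*[-–]\s*(\d{4})", resume_text):
        total += max(0, int(end) - int(start))
    return total

resume = "Software Engineer, Acme Corp, 2016-2019\nData Analyst, Beta LLC, 2019-2022"
print(years_of_experience(resume))  # 6
```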

Algorithms can also be adjusted to look for applicants whose experience relates closely to the responsibilities of the role. These algorithms measure how well an applicant's resume matches the job description: they consider not only keywords present in both documents, but also the context in which those words are used. Applicants with past experience similar to the job responsibilities then receive a better score than applicants with less relevant experience.
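One simple way to score a resume against a job description beyond exact keyword overlap is vector similarity; the sketch below uses TF-IDF vectors and cosine similarity as an illustrative stand-in, while real systems may use richer language models.

```python
# Illustrative resume-to-job-description similarity using TF-IDF vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def similarity_score(resume_text: str, job_description: str) -> float:
    """Cosine similarity between TF-IDF vectors of the resume and the posting."""
    vectors = TfidfVectorizer(stop_words="english").fit_transform(
        [resume_text, job_description]
    )
    return float(cosine_similarity(vectors[0], vectors[1])[0, 0])

job = "Seeking a backend developer experienced in designing REST APIs in Python."
resume = "Built and maintained Python REST services handling millions of requests."
print(round(similarity_score(resume, job), 2))
```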

Improvement in performance of machine learning algorithms over time. [3]

The effectiveness of resume screening algorithms depends on how completely and accurately they can parse and extract information from the resume. Applicants can format their resumes to be easier for algorithms to read and thereby improve their results. PDFs are one of the most common file formats, but they are difficult for these algorithms to parse [4]. Graphics, graphs, and tables can further hinder text extraction and may cause some information to go unrecognized. The result is poor analysis and extraction of keywords and context, leading to a low score for the applicant.
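A screening pipeline typically has to convert the uploaded file to plain text before any keyword or similarity analysis can run. A hedged sketch of that step using the open-source pdfminer.six library (one of several common choices, assumed here for illustration) is shown below.

```python
# Sketch of the text-extraction step that precedes any scoring.
# Heavily formatted PDFs with tables or graphics may come back with
# text out of order or missing, which degrades downstream keyword matching.
from pdfminer.high_level import extract_text

def resume_to_text(path: str) -> str:
    """Extract plain text from a PDF resume for downstream parsing."""
    return extract_text(path)

# text = resume_to_text("applicant_resume.pdf")  # hypothetical file name
```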

Although machine learning algorithms have become more efficient and sophisticated over time, they are still prone to deception by applicants. Applicants can learn what skills and qualifications a company is looking for from previous successful hires and falsely add that information to their resumes to pass the screening. Tailoring resumes solely to beat the screening can result in less qualified applicants overtaking more qualified ones in the recruitment process.

Benefits for Companies

Benefits of Applicant Tracking Systems. [5]

Companies are increasingly adopting automated resume screening to optimize their recruitment process. 99% of Fortune 500 companies use automated resume screening, and 93% of resume screening software buyers were happy with the feature set in 2021 [6]. All applicants first go through a "pre-screening" process that filters out those who do not meet the job criteria or are not eligible to work at the company, based on personal information such as age, location, and salary expectations. Candidates then go through resume screening, during which machine learning algorithms rank and score applicants; the top-ranked applicants move on to the interview process.
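A minimal sketch of such a pre-screening filter appears below; the criteria and field names are assumptions for the example rather than any particular product's schema.

```python
# Illustrative pre-screening on basic eligibility fields (names are invented).
applicants = [
    {"name": "A", "work_authorization": True, "expected_salary": 85000},
    {"name": "B", "work_authorization": False, "expected_salary": 70000},
]

def pre_screen(applicant: dict, max_salary: int = 90000) -> bool:
    """Keep only applicants who are authorized to work and within budget."""
    return applicant["work_authorization"] and applicant["expected_salary"] <= max_salary

eligible = [a for a in applicants if pre_screen(a)]
print([a["name"] for a in eligible])  # ['A']
```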

Time Efficient

Automating resume screening with AI algorithms reduces a company's time-to-hire. On average, a recruiter spends six seconds manually scanning a resume [7]. Algorithms can process far more applicants than humans in the same time and can extract more detailed information from each resume. This reduces the number of applicants recruiters have to evaluate manually before hiring for a position, which in turn shortens the time-to-hire.

Organized

Most resume screening programs come as part of an applicant tracking system (ATS). This software stores applicants' personal and contact information, categorized by their key skills and talents. If a candidate is not a good fit for a role at the current time, the system keeps their information on file and can resurface it later if a more suitable role opens. This increases the pool of applicants and helps companies find a better fit for their roles. Organized storage also makes data collection and maintenance easier, especially at large companies that receive enormous volumes of job applications, and it simplifies keeping employee records that may be needed later for legal purposes.
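The sketch below illustrates one way an ATS might index stored candidates by skill so they can be resurfaced for future openings; the data layout is an illustrative assumption, not a description of any actual product.

```python
# Hypothetical skill index over stored candidates for future openings.
from collections import defaultdict

candidates = [
    {"id": 1, "name": "A", "skills": {"python", "sql"}},
    {"id": 2, "name": "B", "skills": {"java", "leadership"}},
]

skill_index = defaultdict(list)
for candidate in candidates:
    for skill in candidate["skills"]:
        skill_index[skill].append(candidate["id"])

# Later, when a new role opens that needs Python:
print(skill_index["python"])  # [1]
```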

Outreach

Outreach is a time-consuming part of the recruiting process. Companies aim to attract as many applicants as possible in order to find the best fit for their roles, but more applicants means more organizing, analyzing, and reviewing for the recruiter, which is inefficient when done manually. Automated resume screening and the accompanying applicant tracking systems speed up the processing of resumes and help recruiters consider both active applicants (those applying for the first time) and passive applicants (those who previously applied for a similar role). This makes it easier to source a larger applicant pool and raises the quality of that pool by including passive applicants. These systems can also attract more applicants by recommending job postings a passive applicant might be interested in, based on their stored previous applications and resumes.

Variants of resume screening algorithms can also be adapted to screen profiles on social media groups and recruiting platforms such as LinkedIn [8]. Given the millions of users worldwide on these platforms, companies can greatly expand their outreach and scope.

Flexible

The software used for resume screening also allows companies to be flexible toward applicants. Once an applicant applies for a job posting, their data is stored in the company's records. If the applicant later gains more experience or advances in their career, they can update their resume and details without any complex coordination with the company or the recruiter. The screening algorithms automatically reprocess the new resume and update the applicant's score and status. This prevents duplicates from appearing in company records and lets the company offer options such as "reapply" or "update your application", making the process more flexible for applicants.

Ethical Implications

Machine learning algorithms learning the background rather than the features of the cow. [9]

Bias refers to the tendency to believe that some people, ideas, etc., are better than others, a tendency that usually results in treating some people unfairly [10]. Bias can arise intentionally, when someone believes certain groups or ideas are better than others and makes decisions based on that mindset, giving rise to "intentional bias". It can also arise unintentionally, when someone does not realize they are being biased, or when someone intends not to be biased but the processes, methods, or policies they create end up being biased; this gives rise to unintentional bias.

Machine learning and artificial intelligence algorithms have grown powerful and efficient over the years, but they are still not perfectly accurate in making decisions. These algorithms are stochastic, not deterministic [11]. They aim to learn and draw patterns from past successes and failures: for example, they learn the attributes of successful applicants and treat them as positives when evaluating new applicants, and they learn the attributes of unsuccessful applicants and treat them as negatives. This produces a bias toward certain qualities and skills in deciding who gets hired. At first glance this may seem ideal for companies, since they then only hire employees similar to those who have benefited them in the past. But if the algorithms pick up patterns that are real yet misleading (correlation is not causation [12]), there can be ethical implications for many groups, including genders, minorities, and applicants in particular circumstances. The same problems arise if the past applicant data for a company or job posting reflects undesired skills and qualifications, because an efficient algorithm will still learn those patterns and trends even when they are not desirable.
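The toy sketch below illustrates how a model trained on past hiring outcomes will weight whatever features correlate with those outcomes, whether or not they are relevant; the features and data are invented for illustration and are not drawn from any real hiring dataset.

```python
# Toy example: a model trained on past hiring decisions learns whatever
# correlates with them, including spurious features (e.g., a particular school).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [years_experience, has_required_skill, attended_school_X]
X = np.array([[5, 1, 1], [6, 1, 1], [2, 0, 0], [1, 0, 0], [4, 1, 0], [3, 0, 1]])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = hired in the past

model = LogisticRegression().fit(X, y)
print(model.coef_)  # the school feature can receive weight even if irrelevant
```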

Gender Bias

Gender bias refers to the tendency to prefer one gender over another. It is a form of unconscious bias, or implicit bias, which occurs when one individual unconsciously attributes certain attitudes and stereotypes to another person or group of people [13]. In the context of job applications, gender bias means discriminating between applicants based on their gender rather than their skills and abilities.

Increase in share of households headed by women. [14]

Learning from past applicants and evaluating new applicants based on that knowledge can lead to gender bias. If a company or a specific job role has previously, or stereotypically, been dominated by one gender, the algorithms may learn that as a trend and favor future applicants of that gender. A real-world example is Amazon's experience with its AI recruiting tool [15]. Because Amazon was a male-dominated tech company, at a time when tech as a whole was dominated by men, the AI recruiting tool it had been developing for four years learned to prefer male applicants over female applicants. It flagged keywords like "women's club" and "president women's chess club" as negatives, when they would otherwise be treated as positives for indicating leadership. This led to a bias against female applicants.

At a time when the marriage rate is declining and more single women are becoming homeowners [14], such biases, whether intentional or unintentional, can lead to severe discrepancies in the financial conditions of many households.

Ethnic Bias

Ethnic bias can be defined as engaging in discriminatory behavior, holding negative attitudes toward, or otherwise having less favorable reactions toward people based on their ethnicity [16]. In the context of job applications, it means preferring or rejecting applicants based on their race or ethnic background rather than their skills and qualifications.

Companies may not have employees with diverse racial and ethnic backgrounds. Throughout the world, countries, cities, and neighborhoods tend to be populated by people of similar ethnicity, with smaller minority populations. Using machine learning algorithms to analyze past applicants can therefore produce a bias against racial minorities, since past employees may not have shared those ethnicities. This in turn harms minority households, which may suffer from unemployment more than other groups while also being those most in need of employment. In this way, automated resume screening can result in ethnic bias.

While the screening process can limit opportunities for racial minorities, it can also create opportunities for them, but only in specific roles, which carries its own ethical implications. Many companies and applicant tracking systems partner with social media companies like Facebook [17]. Facebook uses similarity between social media profiles to recommend job postings similar to those liked or held by comparable profiles. A team led by Muhammad Ali and Piotr Sapiezynski at Northeastern University found that job postings for preschool teachers and secretaries, for example, were shown to a higher fraction of women, while postings for janitors and taxi drivers were shown to a higher proportion of minorities [18]. This dynamic widens the salary gap between racial and ethnic groups, leading to better living standards for ethnic majorities and worse ones for ethnic minorities.

Circumstantial Bias

Circumstantial bias refers to discriminatory behavior toward a group based on their circumstances. In the context of job applications, it can be thought of as bias against applicants who lacked opportunities to gain experience, even though they may have the same potential and intellect as other successful applicants.

While filtering out applicants for lacking qualifications and skills, the automated resume screening process does not evaluate an applicant's learning potential. An applicant may not have had the same opportunities to gain experience or learn skills as other successful applicants, and will therefore be screened out early in the recruitment process. This has several ethical implications. Applicants from households in poor financial condition are affected by unemployment most severely and need a job most urgently, yet the screening process creates a feedback loop in which unemployment further harms those who are already unemployed. It also affects racial minorities who did not have the chance to earn the same skills and qualifications as other applicants due to a lack of access and resources rather than a lack of aptitude. In 2017, there were only four African American CEOs (all men) of Fortune 500 companies, accounting for less than 1% of the list [7]. The result is that racial minorities get filtered out early due to their circumstances rather than their aptitude.

Reducing Bias

Minimizing bias in machine learning algorithms. [19]

Detecting Bias

Companies can adopt multiple strategies to detect bias in their machine learning algorithms and in other steps of the recruitment process. Crowdsourcing can be useful for detecting bias in the data used and in the results produced by the algorithms. In the context of job recruitment, companies can collect applicant and employee feedback on the recruitment process and on hiring outcomes with respect to metrics such as performance, work culture, and diversity. This improves the quality of the data, which in turn leads to better patterns being drawn by the algorithms and reduces bias.

Choosing the Correct Learning Model

To reduce bias in the decision-making of resume screening, it is important to choose an appropriate learning model. When building the machine learning algorithms, companies can ensure that irrelevant attributes have no impact on how candidates are differentiated. If a company is looking for a specific skill for a job posting, gender and ethnicity should not be factors in the decision. The machine learning algorithm should learn to ignore such attributes while identifying patterns and correlations. Words like "nurse" and "teacher" should not be linked to any gender, and words like "janitor" and "driver" should not be associated with any race.
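One small part of this is simply excluding protected attributes from the feature set, sketched below with illustrative column names. Dropping the columns alone does not remove proxies such as gendered words in the resume text, so the learned associations themselves still need to be audited.

```python
# Sketch of excluding protected attributes before training or scoring;
# column names are illustrative assumptions.
import pandas as pd

PROTECTED = ["gender", "ethnicity", "age"]

def strip_protected(applicants: pd.DataFrame) -> pd.DataFrame:
    """Drop protected attributes so the model cannot use them directly."""
    return applicants.drop(columns=[c for c in PROTECTED if c in applicants.columns])

df = pd.DataFrame({"skill_match": [0.8, 0.4], "gender": ["F", "M"], "ethnicity": ["X", "Y"]})
print(strip_protected(df).columns.tolist())  # ['skill_match']
```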

Accounting for Infrastructural Issues

During decision making, only partial weight can be given to the algorithms' results. A key difference between humans and computers is the ability to exercise judgment informed by ethical and emotional considerations. After the algorithms produce their evaluations, additional human review can account for applicants' circumstances and other infrastructural and external factors that may have disadvantaged them relative to other applicants. These factors could be considered during interviews, but by that stage most applicants have already been screened out, so they should be kept in mind earlier in the decision process.

Monitoring Results

A company using resume screening can continually evaluate the results of its recruitment process against metrics such as performance, diversity, and work culture. If the statistics show bias toward certain attributes, the company can reconsider its recruitment process to reduce those biases.

Companies can also make use of subpopulation analysis: the procedure of taking a target subpopulation from the whole dataset and calculating the model's evaluation metrics for that population alone. This type of analysis can help identify whether the model favors or discriminates against a certain section of the population, including races, genders, and age groups [20], and allows companies to perform a more thorough and comprehensive analysis of their recruitment process.
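A minimal sketch of such a per-group check follows; the data and group labels are invented, and the pass rate is used as a stand-in for whatever evaluation metric a company tracks.

```python
# Illustrative subpopulation analysis: compare the screening pass rate by group.
import pandas as pd

results = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B"],
    "passed_screen": [1, 1, 0, 1, 0],
})

pass_rates = results.groupby("group")["passed_screen"].mean()
print(pass_rates)                            # per-group pass rate
print(pass_rates.max() - pass_rates.min())   # a large gap is worth investigating
```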

Data Security

The quality of the data has a significant impact on the results of machine learning algorithms. Even slight or random changes can cause the algorithms to learn, or fail to learn, important trends in the data. Companies can adopt strict policies to keep the data secure and safe from unintended access and changes. Levels of access can be set according to the role and seniority of employees to prevent any individual's personal bias from altering the data.

The data on past hires should be representative of the future applicant pool for the decisions made to be appropriate. If a new role differs from previously posted roles, companies should use only the relevant portion of past data for that posting. Applications containing fabricated resumes can be filtered out of the data as well. Companies can also perform regular data cleaning to ensure the data contains as little bias as possible.
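The sketch below shows what a routine cleaning pass over stored applications might look like, removing exact duplicates and entries already flagged as fabricated; the field names and the flag are assumptions for illustration.

```python
# Hypothetical data-cleaning pass: drop duplicate applications and flagged fakes.
import pandas as pd

applications = pd.DataFrame({
    "email": ["a@x.com", "a@x.com", "b@y.com"],
    "resume_text": ["...", "...", "..."],
    "flagged_fake": [False, False, True],
})

clean = applications.drop_duplicates(subset="email")
clean = clean[~clean["flagged_fake"]]
print(len(clean))  # 1
```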

References

  1. Online Job Application Systems. MBA Crystal Ball https://www.mbacrystalball.com/blog/2014/10/19/online-job-application-system/
  2. AI For Recruiting - A Complete Guide. CVViZ https://cvviz.com/ai-recruiting/
  3. How to Improve Machine Learning Algorithms? KDnuggets https://www.kdnuggets.com/2017/12/improve-machine-learning-algorithm-lessons-andrew-ng-part2.html
  4. Resumes & Cover Letters. Indeed. https://www.indeed.com/career-advice/resumes-cover-letters/automated-screening-resume
  5. The Advantages of an Applicant Tracking System. ZAMP HR https://blog.zamphr.com/the-advantages-of-an-applicant-tracking-system
  6. Applicant Tracking Systems (ATS) Market Update 2021. Trust Radius https://www.trustradius.com/vendor-blog/ats-applicant-tracking-systems-software-market
  7. Bart Turczynski. 2021 HR Statistics: Job Search, Hiring, Recruiting & Interviews. Zety. https://zety.com/blog/hr-statistics#resume-statistics
  8. LinkedIn Scraper. Python Package Index. https://pypi.org/project/linkedin-scraper/
  9. Avoiding shortcut solutions in artificial intelligence. MIT CSail https://www.csail.mit.edu/news/avoiding-shortcut-solutions-artificial-intelligence
  10. Essential Meaning of bias. Merriam-Webster https://www.merriam-webster.com/dictionary/bias
  11. The Limitations of Machine Learning. Towards Data Science. https://towardsdatascience.com/the-limitations-of-machine-learning-a00e0c3040c6
  12. Correlation is not causation. The Guardian. https://www.theguardian.com/science/blog/2012/jan/06/correlation-causation
  13. What Is Gender Bias in the Workplace? Built In. https://builtin.com/diversity-inclusion/gender-bias-in-the-workplace
  14. More Women Have Become Homeowners and Heads of Household. Could the Pandemic Undo That Progress? Urban Institute. https://www.urban.org/urban-wire/more-women-have-become-homeowners-and-heads-household-could-pandemic-undo-progress
  15. Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
  16. Measuring Ethnic Bias: APA Dictionary of Psychology American Psychological Association. https://dictionary.apa.org/ethnic-bias
  17. Facebook Page Admins and Jobs Distribution/ATS Partners Facebook. https://www.facebook.com/business/help/286846482350955
  18. Hao, Karen. "Facebook's Ad-Serving Algorithm Discriminates by Gender and Race." MIT Technology Review, 2 Apr. 2020, https://www.technologyreview.com/2019/04/05/1175/facebook-algorithm-discriminates-ai-bias.
  19. Tackling bias in artificial intelligence (and in humans). McKinsey and Company https://www.mckinsey.com/featured-insights/artificial-intelligence/tackling-bias-in-artificial-intelligence-and-in-humans
  20. How to reduce machine learning bias. Medium. https://medium.com/atoti/how-to-reduce-machine-learning-bias-eb24923dd18e