Disabled and Rejected from a Job? Maybe AI Discrimination Is to Blame!

Artificial intelligence in job hiring platforms has been discriminating against applicants with disabilities, preventing them from securing in-person interviews and job offers.

By: Alyssa Lung

The explosion of artificial intelligence (AI) over the past decade has accelerated decision making, reduced human error, and quantified many aspects of daily life. From predicting our risk of heart disease to generating Netflix recommendations, almost every industry has felt the presence of AI. Though this surge of new technology has provided benefits to businesses and customers, its impact on individuals with disabilities has yet to be fully considered.

Many job hiring platforms in particular have been shown to discriminate against people with disabilities, further reducing equal access to opportunities. HireVue, a hiring application that presents applicants with a series of questions and a time limit to answer them through a video recording, has recently faced the most criticism over discriminatory practices. The program analyzes facial movements, speech patterns, and tone of voice to generate a ‘score’ for the recording, which determines whether the candidate should receive a follow-up interview. However, when a disabled person takes part in an interview, HireVue has been shown to give them a worse score, especially applicants who have speech impediments, are deaf or blind, or have had a stroke. The platform was designed using a sample of able-bodied testers and consequently struggles to interpret disability, generating a reduced score for the applicant. The built-in AI does not account for anyone who falls outside its defined ‘norm’ (an able-bodied person), and the lowered score prevents individuals from attaining a second interview or moving forward in the hiring process. This deepens inequity for people with disabilities, as video interview programs like HireVue block access to the same opportunities on the basis of a computer-generated score. Discriminatory practices like this reflect the bias of those who created the program: while it is easy to fall into the misconception that algorithms and AI are unbiased, their creators embed their own biases into such programs by disregarding disabled people.
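To make that mechanism concrete, the short Python sketch below shows how a scorer fitted only to able-bodied samples ends up penalizing anyone outside its narrow ‘norm.’ It is a deliberately simplified, hypothetical example: the features, numbers, and scoring rule are invented for illustration and are not HireVue’s actual system.

# Hypothetical sketch (not HireVue's actual system): a scorer fitted only on
# able-bodied interview samples treats any candidate whose speech or facial
# features fall outside that narrow training distribution as "low scoring."

import statistics

# Features extracted from training videos of able-bodied testers only
# (e.g., words per minute, facial-movement rate). Values are invented.
training_samples = [
    {"speech_rate": 150, "facial_movement": 0.80},
    {"speech_rate": 145, "facial_movement": 0.75},
    {"speech_rate": 155, "facial_movement": 0.85},
]

def fit_norm(samples):
    """Learn the 'norm' as the mean and spread of each feature."""
    norm = {}
    for key in samples[0]:
        values = [s[key] for s in samples]
        norm[key] = (statistics.mean(values), statistics.stdev(values))
    return norm

def score(candidate, norm):
    """Score a candidate by how closely they match the learned norm.
    Any deviation, including deviation caused by a disability, lowers the score."""
    penalty = 0.0
    for key, (mean, spread) in norm.items():
        penalty += abs(candidate[key] - mean) / spread
    return max(0.0, 100.0 - 10.0 * penalty)

norm = fit_norm(training_samples)

# A candidate with a speech impediment may speak more slowly, and a candidate
# recovering from a stroke may show less facial movement.
able_bodied_candidate = {"speech_rate": 148, "facial_movement": 0.78}
disabled_candidate = {"speech_rate": 90, "facial_movement": 0.20}

print(score(able_bodied_candidate, norm))  # high score: matches the learned 'norm'
print(score(disabled_candidate, norm))     # sharply reduced score

Run as written, the candidate who matches the training ‘norm’ scores in the nineties, while the candidate who speaks more slowly and moves their face less scores zero, even though the content of neither answer was ever evaluated.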

In May 2022, the U.S. Equal Employment Opportunity Commission (EEOC) released a new set of guidelines on the use of software and algorithms during the hiring process. The statement outlined the ways such tools can violate the Americans with Disabilities Act (ADA) and what employers should do to avoid those violations and adhere to the law. It emphasizes reasonable accommodations, which require employers to provide resources and alternative methods during the hiring process. Though there have been efforts to reduce discrimination and AI bias in hiring platforms, it remains extremely difficult for applicants to prove that such programs are inherently biased and discriminatory. Because disability covers such a broad range of conditions, and because most job applicants never share the status of their applications with one another, it is hard to identify a pattern of multiple candidates facing the same discrimination. Additionally, holding these programs accountable requires the employer and the applicant to collaborate, which employers have little incentive to do.

Discriminatory practices and bias within platforms used to “sort out” individuals with disabilities reveal the need to reexamine how we perceive and utilize artificial intelligence. The fallacy that AI is a flawless solution for improving how employers hire candidates deepens the divide between the disabled community and able-bodied individuals. The lack of representation during the development of these programs prevents minorities, including disabled people, from being accounted for, leading to biased practices becoming embedded in the technology. Reexamining how we see AI and understanding its faults will allow platforms to be held accountable and encourage their makers to consider all groups when creating, testing, and implementing products. With hiring platforms becoming more popular in the job market, violations of ADA regulations need to be brought to light in order to avoid further marginalizing the disabled community.

Originally Posted: 4 August 2023

Sources: 

AI Now Institute: Disability, Bias, and AI - Report (2019)

EEOC - The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees

Expanding Employment Success for People with Disabilities (2018)

Addressing Disability Bias in AI - Allerin Blog