Artificial Intelligence in Hiring and Discrimination Once Again Make Headlines


As Hire Image predicted in our 2023 Background Screening Predictions, artificial intelligence (AI) continues to be an issue for employers, with increased discrimination cases and regulatory activity focused on the hiring process. In fact, one recent class action lawsuit highlights the exact risks detailed in our predictions.

Some Background on AI in Employment Decisions

Over the past several years, there has been increased use of a wide variety of technological tools, including AI, for hiring, monitoring work performance, and determining salary and raises. For employers, the use of AI has become more prevalent due to its many benefits. It not only streamlines both the hiring and the employee evaluation processes, but also improves efficiency and accuracy, and results in cost savings.

However, as we’ve discussed, while AI can be helpful in evaluating prospective employees, it also presents challenges regarding the potential for discrimination. Despite the benefits, there are inherent problems with using AI, including the potential for bias against those in protected classes. Relying on AI to make hiring decisions or monitor employee performance could adversely impact and discriminate against certain individuals because it cannot adequately account for particular statuses. For example, if an algorithm detects a gap in employment, it may dismiss the candidate simply because of that gap, with no opportunity for the individual to explain that an injury or illness caused a disability that must now be accommodated under the Americans with Disabilities Act (ADA). Today, these types of instances are being examined more closely in the courts and by the Equal Employment Opportunity Commission (EEOC).

Class Action Lawsuit

A class action lawsuit was recently filed against the company Workday, claiming that its AI screening tools disproportionately and discriminatorily disqualify Black, older, and disabled job applicants. The named plaintiff is a Black man over the age of 40 who suffers from anxiety and depression. He alleges that he has applied for over 80 positions since 2018 and was denied each time, despite his qualifications; the one commonality was that all of the employers used Workday as a screening tool.

Underlying the claim is the assertion that the AI Workday uses unlawfully favors applicants outside of protected classes through its reliance on algorithms and inputs. Because these are created by humans with conscious or unconscious biases (or perhaps some of both), those biases then carry over into the algorithms themselves. In response, a Workday spokesperson noted its “risk-based review process … to help mitigate any unintended consequences, as well as extensive legal reviews to help ensure compliance with regulations.”

How this case plays out, along with others, will be important not only for future AI and discrimination cases, but also for the future of AI use overall in employment related decisions.

EEOC Actions

This case comes on the heels of the EEOC’s public hearing “Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier,” exploring the risks and benefits of using AI in employment decisions. During the hearing, the EEOC listened to witnesses ranging from computer scientists, civil rights advocates, and legal experts to an industrial-organizational psychologist and employer representatives.

“The use and complexity of technology in employment decisions is increasing over time,” said EEOC Chair Charlotte A. Burrows. “The goals of this hearing were to both educate a broader audience about the civil rights implications of the use of these technologies and to identify next steps that the Commission can take to prevent and eliminate unlawful bias in employers’ use of these automated technologies. We will continue to educate employers, workers and other stakeholders on the potential for unlawful bias so that these systems do not become high-tech pathways to discrimination.”

The hearing was a part of the EEOC’s ongoing AI and Algorithmic Fairness Initiative, working to ensure that the use of software, including AI used in hiring and other employment decisions, complies with workplace discrimination laws. As part of the initiative, the EEOC is issuing technical assistance to provide guidance on the use of AI in employment decisions, identifying best practices, holding sessions with key stakeholders, and gathering information about the impact of hiring and other employment-related technologies. Through these actions, the EEOC is more closely examining how AI technologies ultimately change the way employment decisions are being made. The goal is to help employers, employees, and job applicants ensure that the impacts are fair and consistent with federal equal employment opportunity laws.

By issuing technical assistance, the EEOC also recently took a stance on how existing ADA requirements may apply to the use of AI in employment-related decision making and how these tools may disadvantage job applicants and employees with disabilities. The technical assistance details various instances of algorithmic decision-making tools as they relate to the ADA in general, reasonable accommodations, the screening out of qualified individuals with disabilities, and medical examinations. It also provides recommended practices to help employers comply with the ADA when using AI decision-making tools, along with tips for job applicants and employees who believe their rights may have been violated.

While many employers will continue to use AI tools to save time and money, more of them need to understand how the algorithms work and what that means for the decisions they are making. Simply put, they must shift their perspective and take a serious look at the unintended consequences of the software they are using, or face serious repercussions from the EEOC. Employers should pay close attention to these regulatory and litigation developments throughout 2023 and beyond. While technology can be a great asset, it can also present serious challenges. As with everything else, employers must know how to minimize these risks to their organizations.


At Hire Image, we work with employers to understand the EEOC’s guidance as it relates to their hiring practices. Please contact us if you have any questions about the laws and rules governing background checks and drug screening.
