New EEOC Guidance Clarifies Employer Responsibility for Discrimination in AI Employment Tools
Many employers have begun using artificial intelligence (AI) tools supplied by third-party vendors. On May 18, 2023, the Equal Employment Opportunity Commission (EEOC) issued guidance indicating that, in its view, employers are generally liable for the outcomes of the selection tools they use to make employment decisions, even when those tools are supplied by third parties.
The EEOC's new technical guidance, titled "Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964," details how the EEOC understands Title VII to apply to the use of algorithmic decision-making tools in employment decisions.
What Tools Are Covered
The EEOC's guidance begins with a broad definition of covered selection tools often used by employers. Algorithmic decision-making tools include software, AI, and automated systems. The new guidance offers examples of algorithmic decision-making tools, including the following:
- Resumé scanners that prioritize applications using certain keywords.
- Employee monitoring software that rates employees on the basis of their keystrokes or other factors.
- "Virtual assistants" or "chatbots" that ask job candidates about their qualifications and reject those who do not meet predefined requirements.
- Video-interviewing software that evaluates candidates based on their facial expressions and speech patterns.
- Testing software that provides "job fit" scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived "cultural fit" based on their performance on a game or on a more traditional test.
Evaluating Algorithmic Decision-Making Tools for Disparate Impact
After defining the covered tools, the EEOC's guidance explains that selection procedures should be evaluated under Title VII of the Civil Rights Act of 1964, specifically under the disparate impact theory. Title VII prohibits employment discrimination based on race, color, religion, sex, and national origin. Disparate impact occurs when a neutral test or selection procedure disproportionately excludes people based on a protected characteristic.
Under Title VII's three-part disparate impact test, employment discrimination exists where (1) a selection procedure has a disparate impact on a particular protected class, (2) the employer cannot establish that the procedure is job related for the position in question and consistent with business necessity, and (3) a less discriminatory alternative is available but not used.
Supplementing Title VII, the EEOC also relies on the Uniform Guidelines on Employee Selection Procedures (Guidelines) as support for the appropriate analysis. Notably, while the EEOC has adopted the Guidelines, the new technical guidance does not rise to the level of official EEOC regulations. Applying the legal test to the range of tools, the EEOC has stated that employers can be held responsible under Title VII for the use of such tools, even if the tools are designed or administered by a third party.
Monitoring for Adverse Impact Discrimination in Algorithmic Decision-Making Tools
After setting forth the legal test, the guidance turns to how employers can mitigate the risk of disparate impact, which the new EEOC guidance refers to as "adverse impact discrimination," when using algorithmic decision-making tools. The EEOC does not establish a new policy for algorithmic decision-making tools, but rather applies the principles already established under Title VII.
According to the EEOC, employers should:
- Regularly monitor the use of algorithmic decision-making tools to determine whether they produce a substantially lower selection rate for any protected group. The Guidelines recommend, but do not require, that employers apply the "four-fifths rule," under which one rate is "substantially" different from another if their ratio is less than four-fifths (or 80%). The EEOC recognizes that tests of statistical significance (such as the standard deviation test) may be more appropriate depending on the sample size or other statistical considerations and suggests that employers and vendors use the test appropriate to their circumstances. Both calculations are illustrated in the sketch after this list.
- If monitoring demonstrates an adverse impact, the employer should determine whether the use of the algorithmic decision-making tool is job-related and consistent with business necessity.
- If the use of the algorithmic decision-making tool is job-related and consistent with business necessity, the employer should still explore less discriminatory alternatives and implement alternatives if available.
- If the employer relies on a vendor or third party to develop or administer an algorithmic decision-making tool, the employer should ask the vendor what process, if any, it uses to determine whether use of the tool might have an adverse impact.
- If, during the development of an algorithmic decision-making tool, the employer discovers an effective alternative to reduce the adverse impact, the employer should adopt the alternative tool. The new EEOC guidance explains that employers may be liable for failure to adopt a less discriminatory algorithm that was considered during the development process.
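To make the monitoring step concrete, the minimal Python sketch below computes selection rates for two hypothetical applicant groups, the four-fifths ratio described above, and a simple two-proportion z-statistic of the kind commonly used as a "standard deviation" test. All counts, group labels, and function names are invented for illustration; neither the EEOC guidance nor the Guidelines prescribes a particular implementation.

```python
# Hypothetical sketch of the four-fifths comparison and a standard
# deviation-style significance test. All numbers below are invented.
from math import sqrt


def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants who were selected."""
    return selected / applicants


def four_fifths_ratio(rate_a: float, rate_b: float) -> float:
    """Ratio of the lower selection rate to the higher one.

    Under the Guidelines' rule of thumb, a ratio below 0.80 suggests
    a "substantially" different selection rate.
    """
    low, high = min(rate_a, rate_b), max(rate_a, rate_b)
    return low / high


def two_proportion_z(sel_a: int, n_a: int, sel_b: int, n_b: int) -> float:
    """Two-proportion z-statistic, one common "standard deviation" test.

    A |z| of roughly 2 or more is often treated as statistically
    significant, though the right test depends on sample size and context.
    """
    p_a, p_b = sel_a / n_a, sel_b / n_b
    pooled = (sel_a + sel_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se


# Invented example: 48 of 80 Group A applicants selected vs. 12 of 40 in Group B.
rate_a = selection_rate(48, 80)  # 0.60
rate_b = selection_rate(12, 40)  # 0.30

print(f"Selection rates: {rate_a:.0%} vs. {rate_b:.0%}")
print(f"Four-fifths ratio: {four_fifths_ratio(rate_a, rate_b):.2f}")  # 0.50
print(f"z-statistic: {two_proportion_z(48, 80, 12, 40):.2f}")         # about 3.10
```

In this invented example, the ratio of 0.50 falls well below the four-fifths threshold of 0.80, and the z-statistic of roughly 3.1 would also typically be read as statistically significant, so either approach would flag the disparity and trigger the job-relatedness and less-discriminatory-alternative analysis described above.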
The recent EEOC guidance is part of the agency's Artificial Intelligence and Algorithmic Fairness Initiative, launched in 2021, and provides a useful starting point for employers using these tools. The new guidance, however, neither describes the EEOC's position across the fast-changing legal landscape more broadly nor provides significant technical detail about algorithmic tools. For a deeper dive into these topics, see our previous Update on the National Institute of Standards and Technology (NIST) study and recommendations for employers. Employers with operations in New York may also want to consult our previous Update on New York's final rules implementing NYC's Local Law 144. The use of algorithmic tools in employment decisions remains an evolving topic, and employers should seek experienced counsel to mitigate legal risks.
© 2023 Perkins Coie LLP