
The Growing Regulation of AI-Based Employment Decision Tools

The growing use of video and automated technology, including artificial intelligence (AI), in employment practices—and the concern that the technology may foster bias—has triggered a wide array of regulatory efforts. 

These technologies include advanced resumé screening software, algorithms that analyze video interviews, and employee surveillance tools. Lawmakers are increasingly concerned that these tools foster bias and produce unfair results.

Automated employment decision tools (AEDTs) are one application of tools sometimes referred to as "automated decision tools" or "automated decision systems." Automated decision-making technologies replace discretionary decision-making by a human being. They can include systems or software programs that use computation, such as machine learning (ML), statistics, data analytics, data processing, or AI, to make decisions. Other technologies of interest include video tools capable of analyzing facial expressions during employment interviews and tools that monitor employee behavior.

At least 11 bills have been introduced targeting the use of AI-related technology to assist with employment decisions. And the Equal Employment Opportunity Commission (EEOC) has trumpeted a settlement related to its determination that an employer discriminated against applicants through its use of automated screening software. Below is a summary of enacted and proposed legislation and enforcement efforts related to technology used to make employment decisions.

Enforcement

Equal Employment Opportunity Commission (EEOC)

In 2022, the EEOC sued iTutorGroup, Inc. after its investigation revealed that the company violated the Age Discrimination in Employment Act (ADEA) when its recruiting software rejected more than 200 older applicants. The ADEA prohibits employers from discriminating against employees and applicants aged 40 and over based on their age. According to the EEOC, the company's automated decision tools automatically rejected female applicants aged 55 and over and male applicants aged 60 and over. The age bias was discovered when one of the applicants was offered an interview only after reapplying for the same job using a more recent birthdate. The EEOC settled with iTutorGroup in August 2023, with the company agreeing to pay $365,000 in total to applicants who were rejected based on age.

This was the first time the EEOC sued an employer for AI-related employment discrimination. The lawsuit came after the EEOC announced its focus on employers' use of automated systems, including AI and ML, in its strategic enforcement plan.

Enacted Legislation

Illinois

In 2020, Illinois enacted the Artificial Intelligence Video Interview Act to regulate the use of AI analysis on video interviews. Illinois employers that intend to use AI to analyze applicant video submissions are prohibited from asking applicants to submit video interviews unless they (1) notify the applicant before the interview that AI may be used to assess the applicant, (2) provide the applicant with information before the interview about how the AI works and what characteristics will be used to evaluate applicants, and (3) obtain the applicant's consent to be evaluated by AI. Employers may share the videos only with persons whose expertise or technology is necessary to evaluate the applicant. Additionally, employers must delete a video interview, and any copies of it, within 30 days after an applicant requests deletion.

Another Illinois law, the Biometric Information Privacy Act (BIPA), has also formed the basis of private litigation related to AI use. The act regulates the use of a person's private biometric data, such as retina or iris scans, fingerprints, voiceprints, or scans of facial geometry. Employers must obtain an employee's written consent before collecting biometric data, which may include data collected using AI. In 2022, plaintiffs in Deyerler v. HireVue filed suit under the act, alleging that the defendant used an online video interview platform to screen candidates using facial geometry scanning and tracking powered by AI. Also in 2022, in McDonald v. Symphony Bronzeville Park, the Illinois Supreme Court confirmed that employees have a right to sue under the act.

Maryland

In 2020, Maryland enacted HB 1202, which prohibits an employer from using certain facial recognition services during an applicant's interview for employment unless the applicant consents. The applicant may consent by signing a waiver verifying (1) the applicant's name, (2) the date of the interview, (3) that the applicant consents to the use of facial recognition during the interview, and (4) whether the applicant read the consent waiver.

New York City

On July 5, 2023, New York City began enforcing New York City Local Law 144, which prohibits employers and employment agencies from using AEDTs unless (1) the tool has been subject to a bias audit within one year of the use of the tool, (2) information about the bias audit is publicly available, and (3) certain notices have been provided to employees or job candidates, including a notice that candidates can request an alternative selection process or accommodation. Violators are subject to civil penalties. Employers with operations in New York should read our previous Update on Local Law 144 for a more in-depth understanding of employer obligations.

Texas

Texas recently passed HB 2060, creating the Artificial Intelligence Advisory Council to monitor Texas state agencies' use of AI, including the Texas Workforce Commission, which used a chatbot to clear its backlog of unemployment claims in 2020.

Proposed Legislation

Federal Legislation

The No Robot Bosses Act was introduced on July 20, 2023, to prohibit certain uses of automated decision systems by employers. If enacted, the bill would prohibit employers from relying exclusively on an automated decision system in making employment-related decisions. Employers that use automated decision systems would be required to:

  • Pretest and validate the system for compliance with anti-discrimination laws, lack of potential discriminatory impact, and compliance with the Artificial Intelligence Risk Management Framework (AI RMF) developed by the National Institute of Standards and Technology (NIST) or a successor framework.
  • Make public the results of annual, independent testing for discriminatory impact or potential bias.
  • Timely disclose to the applicant that the employer uses or intends to use the system and provide a description and explanation of the system.
  • Corroborate the system's output via human oversight.
  • Provide applicants with a description and explanation of the input data and output generated by the system and the reason for using that information no later than seven days after making the employment decision.
  • Allow the applicant to dispute the system's output to a human reviewer.
  • Train individuals operating the system on how to properly use the system, including the potential for bias.

The No Robot Bosses Act would also allow employees who are managed through an automated decision system to opt out of such management in favor of a human manager.

California

California is considering two initiatives aimed at regulating the use of automated technology. First, the California Civil Rights Council has proposed modifications to the state's anti-discrimination statute, which, if adopted, would hold employers liable for the use of AEDTs that have a discriminatory impact against an applicant or employee based on a protected characteristic.

California lawmakers also introduced AB 331 on January 30, 2023, which would prohibit employers from using automated decision tools in a way that contributes to algorithmic discrimination. The bill would require that:

  • Employers perform an impact assessment for any automated decision tools in use.
  • Employers provide notice regarding the use of AEDTs and allow a person to request an alternative process or accommodation.
  • Developers provide the employer with a statement regarding the tool's intended uses.
  • Developers or employers maintain a program that governs the risks of algorithmic discrimination.
  • Employers make their policy regarding the tool's use publicly available.

Massachusetts

Massachusetts HB 1873 was introduced on February 16, 2023. One of its purposes is to restrict the use of automated technology when making employment-related decisions. If enacted, the bill would require employers and vendors that use automated decision systems to provide employees with certain notices and maintain an updated list of automated decision systems in use. The proposed law would prohibit employers and vendors from using automated decision systems in ways that "result in violations of labor and employment laws."

New Jersey

New Jersey Bill A4909 was introduced on December 5, 2022, to regulate the use of automated tools in hiring decisions to minimize employment discrimination. If enacted, the bill would prohibit the sale of an automated employment decision tool unless the tool was subject to a bias audit. The proposed law would also require employers that use these tools to provide certain notices to candidates.

New York

New York State Bill S07623, introduced on August 4, 2023, would, if enacted, restrict the use of electronic monitoring and automated decision tools. Similar to the New York City law, employers and employment agencies would be prohibited from using an AEDT unless (1) the tool has been the subject of a bias audit within one year prior to use, (2) the results of the audit have been made public, and (3) certain notices have been given to employees and candidates for employment.

The bill would also prohibit employers and employment agencies from using electronic monitoring tools unless the tool (1) is intended to accomplish a purpose allowed under the law, (2) accomplishes that purpose by the least invasive means possible, and (3) is limited to the smallest number of workers and collects the least amount of data needed to carry out that purpose. Employers that use electronic monitoring tools would also be subject to certain notice, disclosure, collection, and destruction requirements and would be prohibited from relying solely on employee data collected through electronic monitoring when making hiring, promotion, termination, disciplinary, or compensation decisions.

Vermont

Vermont HB 114 was introduced on January 26, 2023, to restrict the use of electronic monitoring of employees and employment-related automated decision systems. Among other restrictions, the proposal would prohibit employers from using automated decision systems in a manner that results in violations of state and federal laws. The measure would also ban employers from relying solely on information from an automated decision system when making employment-related decisions.

The proposal also targets workplace surveillance by prohibiting employers from electronically monitoring employees unless the monitoring is for limited, enumerated purposes. The bill would also require notice to employees before the employer could engage in monitoring.

Washington, D.C.

The Washington, D.C., Stop Discrimination by Algorithms Act of 2023 was introduced on February 2, 2023. If enacted, the bill would prohibit the use of algorithms to make discriminatory employment decisions based on protected traits, including actual or perceived race, color, religion, national origin, sex, gender identity or expression, sexual orientation, familial status, source of income, or disability. Employers would be required to (1) provide individuals with notice of how they use personal information in algorithmic decisions (including when use of the system results in adverse action), (2) audit for discriminatory impact, and (3) annually report the audit results to the Office of the Attorney General.

Takeaways for Employers

Employers should take note of enacted and proposed legislation and consult with legal counsel before implementing automated employment technologies. For recommendations on how to regulate the use of AI and other automated decision tools, read our previous Update.

© 2023 Perkins Coie LLP
