New York City mandates AI bias analysis - what employers need to know

Jan 06, 2023

Businesses have become increasingly reliant on artificial intelligence (AI) to assist with hiring, promotion, and other employment-related tasks. These tools are facing increased scrutiny from regulators, especially in New York City, which has passed the first law in the United States (Local Law 144) requiring employers to conduct bias audits of AI-enabled tools used to assist with or make employment decisions. The Law refers to these tools as automated employment decision tools (AEDTs). Local Law 144 applies to employers and employment agencies alike and sets forth several steps that employers must take before implementing or continuing to use AEDTs.

What Qualifies as an AEDT?

Businesses that use AI-driven hiring or promotion tools must answer the critical question of whether those tools constitute AEDTs within the meaning of Local Law 144. Local Law 144 defines an AEDT as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.”

In response to public comment, the New York City Department of Consumer and Worker Protection (DCWP) issued proposed rules clarifying when a process is designed “to substantially assist or replace discretionary decision making,” namely when the employer is:

  1. Relying solely on the tool’s simplified output (score, tag, classification, ranking, etc.) without considering other factors;
  2. Using the tool’s simplified output as one of a set of criteria where the output is weighted more than any other criterion in the set; or
  3. Using the tool’s simplified output to overrule or modify conclusions derived from other factors.

With this clarification, businesses can safely assume that an AI-driven hiring or promotion tool that is relied on to make the final hiring decision, or that is used to modify conclusions derived from other factors, including human decision-making, is likely to be in scope for Local Law 144.

What Does the Law Require?

Employers who utilize AEDTs must:

  1. Subject each AEDT to a bias audit, conducted by an independent auditor, no more than one year before the tool’s use (an illustrative sketch of the impact-ratio arithmetic such an audit reports appears after this list);
  2. Ensure that the date of the most recent bias audit and a “summary of the results,” along with the distribution date of the AEDT, are publicly available on the career or jobs section of the employer’s or employment agency’s website;
  3. Provide each resident of NYC who has applied for a position (internal or external) with a notice that discloses that their application will be subject to an automated tool, identifies the specific job qualifications and characteristics that the tool will use in making its assessment, and informs candidates of their right to request an alternative selection process or accommodation (the notice shall be issued on an individual basis at least 10 business days before the use of a tool); and
  4. Allow candidates or employees to request alternative evaluation processes as an accommodation.
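
For context, the DCWP’s proposed rules describe the required bias audit in terms of selection rates and impact ratios for demographic categories. The sketch below is a minimal, hypothetical illustration of that arithmetic only; the category labels and applicant data are assumptions, and an actual bias audit must be performed by an independent auditor.

```python
# Hypothetical sketch of the impact-ratio arithmetic contemplated by the
# DCWP's proposed rules: compute each category's selection rate, then divide
# it by the highest category's selection rate. Data and labels are assumed;
# this is not a substitute for an independent auditor's analysis.
from collections import defaultdict

# Assumed applicant records: (demographic category, whether selected).
applicants = [
    ("category_a", True), ("category_a", False), ("category_a", True),
    ("category_b", True), ("category_b", False), ("category_b", False),
]

totals = defaultdict(int)
selected = defaultdict(int)
for category, was_selected in applicants:
    totals[category] += 1
    if was_selected:
        selected[category] += 1

# Selection rate: share of applicants in each category who were selected.
selection_rates = {c: selected[c] / totals[c] for c in totals}
highest_rate = max(selection_rates.values())

# Impact ratio: each category's selection rate relative to the highest rate.
impact_ratios = {c: rate / highest_rate for c, rate in selection_rates.items()}

for category in sorted(impact_ratios):
    print(f"{category}: selection rate {selection_rates[category]:.2f}, "
          f"impact ratio {impact_ratios[category]:.2f}")
```

Under the proposed rules, the categories would generally track the sex and race/ethnicity categories used in federal EEO reporting; the toy data above is illustrative only.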

Enforcement & Penalties

The Law went into effect on January 1, 2023, but in response to the volume of public comments received in connection with the proposed rulemaking, the DCWP has announced that it will postpone enforcement until April 15, 2023. Local Law 144 provides for enforcement by the NYC Corporation Counsel, and per the penalty schedule published by the DCWP, penalties range from $375 to $1,500 for each violation.

Next Steps

Organizations with employees in New York City should take advantage of the brief reprieve in enforcement and work with their vendors and outside counsel to assess their use of AI-driven tools for hiring and promotion decisions. In particular, for each AI-driven tool in use, businesses should determine (1) the tool’s decision-making objectives, including what the model is trained to detect and what data was used to train it; (2) what data the tool collects; (3) what evaluative criteria the tool applies when filtering out candidates; and (4) what assurances have been provided regarding bias.

Once that evaluation is complete, businesses can then determine whether their process qualifies as an AEDT, and establish a roadmap for compliance with Local Law 144.

It is also important to note that New York City is not the only governmental body scrutinizing these tools. California, Illinois, and Maryland also have laws establishing employer duties for the use of AI-enabled tools. The requirements under those state laws are less onerous and will be covered in subsequent alerts. Use of AI-enabled tools may also implicate federal laws, including the Fair Credit Reporting Act (FCRA) and the Americans with Disabilities Act (ADA), with both the US Department of Justice and the US Equal Employment Opportunity Commission cautioning employers against “blind reliance” on AI-enabled tools.


Bryan Cave Leighton Paisner LLP has a team of knowledgeable data privacy and employment lawyers. If you or your organization would like more information on this topic, please contact any attorney in our Data Privacy and Security practice group.

Related Practice Areas

  • Data Privacy & Security

This material is not comprehensive, is for informational purposes only, and is not legal advice. Your use or receipt of this material does not create an attorney-client relationship between us. If you require legal advice, you should consult an attorney regarding your particular circumstances. The choice of a lawyer is an important decision and should not be based solely upon advertisements. This material may be “Attorney Advertising” under the ethics and professional rules of certain jurisdictions. For advertising purposes, St. Louis, Missouri, is designated BCLP’s principal office and Kathrine Dixon (kathrine.dixon@bclplaw.com) as the responsible attorney.