A lack of clarity remains as NYC restricts the use of AI in hiring

The law addresses EEOC and DOJ concerns that reliance on AI tools during the hiring process could cause companies to violate the Americans with Disabilities Act. HR teams use AI recruiting tools throughout the hiring process, from posting job ads to analyzing resumes and crafting compensation packages.

A company’s goal is, of course, to find a candidate with the right qualifications. Yet every stage of this process can be tainted by bias, especially when the employer’s “model” candidate is derived from its existing workforce. Amazon had to scrap a recruiting tool that assessed applicants based on a decade of submitted resumes after the algorithm taught itself to penalize resumes containing the word “women’s.”

AI tools used to assess candidates during interviews can pose similar problems. A tool that evaluates speech patterns in a video interview can screen out candidates with speech impairments, while one that tracks keyboard inputs may eliminate candidates with arthritis or other dexterity limitations.

The same can be said for AI tools that claim to identify candidates with the right personality for the job. The problem is both technical and philosophical: such tools have returned different scores when the same resume was submitted as raw text versus as a PDF file.

AI is not magic. If you don’t teach it what to look for, and don’t validate it with the scientific method, its predictions won’t be any better than a guess.

New York City’s law is part of a broader trend at the state and federal levels

At the federal level, Congress introduced the American Data Privacy and Protection Act earlier this year, along with the Algorithmic Accountability Act, which would require “impact assessments” of automated decision-making systems used for, among other things, employment-related decisions. California, meanwhile, is considering adding liability related to AI recruiting tools to its anti-discrimination laws.

It is also unclear what a technology audit looks like – or how it should be conducted – given the limited guidance released by New York City officials ahead of the law taking effect on Jan. 1, 2023. To comply, employers will need a partner with business analytics and data skills.

For those advocating stronger regulation of such tools, the biggest challenge is getting policymakers to drive the conversation. The goal is to ensure policymakers understand that these tools need a real auditing requirement, along with meaningful disclosure and accountability for any discrimination they produce. Advocacy groups say there is still a lot of work to do: the tools are already on the market, and policy is not keeping pace with technological change.


Give yourself more time, better talent, and maximize the power of the people in your organization. Have questions or want to talk challenges and viable solutions? We’re here for you.
