AI Hiring Tools Will Require a Bias Audit in NYC

Employers will need to publish the audit's results and notify affected job candidates in advance of the use of automated employment decision tools.

To the cost of searching for, interviewing, and onboarding job candidates, add one more line item if you are hiring people in New York City: the expense of hiring someone to do an independent audit of the software you use to screen candidates.

A new New York City law is trying to bring transparency to the use of artificial intelligence tools in hiring and promoting employees. NYC 144 prohibits employers or employment agencies from using an automated employment decision tool (AEDT) to make an employment decision unless the tool is audited annually for bias.

The law also requires employers to notify candidates or employees at least 10 days before using the tools, and the employer must make the results of the bias audit publicly available.

Employers that violate the law (including its notice requirements) will be subject to fines of up to $500 for their first violation and up to $1,500 for each subsequent violation. It takes effect on July 5.

While there are federal anti-discrimination laws, "I think New York was worried about how do [they] get applied to automated decision-making," said Shea Brown, chief executive of BABL AI, an algorithm auditing firm. "I think the concern from [New York's] city council was that some of these automated tools can obscure discrimination."

The law defines an AEDT as a tool that “substantially assists or replaces” an employer’s discretion in making employment-related decisions. Included in that would be software that screens applicant resumes and discards those that don’t meet specific criteria or tools that score, classify, or rank job applicants or employees based on one particular factor (e.g., rating a candidate’s technical skills).

Commenters on the initial draft of the law homed in on the ambiguity in the law's definition of an AEDT.

But even after the definition in the law was revised, “it still creates a lot of uncertainty for companies to determine whether or not the various technological tools they use as part of the employment process need to be subjected to a bias audit,” Robert T. Szyba, a partner in labor and employment law at Seyfarth Shaw, told CFO.

The bias audit has to be done by an independent auditor that does not have financial or employment ties to the tool’s developer or the employer, and the audit must use data from the employer or employment agency’s own historical use of the tool. If such data does not exist or wouldn’t be statistically significant, the independent auditor could use historical data of other employers or employment agencies.

The vendor can hire an auditor to do the bias audit, but the employer has to have contributed its data as part of the testing. Indeed, "a smaller company that's using these tools should push the audit requirements back on the vendor because they won't have enough data to do a real audit," said Brown.

The bias audit aims to assess whether the AEDT has a disparate impact on a group of job applicants or employees based on a particular protected characteristic, such as sex, race, and ethnicity. The final rule states that the bias audit must calculate the “impact ratio” of each protected category. The employer has to publish the impact ratios but does not have to supply context — it’s left up to the job candidate to determine whether the results show evidence of bias. 
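The impact-ratio calculation the rule describes can be illustrated with a short sketch. The group names and counts below are hypothetical, used only to show the arithmetic: each category's selection rate is divided by the selection rate of the most-selected category, so a ratio well below 1.0 suggests a possible disparate impact.

```python
# Hypothetical illustration of an "impact ratio" calculation.
# Selection rate = candidates selected / candidates assessed, per category;
# impact ratio = a category's selection rate / the highest category's rate.
# All group names and numbers here are made up for demonstration.

applicants = {"Group A": 200, "Group B": 150, "Group C": 50}
selected   = {"Group A": 80,  "Group B": 45,  "Group C": 10}

# Selection rate for each category
rates = {group: selected[group] / applicants[group] for group in applicants}

# Impact ratio relative to the highest-rate category
top_rate = max(rates.values())
impact_ratios = {group: rate / top_rate for group, rate in rates.items()}

for group in impact_ratios:
    print(f"{group}: selection rate {rates[group]:.0%}, "
          f"impact ratio {impact_ratios[group]:.2f}")
```

In this made-up example, Group A's 40% selection rate sets the benchmark, so Group B (30%) has an impact ratio of 0.75 and Group C (20%) a ratio of 0.50 — figures a candidate would have to interpret on their own, since the law does not require the employer to supply context.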

The problem is that it is difficult to get information on job candidates' protected classes, said Courtney Stieber, a partner at Seyfarth Shaw. Employers often intentionally avoid gathering that information, Stieber said, precisely to ensure they are not engaging in discriminatory behavior.

To inform job applicants that an AEDT will be used in connection with the employment decision (and how that tool will be used in assessing the applicant or employee), an employer can publish the information on the employment section of the employer’s website; include it in the relevant job posting; or send it to a job applicant or employee via U.S. mail or e-mail. 

While that might not be very onerous, the requirement to wait 10 days, said Szyba, could burden large employers, like some hospitals, that “need to use certain algorithms or technology to process a large volume of candidates quickly.”

While the NYC law, passed in November 2021, is part of a well-intentioned effort to address algorithmic bias nationally and in individual states, Stieber said NYC 144 may not be that helpful for job candidates.

“Good intentions can end up backfiring where the law becomes so onerous and burdensome to an employer that it either delays the hiring process or is structured in some way that ends up detrimental to the candidate. And this may be one of those cases.”