NYC begins enforcing new law targeting bias in AI hiring tools

New York City’s Automated Employment Decision Tool (AEDT) law, believed to be the first in the U.S. aimed at reducing bias in AI-driven recruitment and employment decisions, will now be enforced — after the law went into effect in January and final rules were adopted in April.

Under the AEDT law, it is unlawful for an employer or employment agency to use artificial intelligence and algorithm-based technologies to evaluate NYC job candidates and employees, unless it conducts an independent bias audit before using the AI employment tools. The bottom line: compliance obligations around these AI tools fall on New York City employers, not on the software vendors who create them.

Technically speaking, the law went into effect on January 1, but as a practical matter companies could not easily comply because the law did not spell out how the required bias audit should be conducted. The city’s Department of Consumer and Worker Protection has since published an FAQ meant to provide more detail.

Companies must complete an annual AI bias audit

According to the FAQ, the bias audit must be done each year, be “an impartial evaluation by an independent auditor” and, at a minimum, “include calculations of selection or scoring rates and the impact ratio across sex categories, race/ethnicity categories, and intersectional categories.”
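
By way of illustration, here is a minimal sketch in Python of the kind of arithmetic the FAQ describes: computing a selection rate for each category and dividing it by the rate of the most-selected category to produce an impact ratio. The data, column names and helper function here are hypothetical, and this sketch is not a substitute for the independent audit the law requires.

```python
# Minimal sketch (not legal guidance): selection rates and impact ratios
# along the lines described in the DCWP FAQ. All data and column names
# below are hypothetical.
import pandas as pd

# Hypothetical applicant records: one row per candidate the tool screened.
data = pd.DataFrame({
    "race_ethnicity": ["White", "White", "Black", "Black",
                       "Hispanic", "Hispanic", "Asian", "Asian"],
    "sex": ["M", "F", "M", "F", "M", "F", "M", "F"],
    "selected": [1, 1, 0, 1, 1, 0, 1, 1],  # 1 = advanced by the tool
})

def impact_ratios(df: pd.DataFrame, category_col: str) -> pd.DataFrame:
    """Selection rate per category, divided by the highest category's rate."""
    rates = df.groupby(category_col)["selected"].mean().rename("selection_rate")
    out = rates.to_frame()
    out["impact_ratio"] = out["selection_rate"] / out["selection_rate"].max()
    return out

# The FAQ calls for sex, race/ethnicity, and intersectional categories.
print(impact_ratios(data, "sex"))
print(impact_ratios(data, "race_ethnicity"))

# Intersectional categories: cross sex with race/ethnicity.
data["intersectional"] = data["sex"] + " / " + data["race_ethnicity"]
print(impact_ratios(data, "intersectional"))
```

In practice the calculation must be performed, and attested to, by an independent auditor; the sketch only shows why Ray characterizes the math itself as simple once the data is collected.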

The law requires employers and employment agencies to comply with “all relevant Anti-Discrimination laws and rules to determine any necessary actions based on the results of a bias audit,” and to publish a summary of the results of the most recent bias audit.

According to Niloy Ray, a shareholder at labor and employment law firm Littler, compliance with the law shouldn’t be particularly difficult in most cases, but it does require collaboration between the third-party vendors that create AI hiring tools and the companies using them.

“The law has a pretty dense description of the technologies to which it applies, so that requires understanding how the tool works,” said Ray. “They are going to have to explain it enough to help companies [do the bias audit], so that’s a good outcome.”

That said, there are edge cases where it may be more challenging to determine whether the law applies. For example, what happens if the job is a fully remote position? Does New York City have jurisdiction over that role?

“Those edge cases get a little more confusing, but I think generally it’s still easy as long as you can understand the technology,” Ray said. “Then it’s just a question of collecting the data and performing simple arithmetic on the data.”

Ray pointed out that New York City is not the only jurisdiction considering this kind of law governing AI bias in hiring tools. “California, New Jersey, Vermont, Washington D.C., Massachusetts, they all have versions of regulations working their way through the system,” he said.

But in New York City, any large company that is hiring likely already has what it needs for compliance, he added. For smaller companies, the vendors from which they acquire tools have probably already completed that bias audit.

“If you’re working with a tool you didn’t develop but procure from a third party, go to them right away and discuss what they can do to help you be in compliance,” he said. “On the internal side, you may have to reach out to your legal counsel, someone who is doing this for several [dozen] or hundreds of corporations, and they will be able to give you a jumpstart with a framework quickly.”

Even for those who didn’t hit the July 5 enforcement deadline, it’s important to keep working toward compliance as efficiently as possible, and to document efforts to seek legal advice and help from vendors.

“It makes a huge difference if you say I stuck my head in the sand versus I saw the train coming, I couldn’t make it to the station, but I’m still trying to get it done,” Ray explained. “If you’re working in good faith, [they’re] not going to penalize you, [they’re] not going to bring enforcement actions, given the newness and the complexity of the law.”

Author: Sharon Goldman
Source: VentureBeat
