Businesses and their service providers are grappling with how to comply with New York City’s mandate for audits of artificial intelligence systems used in hiring.
A New York City law that comes into effect in January will require companies to conduct audits to assess biases, including along race and gender lines, in the AI systems they use in hiring. Under New York’s law, the hiring company is ultimately liable—and can face fines—for violations.
But the requirement has posed some compliance challenges. Unlike familiar financial audits, refined over decades of accounting experience, the AI audit process is new and without clearly established guidelines.
“There is a major concern, which is it’s not clear exactly what constitutes an AI audit,” said Andrew Burt, managing partner at AI-focused law firm BNH. “If you are an organization that’s using some type of these tools…it can be pretty confusing.”
The city law could affect a large number of employers. New York City in 2021 had just under 200,000 businesses, according to the New York State Department of Labor.
A spokesman for New York City said its Department of Consumer and Worker Protection has been working on rules to implement the law, but he didn’t have a timeline for when they might be published. He didn’t reply to inquiries about whether the city had a response to complaints about the purported lack of guidance.
Beyond the immediate impact in New York City, employers are confident that similar audit requirements will soon be adopted in far more jurisdictions, said Kevin White, co-chair of the labor and employment team at law firm Hunton Andrews Kurth LLP.
AI has steadily crept into many companies’ human-resources departments. Nearly one in four companies uses automation, AI or both to support HR activities, according to research the Society for Human Resource Management published earlier this year. The share rises to 42% among companies with more than 5,000 employees.
Other studies have estimated even higher levels of use among businesses.
AI technology can help businesses hire and onboard candidates more quickly amid a “war for talent,” said Emily Dickens, SHRM’s head of government affairs.
Boosters of the technology have argued that, used well, it also can help keep unfair biases from creeping into hiring decisions. A person might, for example, unconsciously favor a candidate who went to the same college or roots for the same team, whereas computers don’t have alma maters or favorite sports teams.
A human mind with its hidden motivations is “the ultimate black box,” unlike an algorithm whose responses to different inputs can be probed, said Lindsey Zuloaga, the chief data scientist at HireVue Inc. HireVue, which lists Unilever PLC and Kraft Heinz Co. among its clients, offers software that can automate interviews.
But, if companies aren’t careful, AI can “be very biased at scale. Which is scary,” Ms. Zuloaga said, adding that she supports the scrutiny AI systems have started to receive.
HireVue’s systems are audited for bias regularly, and the company wants to ensure customers feel comfortable with its tools, she said.
One audit of HireVue’s algorithms published in 2020, for example, found that minority candidates tended to be more likely to give short answers to interview questions, saying things like “I don’t know,” which would result in their responses being flagged for human review. HireVue changed how its software deals with short answers to address the issue.
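Audits like these generally come down to comparing outcomes across demographic groups. As a rough illustration of the kind of disparity metric auditors commonly compute, the sketch below calculates selection rates and impact ratios by group from hypothetical screening results; the data, group labels and the 0.8 threshold (the familiar “four-fifths” rule of thumb) are illustrative assumptions, not the methodology New York’s rules prescribe.

```python
# Hypothetical sketch of a bias-audit calculation: selection rates and
# impact ratios by demographic group. Data and threshold are illustrative.
from collections import defaultdict

# Each record: (group label, 1 if the tool advanced the candidate, else 0)
screening_results = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 1), ("group_b", 0), ("group_b", 0), ("group_b", 1),
]

# Tally candidates and selections per group.
totals, selected = defaultdict(int), defaultdict(int)
for group, passed in screening_results:
    totals[group] += 1
    selected[group] += passed

# Selection rate = selected / total for each group.
rates = {g: selected[g] / totals[g] for g in totals}

# Impact ratio = each group's rate divided by the highest group's rate.
best_rate = max(rates.values())
for group, rate in sorted(rates.items()):
    ratio = rate / best_rate
    flag = "review" if ratio < 0.8 else "ok"  # four-fifths rule of thumb
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")
```

A ratio well below 1 for any group would not by itself prove unlawful bias, but it is the sort of disparity an audit would flag for closer review.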
Businesses have concerns about the “opaqueness and lack of standardization” regarding what is expected in AI auditing, said the U.S. Chamber of Commerce, which lobbies on behalf of businesses.
Even more concerning is the possible impact on small businesses, said Jordan Crenshaw, vice president of the Chamber’s Technology Engagement Center.
Many companies have had to scramble to determine even the extent to which they use AI systems in the employment process, Hunton’s Mr. White said. Companies haven’t taken a uniform approach to which executive function “owns” AI. In some, human resources drives the process, and in others, it is driven by the chief privacy officer or information technology, he said.
“They pretty quickly realize that they have to put together a committee across the company to figure out where all the AI might be sitting,” he said.
Because New York hasn’t offered clear guidelines, he expects a range of approaches to the audits. But the difficulty of complying isn’t driving companies back to the processes of a pre-AI era, he said.
“It’s too useful to put back on the shelf,” he said.
Some critics have argued the New York law doesn’t go far enough. The Surveillance Technology Oversight Project, New York Civil Liberties Union and other organizations noted the lack of standards for bias audits, but pushed for tougher penalties in a letter sent before the law’s passage. They argued that companies selling tools deemed biased should themselves potentially face punishment, among other suggestions.
Regulators won’t necessarily be looking for perfection in the early days.
“The good faith effort is really what the regulators are looking for,” said Liz Grennan, co-leader of digital trust at McKinsey & Co. “Frankly, the regulators are going to learn as they go.”
Ms. Grennan said some companies aren’t waiting until the January effective date to act.
Companies are motivated as much by reputational risk as by the fear of a regulator taking action. For large corporations with high-profile brands, concerns about social impact and environmental, social and governance issues might outweigh concerns about being “slapped by a regulator,” said Anthony Habayeb, chief executive of AI governance software company Monitaur Inc.
“If I’m a larger enterprise…I want to be able to demonstrate that I know AI might have issues,” Mr. Habayeb said. “And instead of waiting for someone to tell me what to do…I built controls around these applications because I know like with any software, things can and do go wrong.”
Write to Richard Vanderford at richard.vanderford@wsj.com
Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.