Businesses look to self-regulate the use of AI in hiring

Government agencies at the federal, state and local level are looking to play catch-up on AI regulations, prompting companies to self-regulate their use of AI in hiring

A group of large companies has developed a set of principles and policies for self-regulating the use of artificial intelligence (AI) technologies in hiring, in response to the relative lack of government regulation on the subject.

A total of 18 companies worked with BBB National Programs to develop a pair of documents that will serve as a voluntary framework for self-regulation.

Among the publicly identified members of the working group were Amazon, Koch Industries, Microsoft, Qualcomm and Unilever. BBB National Programs cited data indicating that 99% of Fortune 500 companies rely on talent-sifting software, and 55% of human resources leaders use predictive algorithms in hiring.

Eric Reicin, president and CEO of BBB National Programs, told FOX Business the effort came together due to the "need for business to do the right thing, the need for business to self-regulate when the rules are not necessarily clear in government."



BBB National Programs and more than a dozen global companies developed a self-regulatory framework for using AI in hiring in the absence of governmental regulation on the subject. (iStock)

"The sheer number of applications that companies are seeing [means] there’s a use for AI technologies to work through those applications fairly and to mitigate the potential for bias that may exist in human-led decision-making in a recruitment and hiring process," Reicin said. "One of the organizations that was part of this incubator had over 20 million applications last year, and you can’t put a number of humans on that to handle all the decision-making for that.

"So you’re using AI-enabled tools, machine learning tools, to help in that decision-making process. And so the question becomes, when those tools are used, is the tool using AI in a way that is more inclusive, that reduces the potential for bias in human-led decision-making, or is it actually exacerbating the issues that some studies have shown."


The working group composed of BBB National Programs and senior legal and privacy representatives from large, global employers focused its AI Principles and Protocols on several key objectives. Those include ensuring that AI systems are valid and reliable; promoting equitable outcomes with harmful bias managed; increasing inclusivity; facilitating compliance, transparency and accountability; and striving for systems that are safe, secure, resilient, explainable, interpretable and privacy-enhanced.


The self-regulatory framework seeks to provide transparency to job seekers about when AI and machine learning are used in the hiring process. (iStock)

To be certified under the framework, employers must take steps such as notifying applicants about the use of AI processing, ensuring they fully understand how a vendor’s tool works, and monitoring the tool on an ongoing basis to mitigate biased outcomes. Certifications of compliance with the principles and protocols are performed by independent, third-party entities.


The self-regulatory framework doesn’t supersede any relevant federal, state or local laws governing the use of AI in hiring. While the federal government has yet to implement any major legislation or regulation on the subject, policymakers are starting to introduce proposals.

At the local level, New York City’s Automated Employment Decision Tools law, which took effect July 5, requires employers in the city that use AI or machine learning tools for hiring decisions to notify New York City-based candidates about the use of automated tools. The law also requires automated employment decision tools to be analyzed by an independent auditor for potential bias.


Artificial intelligence technologies are being used by companies to screen applicants in the hiring process. (Karl-Josef Hildenbrand/picture alliance via Getty Images)


BBB National Programs was formed in 2019 following the restructuring of the Council of Better Business Bureaus (CBBB) into a pair of independent nonprofit organizations, the other being the International Association of Better Business Bureaus (IABBB).

Following the restructuring, BBB National Programs became the home for the industry self-regulation programs that had been part of CBBB, some of which have been in existence for more than five decades. By contrast, the IABBB is the national headquarters for the Better Business Bureaus around the country that work to address business complaints and scams.

BBB National Programs and the IABBB operate independently of one another.

FOX Business’ Breck Dumas contributed to this report.