Experts weigh in on strengths, weaknesses of Biden's AI executive order

President Biden's wide-ranging executive order on artificial intelligence has AI experts tracking how agencies craft and implement regulations

The wide-ranging executive order on artificial intelligence (AI) safety and testing requirements signed by President Joe Biden has AI experts tracking how agencies fill in the gaps left by the broadly written order as its directives move into the rulemaking process.

Biden’s executive order instructed federal agencies to begin crafting rules and standards touching on an expansive set of topics, from watermarking AI-generated content and rooting out bias in AI systems to addressing national security concerns and setting testing protocols for generative AI tools.

While the executive order gives some of those regulatory areas relatively clear direction, others are left largely for agencies to define as they craft regulations in the weeks and months ahead, and the outcome will shape how businesses and consumers alike interact with AI moving forward.

Josh Gruenspecht, a partner at Wilson Sonsini, told FOX Business that in the near term, the most relevant items for AI firms will be the reporting and security testing requirements for companies with large-scale models or computing clusters above a certain threshold of computing power, and for cloud computing providers that offer substantial amounts of computing power to foreign customers.

President Joe Biden's AI executive order on the "Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence" has agencies starting the rulemaking process in a variety of areas. (Photo by Celal Gunes/Anadolu Agency via Getty Images)

"All of those reporting requirements are expected to take effect quickly and will have a larger impact on the biggest AI firms and cloud computing providers than on the smaller AI companies, who won’t meet the reporting thresholds," he said. "However, in the longer term, as many of the other regulations proposed by the EO come into force and agencies start acting… the biggest AI companies may be better placed to absorb the cost of complying, and the smaller companies may struggle."

He noted that some of the rules, such as those applying to cloud computing providers, could have implications for the confidentiality and security of those companies’ clients depending on how the rulemaking process proceeds.

"The reporting requirements for cloud computing providers (e.g. AWS, Microsoft Azure), in particular, are underdeveloped and under-explained," Gruenspecht said. "If the cloud computing providers are now going to have to police users who request large amounts of computing power to see if they’re engaged in building a model with malicious cyber-enabled capabilities, then depending on how the regulations shake out, those companies may have to look at the activities undertaken by their larger customers directly."

Biden's executive order could have privacy implications for businesses and their clients, depending on how rulemaking proceeds. (iStock)

Alon Yamin, co-founder and CEO of Copyleaks, which uses AI for content authentication, told FOX Business that the executive order was fairly comprehensive overall but added, "I think really focusing a bit more on the different types of solutions for different content types and understanding and distinguishing between them is one point that I thought was missing a little bit."

"You can’t have the same strategy for detecting AI in video, to detecting AI in music, to detecting AI in photos, to detecting AI in text – each one of these content types is a different character and you can’t have one solution for all," Yamin said. He added that while watermarking was discussed in the executive order "it’s not a bulletproof strategy."

The Biden administration's executive order aims to root out bias in AI systems. (Photo by Jaap Arriens/NurPhoto via Getty Images)

OJ Laos, a senior manager for accounting and consulting firm Armanino, told FOX Business that intellectual property (IP) and copyright rules could have an impact on how AI tools are designed and used – potentially hamstringing startups.

"You can look towards the pieces on the IP and copyright section of the executive order, and really they don’t broadly define exactly how that is going to be ultimately adjudicated, but it’s indicating there’s going to be some chilling effect," Laos said. "There’s going to be some more controls that go into place for what’s created, how these tools work, what that looks like, and that will limit some business use cases or probably startups."

He went on to explain that Biden’s AI executive order is best viewed as a signal, given that federal agencies will be primarily responsible for crafting the rules. 

"This was a signal at best at this point. We’re all going to be figuring out and there’s going to be a lot of change in the next few months to six months of getting more details about exactly what they mean because they didn’t go into a ton of detail on how that will look in practice," Laos said. "I think it’s going to be a little trickier."