AI chatbots could be impacted by Supreme Court's ruling in Section 230 case

Supreme Court's ruling should come before end of current term

The Supreme Court is expected to release a decision in the coming months in a case concerning the liability shield for internet and social media companies, a ruling that could affect artificial intelligence (AI) chatbots like ChatGPT.

At issue in the case is whether tech platforms like YouTube, which is owned by Google, lose the federal liability shield that applies to user-generated, third-party content when they use algorithms to recommend that content to users or send them notifications about it.

Generative AI chatbots rely on similar algorithms to point users to content in response to queries or to pull information from multiple sources and summarize it for users. Depending on how the Supreme Court rules, that similarity could expose the companies behind those chatbots to liability.

Cameron Kerry, a visiting fellow at the Brookings Institution think tank and an expert in AI, told Reuters, "The debate is really about whether the organization of information available online through recommendation engines is so significant to shaping the content so as to become liable. You have the same kinds of issues with respect to a chatbot."

The U.S. Supreme Court is expected to issue a decision in the next few months in a case related to Section 230 liability protections for tech firms that could impact AI chatbots. (AP Photo / J. Scott Applewhite / File / AP Newsroom)

While the case at issue before the Supreme Court doesn’t directly relate to artificial intelligence, a ruling could revise or upend Section 230 of the Communications Decency Act – a landmark law from 1996 that helped shape the modern internet.

In practice, Section 230 generally shields tech companies like Facebook, Google and Twitter from liability for content that users post on their platforms, although the companies are required to remove material prohibited by federal law, such as content that infringes copyrights or violates sex trafficking laws.

Justice Neil Gorsuch noted during oral arguments that AI tools can already generate new content, including "poetry" and "polemics," that goes "beyond picking, choosing, analyzing, or digesting content," and that such content wouldn't be protected under Section 230.

An iPhone screen displays the Google Bard generative AI chatbot. (Smith Collection / Gado / Getty Images / File)

Gorsuch also discussed a standard the Ninth Circuit developed to evaluate whether content is protected under Section 230, known as the "neutral tools" test, which asks whether a tool such as a search engine filters content based only on user-generated criteria.

In posing a question to one of the attorneys, he said the court "could say the Ninth Circuit’s Neutral Tools test was mistaken because, in some circumstances, even neutral tools, like algorithms, can generate through artificial intelligence forms of content and that the Ninth Circuit wasn’t sensitive to that possibility."

AI chatbots often paraphrase existing information in their responses to users, an action that would be protected under Section 230. But in some cases, chatbots have given users responses that appear to have been fabricated by the bot itself, which would likely fall outside Section 230's protections because it amounts to original content. Depending on the nature of that content, it could expose the chatbot's creator to legal liability.

Hany Farid, a professor at the University of California, Berkeley, told Reuters that AI developers should be liable for the shortcomings of models they "programmed, trained and deployed." Farid added, "When companies are held responsible in civil litigation for harms from the products they produce, they produce safer products. And when they’re not held liable, they produce less safe products."

A sign with logos for Google and YouTube is shown outside the Googleplex corporate headquarters in Mountain View, California, April 14, 2018. (Smith Collection / Gado / Getty Images)

What’s the Supreme Court case about?

The court heard arguments in February in a case known as Gonzalez v. Google, which involves a lawsuit filed against Google by the family of Nohemi Gonzalez, who was killed at age 23 in the November 2015 ISIS terror attacks in Paris.

The suit alleges that YouTube, a Google subsidiary, helped ISIS recruit and incite violence because the platform’s recommendation algorithm surfaced extremist videos based on viewers’ interests and sent users notifications about those videos. Gonzalez’s family argues that Section 230 of the Communications Decency Act does not shield Google from liability for promoting that content.

The Supreme Court is expected to hand down its decision in the case before the end of its current term, which will likely wrap up in late June or early July.

Reuters contributed to this report.