Facebook exec says algorithms protect users from ‘more hate speech’

Removing the algorithms would result in 'more, not less, hate speech,' the VP said

Facebook’s controversial algorithms protect its users from being exposed to extreme content, hate speech and misinformation, the beleaguered company’s vice president for policy and global affairs claimed in interviews on Sunday.

Nick Clegg defended Facebook against allegations from whistleblower Frances Haugen that its algorithms push clickbait and extreme content — but insisted the company would never be able to entirely eliminate misinformation and hate speech from its platforms.

"If you remove the algorithms… the first thing that would happen is that people would see more, not less, hate speech — more, not less misinformation," Clegg told Dana Bash on CNN’s State of the Union. "These algorithms are designed precisely to work almost like giant spam filters to identify and deprecate bad content."


"For every ten thousand bit of content, you’d only see five bits of heights of hate speech," he said. "I wish we could eliminate it to zero.. We have a third of the world’s population on our platforms. Of course, we see the good, the bad, and the ugly of human nature on our platforms."

Clegg insisted to Bash that Facebook’s algorithms did not play any special role in the lead-up to the Capitol riot on Jan. 6. On NBC’s "Meet the Press," he told Chuck Todd that Haugen’s claim that Facebook lifted measures intended to tone down user feeds after the 2020 presidential election was "simply not true."


"We in fact kept the vast majority of them right through to the inauguration, and we kept some in place permanently," Clegg told Todd, adding that some of the changes were "one-off."


He said the company rolled back "blunt tools" — such as reducing the circulation of videos, civic engagement opportunities, and political ads — that had been inadvertently "scooping up a lot of entirely innocent legitimate legal playful enjoyable content."

"We did that very exceptionally," Clegg said. "We just simply let perfectly normal content just circulate less on our platform. That’s something we did because of the exceptional circumstances."

Clegg told Todd the onus is on Congress to "create a digital regulator" and set rules for data privacy and content moderation.


"I don’t think anyone wants a private company to adjudicate on these really difficult trade-offs between, you know, free expression on the one hand, and moderating or removing content on the other," he said. "There is fundamental political disagreement. The right thinks we… censor too much content, the left thinks we don’t take down enough."

Clegg told ABC’s George Stephanopoulos it was "extremely misleading" to analogize Facebook’s reported knowledge of the harm its products cause children and society to tobacco companies’ awareness of the danger of cigarettes.

"In the ’80s and ’90s there were analogies that watching too much television was like alcoholism, or arcade games like Pac-Man was like, you know, drug abuse," he said. "We’ can’t change human nature. You’ll always see bad things online."