Facebook inadvertently creates videos for Islamic extremists: Whistleblower

Facebook relies heavily on automated systems to remove violent images and other inappropriate content from its platforms, but in some cases, the social media company's own software is inadvertently creating the posts, a whistleblower claims.

Both the Islamic State and al-Qaida have leveraged content that Facebook algorithms culled from their pages and repackaged, according to a confidential complaint filed with the Securities and Exchange Commission and obtained by The Associated Press. It's the same technology behind the birthday videos that Facebook compiles for individual users from content they shared during the previous year.

The allegations are likely to prompt additional scrutiny from federal regulators, who are reviewing whether the social media giant engaged in anti-competitive practices, and from Congress, where lawmakers have accused it of inadequately policing content ranging from Russian propaganda to violent videos such as those from the white supremacist attack in Christchurch, New Zealand, in March 2019.

Earlier this week, Facebook moved ahead with a long-promised independent oversight board that CEO Mark Zuckerberg says the community "can appeal to on some of the hardest questions about what content is allowed on our services."

The National Whistleblower Center, a Washington, D.C.-based nonprofit, plans to submit new details to regulators this week about the SEC complaint.

The filing obtained by The Associated Press identifies almost 200 auto-generated pages — some for businesses, some for schools and other organizations — that directly reference the Islamic State and dozens more representing al-Qaida and other known terrorist groups. One page categorized as “political ideology” is titled “I love Islamic state.” It features a group logo inside the outlines of Facebook’s famous thumbs-up icon.

The new details were disclosed just as members of the Senate Commerce Committee held a hearing with social media companies, including Facebook, on efforts to stem extremist messaging. After the New Zealand killings, which were live-streamed on Facebook by a man police identified as the shooter, the company temporarily blocked users who break its rules from broadcasting live video. Later, before opening fire at a famed garlic festival in his California hometown, a 19-year-old gunman urged his followers on Instagram, a platform owned by Facebook, to read a 19th-century book popular with white supremacists.


U.S. Rep. Max Rose, a New York Democrat, lambasted such lapses in a speech on the House floor in June.

“Instead of preventing terrorist content from spreading on their platform, as reported by the Associated Press, recently Facebook has been making videos and promoting terrorist content on its own system,” Rose said. “For instance, an al-Qaida-linked terrorist group has an auto-generated Facebook page that has nearly 4,500 likes. This case was profiled in the AP story and serves as yet another glaring example of Facebook’s inability to police itself. But what is even more striking is, before coming to speak on the House floor today, I checked and this profile is still up there.”

The National Whistleblower Center didn't immediately respond to requests for comment from FOX Business.

"Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organizations to stay ahead of bad actors," a Facebook spokesperson told FOX Business. "Auto-generated pages are not like normal Facebook pages as people can’t comment or post on them and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort."