Facebook: Fake News Alerts on Articles Not Super Effective

Facebook is switching up its approach to fighting fake news.

Since last December, the company has been warning users about questionable news articles by placing a red icon next to them. But it seems the alerts have actually had the opposite effect. "Putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs," Facebook said in a Wednesday blog post, citing academic research.

The company's own research, conducted over the last year, reached similar conclusions.

"We learned that dispelling misinformation is challenging," Facebook product designer Jeff Smith wrote in a separate blog post. "Just because something is marked as 'false' or 'disputed' doesn't necessarily mean we will be able to change someone's opinion about its accuracy."

As a result, Facebook is dropping the red icon approach and instead focusing on another tactic: placing the real news under the fake.

Earlier this year, the company began testing a revamped version of its Related Articles tool, which displays similar articles after you click on something in your News Feed. Going forward, the tool will show you fact-checked news content related to topics that appear in your News Feed.

The company has found this helps users understand the context around a piece of news, including hoaxes, without triggering a negative reaction.

"Indeed, we've found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown," the company said.

Overall, Facebook said its efforts to fight fake news are working. When Facebook's third-party fact-checkers identify a piece of false news, the company demotes it, causing the offending content to lose 80 percent of its traffic. The company has also been punishing publishers of fake news by removing their ability to advertise on Facebook and limiting their ability to distribute content.

However, the social media company acknowledged it needs to react faster to the problem: it commonly takes over three days for one of its fact-checking partners to review a news story, Facebook said.

Without going into details, the company added, "we are starting a new initiative to better understand how people decide whether information is accurate or not based on the news sources they depend upon."

This article originally appeared on PCMag.com.