Facebook Drowns Out Fake News With More Information -- Update

Facebook Inc. is fighting misinformation with more information.

Starting Thursday, when Facebook's U.S. users come across popular links -- including made-up news articles -- in their feeds, they may also see a cluster of other articles on the same topic. The "related articles" feature, which will roll out widely in the U.S. after months of testing, is part of the Facebook news feed team's effort to limit the damage of false news without taking down those posts.

In recent months, Facebook has launched features such as "related articles" that push users to think twice before sharing a story but don't prevent them from sharing it, so false news can still spread. Facebook has also partnered with five outside fact checkers, such as Snopes.com, which it recently started paying to review articles from a Facebook-built database of possibly false news stories and label the completely false ones as "disputed."

The moves show Facebook's strategy to reduce the presence of misinformation on its platform, without going so far as censoring it, a role it says it doesn't want. While Facebook has content policies that ban hate speech and other forms of expression, the social-media company is queasy about creating similar policies around accuracy.

Last year, Facebook came under fire for failing to prevent the spread of fabricated news articles during the 2016 U.S. presidential race, despite being a dominant platform for news consumption.

After initially resisting criticism, Chief Executive Mark Zuckerberg eventually acknowledged Facebook's responsibility to curb misinformation, but said he was wary of Facebook becoming what he called the "arbiters of truth."

Facebook's approach to fighting misinformation mirrors that of Alphabet Inc.'s Google, which is also working with fact-checkers and recently retooled its search engine to prevent sites peddling fake news, hoaxes and conspiracy theories from appearing in its top results.

In a lengthy corporate manifesto posted in February, Mr. Zuckerberg said Facebook "would focus less on banning misinformation, and more on surfacing additional perspectives and information, including that fact checkers dispute an item's accuracy."

In coming months, Facebook says it plans to rely more heavily on fact checkers. If two or more label a story as "disputed," the article will automatically show up lower in users' news feeds.
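
The article doesn't describe Facebook's ranking code, but the rule above can be pictured in a few lines: a story keeps its normal feed score until at least two fact checkers have marked it "disputed," at which point its score is cut. The two-checker threshold is taken from the article; the data structure, field names and penalty factor are illustrative assumptions.

```python
# Illustrative sketch only -- not Facebook's actual implementation.
# Rule from the article: if two or more fact checkers label a story
# "disputed", it shows up lower in users' news feeds.

from dataclasses import dataclass, field

DISPUTED_THRESHOLD = 2   # cited in the article: "two or more"
DEMOTION_FACTOR = 0.5    # assumed penalty; the real weighting is not public

@dataclass
class Story:
    url: str
    base_score: float                               # normal engagement-based rank
    disputed_by: set = field(default_factory=set)   # fact checkers disputing it

def feed_score(story: Story) -> float:
    """Return the ranking score, demoted once enough fact checkers dispute it."""
    if len(story.disputed_by) >= DISPUTED_THRESHOLD:
        return story.base_score * DEMOTION_FACTOR
    return story.base_score

story = Story(url="example.com/article", base_score=10.0)
story.disputed_by.add("Snopes.com")
print(feed_score(story))                 # one label: score unchanged (10.0)
story.disputed_by.add("FactCheck.org")
print(feed_score(story))                 # second label: demoted (5.0)
```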

In many cases, Facebook staples a "disputed" tag to those posts to warn users that fact checkers found an article's claims completely false. The company is also experimenting with other approaches, such as showing "related articles" written by its fact checkers that debunk a false story, instead of attaching a disputed tag.

The "related articles" feature shows up on some stories that have been flagged as false by fact checkers working with Facebook, but also on some legitimate stories that are going viral. Facebook hopes the feature will make it easier for people to break out of their filter bubbles and see other views.

If the stories are going viral, Facebook software selects other relevant articles to show underneath those posts. For some articles deemed false, Facebook will link to fact checkers' explanations of why the information presented is wrong.

Facebook has started paying those fact checkers, a spokeswoman said, declining to specify amounts. Facebook paid one partner, the nonprofit FactCheck.Org, $52,283.34 for fact-checking work in the first six months of 2017, FactCheck.Org disclosed in a financial report. Facebook's second-quarter net income was nearly $4 billion.

Fact checkers will start seeing more articles in their queues. Facebook has also started using fact checkers' rulings to improve its algorithms for predicting whether a story is potentially false, the spokeswoman added; articles flagged by those algorithms are then sent to fact checkers, who determine their accuracy.
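
The article gives no detail on those algorithms; one way to picture the loop is a text classifier trained on fact checkers' past rulings, whose confident "likely false" predictions are queued back to the fact checkers for human review. Everything below -- the model, features, threshold and sample data -- is an assumption for illustration, not Facebook's pipeline.

```python
# Illustrative sketch only -- not Facebook's actual pipeline.
# Fact checkers' verdicts serve as training labels; the model's confident
# "likely false" predictions are routed back to fact checkers' queues.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Assumed training data: headlines already ruled on (1 = false, 0 = accurate).
headlines = [
    "Celebrity secretly endorses miracle cure",
    "City council approves new transit budget",
    "Scientists admit the moon landing was staged",
    "Local school wins regional robotics competition",
]
labels = [1, 0, 1, 0]

vectorizer = TfidfVectorizer()
model = LogisticRegression()
model.fit(vectorizer.fit_transform(headlines), labels)

def review_queue(candidates, threshold=0.5):
    """Return candidate headlines the model scores as likely false."""
    probs = model.predict_proba(vectorizer.transform(candidates))[:, 1]
    return [h for h, p in zip(candidates, probs) if p >= threshold]

# Flagged candidates (if any) would be sent to human fact checkers to verify.
print(review_queue(["Miracle cure endorsed by celebrity doctors",
                    "Transit budget vote delayed to next month"]))
```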

Facebook has also been adjusting its news feed algorithms to help demote fake stories, as it did in June, when it began demoting links shared by accounts that routinely post 50 links a day, because such accounts tend to share "low quality content" like misinformation.
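
As with the examples above, this is only an illustration: the June change can be read as a posting-frequency check, where links shared by accounts exceeding a daily threshold are demoted. The 50-links figure comes from the article; the penalty value is an assumption.

```python
# Illustrative sketch only -- not Facebook's actual implementation.
# Rule from the article: demote links from accounts that routinely post
# around 50 links a day, since they tend to share "low quality content".

LINKS_PER_DAY_THRESHOLD = 50    # figure cited in the article
FREQUENT_POSTER_PENALTY = 0.25  # assumed penalty; real value is not public

def adjusted_link_score(base_score: float, links_posted_today: int) -> float:
    """Demote a link's feed score if its poster exceeds the daily threshold."""
    if links_posted_today >= LINKS_PER_DAY_THRESHOLD:
        return base_score * FREQUENT_POSTER_PENALTY
    return base_score

print(adjusted_link_score(8.0, 12))   # typical account: 8.0
print(adjusted_link_score(8.0, 60))   # high-frequency poster: 2.0
```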

Write to Deepa Seetharaman at Deepa.Seetharaman@wsj.com

(END) Dow Jones Newswires

August 03, 2017 15:36 ET (19:36 GMT)