Facebook to Rank News Sources by Quality to Battle Misinformation
Facebook Inc. plans to start ranking news sources in its feed based on user evaluations of credibility, a major step in its effort to fight false and sensationalist information that will also push the company further into a role it has long sought to avoid -- content referee.
The social-media giant will begin testing the effort next week by prioritizing news reports in its news feed from publications that users have rated in Facebook surveys as trustworthy, executives said Friday. The most "broadly trusted" publications -- those trusted and recognized by a large cross-section of Facebook users -- would get a boost in the news feed, while those that users rate low on trust would be penalized. The change applies only to U.S. users, though Facebook plans to roll it out internationally later.
The announcement, which confirms a report last week by The Wall Street Journal, comes after Facebook outlined another major news-feed overhaul that would diminish the presence of news in favor of what it calls "meaningful" interactions on the platform. This shift will result in news accounting for about 4% of the posts that appear in users' feeds world-wide, down from the current 5%, Facebook Chief Executive Mark Zuckerberg said in a post Friday.
The planned introduction of a trustworthiness score marks an important shift for Facebook, which Mr. Zuckerberg has long said seeks to avoid becoming the "arbiters of truth." But the company has been under pressure since the 2016 U.S. presidential campaign to stop enabling fabricated news articles and misinformation to spread across its platform. About 45% of U.S. adults get news from Facebook, according to a Pew Research Center survey conducted last summer.
Mr. Zuckerberg said the change -- which will be tested leading up to the 2018 U.S. midterm elections -- is necessary to address the role of social media in amplifying sensationalism, misinformation and polarization. "That's why it's important that News Feed promotes high quality news that helps build a sense of common ground," he wrote in his post.
Adam Mosseri, the Facebook executive who oversees its news feed, acknowledged that the company was wading into "tricky" territory by weighting publishers based on user trust.
"This is an interesting and tricky thing for us to pursue because I don't think we can decide what sources of news are trusted and what are not trusted, the same way I don't think we can't decide what is true and what is not," Mr. Mosseri said in an interview.
He added, however, that Facebook engineers themselves weren't taking a stance on credibility because the company relied on its users to provide the value judgment. He compared the approach with Facebook's reliance on third-party fact-checkers to determine whether an article is completely fabricated; Facebook uses those evaluations to determine where such posts rank in users' feeds.
"The important distinction is that we're not actually deciding what is trusted and what is not -- we're asking our community to decide," Mr. Mosseri said in the interview. "We are asking people what they trust and what they don't trust and acting on that data -- as opposed to us deciding."
In surveys, Facebook asked a small percentage of its users whether they recognized a publication and, if so, how much they trusted it. Facebook is taking the aggregate of those results to inform its news-feed rankings. Mr. Mosseri called the trust score an important weight, but still just one of many that Facebook uses to order posts in users' news feeds.
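Facebook hasn't published how it turns those survey answers into a score. Purely as illustration, here is a minimal sketch of one way such an aggregation could work, assuming only what executives described: a recognition question, a trust rating from respondents who recognize the source, and a single aggregate used as one ranking signal among many. The function names and the formula are hypothetical.

    from statistics import mean

    def broad_trust_score(responses):
        # responses: (recognized, trust) pairs from surveyed users, where
        # trust is a 0-1 rating that matters only if recognized is True.
        rated = [trust for recognized, trust in responses if recognized]
        if not rated:
            return 0.0
        # Hypothetical aggregate: mean trust among users who recognize the
        # publication, scaled by the share of respondents who recognize it,
        # so a high score requires both broad recognition and trust.
        familiarity = len(rated) / len(responses)
        return mean(rated) * familiarity

    def feed_score(signals, weights):
        # The trust score would be one weighted signal among many
        # (engagement, recency, relationships, and so on).
        return sum(weights[name] * value for name, value in signals.items())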
The use of surveys is sure to raise its own set of concerns, since the outcomes can be shaped by the kinds of questions asked, who responds, and how exactly the responses translate to rankings inside Facebook's closely guarded news-feed algorithms.
Mr. Mosseri acknowledged the shortcomings of surveys, and said Facebook plans to fine-tune its rankings using other factors such as how informative and locally relevant news sources are. "No one signal that we use is perfect," he said. "There's always examples of when [the results] aren't lining up with what we're intending."
Facebook's trust score would boost the news-feed presence of well-known and widely trusted publications even if users disagree with the content or aren't avid readers. The change won't help publishers trusted by a small group of devoted readers but disparaged by everybody else, Mr. Mosseri said.
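A small worked example, using the hypothetical broad_trust_score sketch above, illustrates that property: an outlet recognized and trusted by most respondents outscores one adored by a devoted few but distrusted or unrecognized by everyone else.

    # Recognized by 70% of respondents, trusted at 0.8 by those who know it:
    mainstream = [(True, 0.8)] * 70 + [(False, 0.0)] * 30
    # Loved (1.0) by a devoted 5%, barely trusted (0.1) by the 45% of others
    # who recognize it, and unknown to the remaining half:
    niche = [(True, 1.0)] * 5 + [(True, 0.1)] * 45 + [(False, 0.0)] * 50

    print(broad_trust_score(mainstream))  # 0.8  * 0.7 = 0.56
    print(broad_trust_score(niche))       # 0.19 * 0.5 = 0.095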
But, as with other Facebook news-feed changes, the moves could have a significant and unpredictable impact on news publishers, including The Wall Street Journal, many of which get substantial traffic from Facebook. Publishers got 24% of their online traffic from Facebook on average as of last month, down from 40% at the end of 2016, according to analytics firm Parse.ly.
Mr. Mosseri said Facebook took steps to avoid hurting small, lesser-known publishers, although those outlets still could be outranked by more prominent publications. He said publishers won't be hurt simply for not being widely recognized in user surveys.
Many publishers are likely to be concerned about allowing users to decide how news outlets are ranked. Media executives have long been wary of Facebook's growing dominance both in the ad market and as a vital distribution network for news, one with the power to massively magnify or dial down a site's traffic with a simple algorithm tweak.
At the same time, publishers have lobbied Facebook intensively to take a more active role in weeding out low-quality "clickbait," conspiracy theories and bogus stories and to prioritize news coming from established and respected media outlets.
Many media companies have been critical of Facebook's longstanding position that it isn't a media company, but simply a platform.
Over the past year, Facebook has consulted extensively with publishers on many issues, including how to prioritize more trustworthy news sources, promote local news sources and accommodate news sites that are behind paywalls.
The company started discussing the possibility of a trust score internally around last fall. Facebook consulted experts about the trustworthiness score but found there was "a massive amount of disagreement" among various media organizations over what makes a publication credible, Mr. Mosseri told the Journal.
In his post, Mr. Zuckerberg noted that he wasn't comfortable with Facebook making its own decision about what is and isn't trustworthy and that relying on outside experts "would likely not solve the objectivity problem."
--Lukas I. Alpert contributed to this article.
Write to Deepa Seetharaman at Deepa.Seetharaman@wsj.com