In this year's election, the most powerful news source was Facebook. More than 40 percent of US adults—more than watch any single TV channel—get their news from Facebook, according to Pew, and as plenty of journalists have recounted, the platform is flooded with hyperpartisan fake news.
Yet Mark Zuckerberg and Facebook's leadership refuse to own up to their responsibility for our politics. Zuckerberg argued this week that fake news is "a very small amount of the content" on Facebook, and tried to judo the question into a partisan political argument, saying "Why would you think that there would be fake news on one side, but not the other?"
Well, of course there's fake news on both sides. Facebook's bias wasn't necessarily for or against Trump. Like a good tabloid, Facebook's bias was for the most inflammatory, extreme views. Those bubbled up to the top in Facebook's editorial curation.
A slew of profit-seeking Macedonian teens figured that out. As BuzzFeed recounts, they A/B tested various kinds of political content to find which went most viral in Facebook's news feed; nearly all of the winning stories were false.
Debunking the News Feed
Central to Facebook's proclamation that it's a platform, not a media company, is the idea that its feed is just presenting to you what your friends want you to see. That's completely untrue.
Facebook's utterly opaque algorithm shows you a curated selection of what your friends, their friends, and paid sponsors are posting. You don't get things ordered by the number of shares; you don't get them ordered chronologically; and you don't even get them ordered by who's paying. You get them ordered by a complex editorial process.
Just because that process is algorithmic doesn't mean it isn't editorial. Just because it doesn't have a direct mapping to a specific political bias doesn't mean it isn't curation.
I use an (open, public) algorithm to determine the winners of our Fastest Mobile Networks award each year. The weightings of the items in the algorithm are my editorial choice. Humans wrote and continually tweak Facebook's algorithm; they are the editors. Even if the algorithm programmed itself, it would be the editor, not your friends. Friends are not the selectors of the news, they are the raw material.
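The point about weightings being editorial choices can be made concrete with a minimal sketch. The metrics, posts, and weight values below are entirely hypothetical (they are not Facebook's or PCMag's actual numbers); the sketch only shows that changing the weights changes the ranking, which is an editorial act:

```python
# A minimal sketch of weighted scoring. The weights are editorial choices,
# not neutral facts; all values here are hypothetical.

def score(item, weights):
    """Combine an item's metrics into one ranking score."""
    return sum(weights[metric] * value for metric, value in item.items())

# Hypothetical metrics for two posts, normalized to 0-1.
posts = {
    "post_a": {"engagement": 0.9, "recency": 0.2, "friend_affinity": 0.3},
    "post_b": {"engagement": 0.4, "recency": 0.9, "friend_affinity": 0.8},
}

# Two different "editors": one weights raw engagement, one weights friends.
tabloid_weights = {"engagement": 0.8, "recency": 0.1, "friend_affinity": 0.1}
social_weights = {"engagement": 0.1, "recency": 0.3, "friend_affinity": 0.6}

rank_tabloid = sorted(posts, key=lambda p: score(posts[p], tabloid_weights), reverse=True)
rank_social = sorted(posts, key=lambda p: score(posts[p], social_weights), reverse=True)

print(rank_tabloid)  # ['post_a', 'post_b']: engagement-heavy weights win
print(rank_social)   # ['post_b', 'post_a']: affinity-heavy weights win
```

Same posts, same data, opposite front pages: whoever sets the weights is the editor.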
This doesn't just affect political news. I'm looking at the top items in my feed right now. Their ages are: four minutes, 10 hours, sponsored, two hours, three hours, 13 minutes. I have friends who posted things between four minutes and 10 hours ago. Many of their posts did not appear because Facebook is editing.
Twitter is a platform. Twitter shows you only who you follow, in pure reverse chronological order (if you so choose). Facebook is a magazine. It shows you the stories it thinks are the most thrilling, in the order of thrill.
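The platform-versus-magazine distinction is, mechanically, just a choice of sort key. A sketch with made-up posts (the data and the "engagement" scoring rule are illustrative, not any real feed's):

```python
# The same posts, ordered two ways: reverse chronological (platform)
# versus by engagement score (magazine). All data is hypothetical.

posts = [
    {"id": "calm_update", "minutes_ago": 4, "engagement": 0.2},
    {"id": "viral_outrage", "minutes_ago": 600, "engagement": 0.95},
    {"id": "baby_photo", "minutes_ago": 120, "engagement": 0.6},
]

# Platform: newest first, no judgment involved.
chronological = sorted(posts, key=lambda p: p["minutes_ago"])

# Magazine: most thrilling first, so a 10-hour-old outrage post leads.
curated = sorted(posts, key=lambda p: p["engagement"], reverse=True)

print([p["id"] for p in chronological])  # ['calm_update', 'baby_photo', 'viral_outrage']
print([p["id"] for p in curated])        # ['viral_outrage', 'baby_photo', 'calm_update']
```

The first ordering is a wire; the second is a front page, which is the whole argument in two lines of code.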
Yes, you can adjust the curation yourself by choosing "top friends," unfollowing people, or hiding posts, much as you can tear pages out of a newspaper or magazine if you don't like them. But the power of defaults is very, very strong, as this study of search engine results shows.
A Closer Eye on Facebook
Zuckerberg can't admit that Facebook is a media company because media companies are treated with a lot more scrutiny than supposedly neutral platforms are. This got driven home during the "Trending Topics" controversy, when Facebook was hammered by conservatives for supposed bias in its human editor corps. So Facebook got rid of the human editors, but not the editing. And once Facebook switched from human to algorithmic editors, those editors pushed more fake stories to the fore. (This story has a really good look at the "brains" of Facebook's virtual editor.)
The conversation around Facebook needs to change. The site isn't going back to a purely chronological feed; the algorithm is too successful, and it's a big part of what makes Facebook so sticky.
So Facebook needs to take responsibility for its position as a media company, and needs to be treated as one. If it continues to push fake news upwards in its algorithm, which it's clearly doing, those stories must be treated just as harshly as any other magazine publishing flat-out fakery. Facebook is the publisher. The linked sites are the writers. We are the readers. This is the media now.
Random Access: We're showing you the Google Daydream View and talking about Facebook's impact on the election, Snapchat Spectacles and new emojis.
Posted by PCMag on Friday, November 11, 2016