Facebook knows it encourages division, top executives nixed solutions

In some cases, Facebook worsens polarization and tribal behavior

A Facebook Inc. team had a blunt message for senior executives. The company's algorithms weren't bringing people together. They were driving people apart.

"Our algorithms exploit the human brain's attraction to divisiveness," read a slide from a 2018 presentation. "If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior?

The answer Facebook found, in some cases, was yes.

Protesters rally at the State Capitol in Lansing, Mich., Thursday, April 30, 2020. (AP Photo/Paul Sancya)

Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about "sensationalism and polarization."

But in the end, Facebook's interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.


Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were "paternalistic," said people familiar with his comments.

Another concern, they and others said, was that some proposed changes would have disproportionately affected conservative users and publishers, at a time when the company faced accusations from the right of political bias.

Facebook revealed few details about the effort and has divulged little about what became of it. In 2020, the questions the effort sought to address are even more acute, as a charged presidential election looms and Facebook has been a conduit for conspiracy theories and partisan sparring about the coronavirus pandemic.

Facebook CEO Mark Zuckerberg speaks about "News Tab" at the Paley Center, Friday, Oct. 25, 2019 in New York. (AP Photo/Mark Lennihan)

In essence, Facebook is under fire for making the world more divided. Many of its own experts appeared to agree -- and to believe Facebook could mitigate many of the problems. The company chose not to.

Mr. Kaplan in a recent interview said he and other executives had approved certain changes meant to improve civic discussion. In other cases where proposals were blocked, he said, he was trying to "instill some discipline, rigor and responsibility into the process" as he vetted the effectiveness and potential unintended consequences of changes to how the platform operated.


Internally, the vetting process earned a nickname: "Eat Your Veggies."

Americans were drifting apart on fundamental societal issues well before the creation of social media, decades of Pew Research Center surveys have shown. But 60% of Americans think the country's biggest tech companies are helping further divide the country, while only 11% believe they are uniting it, according to a Gallup-Knight survey in March.

At Facebook, "There was this soul-searching period after 2016 that seemed to me this period of really sincere, 'Oh man, what if we really did mess up the world?' " said Eli Pariser, co-director of Civic Signals, a project that aims to build healthier digital spaces, and who has spoken to Facebook officials about polarization.

FILE - In this Aug. 11, 2019, file photo, an iPhone displays the Facebook app in New Orleans. (AP Photo/Jenny Kane, File)

Mr. Pariser said that started to change after March 2018, when Facebook got in hot water after disclosing that Cambridge Analytica, the political-analytics startup, improperly obtained Facebook data about tens of millions of people. The shift has gained momentum since, he said: "The internal pendulum swung really hard to 'the media hates us no matter what we do, so let's just batten down the hatches.'"

In a sign of how far the company has moved, Mr. Zuckerberg in January said he would stand up "against those who say that new types of communities forming on social media are dividing us." People who have heard him speak privately said he argues social media bears little responsibility for polarization.


He argues the platform is in fact a guardian of free speech, even when the content is objectionable -- a position that drove Facebook's decision not to fact-check political advertising ahead of the 2020 election.

'Integrity Teams'

Facebook launched its research on divisive content and behavior at a moment when it was grappling with whether its mission to "connect the world" was good for society.

Fixing the polarization problem would be difficult, requiring Facebook to rethink some of its core products. Most notably, the project forced Facebook to consider how it prioritized "user engagement" -- a metric involving time spent, likes, shares and comments that for years had been the lodestar of its system.

A video sign about Facebook is shown on a truck at the State Capitol during a rally in Lansing, Mich., Wednesday, May 20, 2020. (AP Photo/Paul Sancya)

Championed by Chris Cox, Facebook's chief product officer at the time and a top deputy to Mr. Zuckerberg, the work was carried out over much of 2017 and 2018 by engineers and researchers assigned to a cross-jurisdictional task force dubbed "Common Ground" and employees in newly created "Integrity Teams" embedded around the company.

Even before the teams' 2017 creation, Facebook researchers had found signs of trouble. A 2016 presentation that names Facebook researcher and sociologist Monica Lee as its author found extremist content thriving in more than one-third of large German political groups on the platform. Swamped with racist, conspiracy-minded and pro-Russian content, the groups were disproportionately influenced by a subset of hyperactive users, the presentation notes. Most of the groups were private or secret.


The high number of extremist groups was concerning, the presentation says. Worse was Facebook's realization that its algorithms were responsible for their growth. The 2016 presentation states that "64% of all extremist group joins are due to our recommendation tools" and that most of the activity came from the platform's "Groups You Should Join" and "Discover" algorithms: "Our recommendation systems grow the problem."

Ms. Lee, who remains at Facebook, didn't respond to inquiries. Facebook declined to respond to questions about how it addressed the problems identified in the presentation, which other employees said weren't unique to Germany or the Groups product. In a presentation at an international security conference in February, Mr. Zuckerberg said the company tries not to recommend groups that break its rules or are polarizing.

"We've learned a lot since 2016 and are not the same company today," a Facebook spokeswoman said. "We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve."

Phone sitting on laptop with Facebook desktop site reflecting on screen. (iStock)

The Common Ground team sought to tackle the polarization problem directly, said people familiar with the team. Data scientists involved with the effort found some interest groups -- often hobby-based groups with no explicit ideological alignment -- brought people from different backgrounds together constructively. Other groups appeared to incubate impulses to fight, spread falsehoods or demonize a population of outsiders.

In keeping with Facebook's commitment to neutrality, the teams decided Facebook shouldn't police people's opinions, stop conflict on the platform, or prevent people from forming communities. The vilification of one's opponents was the problem, according to one internal document from the team.


"We're explicitly not going to build products that attempt to change people's beliefs," one 2018 document states. "We're focused on products that increase empathy, understanding, and humanization of the 'other side.' "

Hot-button issues

One proposal sought to salvage conversations in groups derailed by hot-button issues, according to the people familiar with the team and internal documents. If two members of a Facebook group devoted to parenting fought about vaccinations, the moderators could establish a temporary subgroup to host the argument or limit the frequency of posting on the topic to avoid a public flame war.

Facebook CEO Mark Zuckerberg pauses while testifying before a House Energy and Commerce hearing on Capitol Hill in Washington about the use of Facebook data to target American voters in the 2016 election and data privacy. (AP Photo/Andrew Harnik, File)

Another idea, documents show, was to tweak recommendation algorithms to suggest a wider range of Facebook groups than people would ordinarily encounter. Building these features and combating polarization might come at a cost of lower engagement, the Common Ground team warned in a mid-2018 document, describing some of its own proposals as "antigrowth" and requiring Facebook to "take a moral stance."

Taking action would require Facebook to form partnerships with academics and nonprofits to give credibility to changes affecting public conversation, the document says. This was becoming difficult as the company slogged through controversies after the 2016 presidential election. "People don't trust us," said a presentation created in the summer of 2018.

The engineers and data scientists on Facebook's Integrity Teams -- chief among them, scientists who worked on News Feed, the stream of posts and photos that greets users when they visit Facebook -- arrived at the polarization problem indirectly, according to people familiar with the teams. Asked to combat fake news, spam, clickbait and inauthentic users, the employees looked for ways to diminish the reach of such ills. One early discovery: Bad behavior came disproportionately from a small pool of hyperpartisan users.

A second finding: in the U.S., there was a larger infrastructure of accounts and publishers on the far right than on the far left. Outside observers were documenting the same phenomenon. The gap meant even seemingly apolitical actions, such as reducing the spread of clickbait headlines along the lines of "You Won't Believe What Happened Next," affected conservative speech more than liberal content in aggregate.
