Facebook defends 2018 algorithm changes, blames 'partisan divisions' on history

Internal Facebook documents suggest the company's 2018 algorithm changes may have made users more hostile

Facebook is defending its 2018 decision to change its algorithm in ways that may have made the platform's users more hostile toward one another, according to a new report.

Three years ago, Facebook executives changed the platform's algorithm to promote "meaningful social interactions," or MSI, in an effort to encourage users to interact more with friends and family, but internal documents obtained by The Wall Street Journal suggest the change to post rankings may have had the opposite effect.

"The goal of the Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends," a Facebook spokesperson told FOX Business in a statement Thursday. "Is a ranking change the source of the world's divisions? No."


The spokesperson pointed to "research" showing "certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed."

"It also shows that meaningful engagement with friends and family on our platform is better for people’s well-being than the alternative," the spokesperson said. "We're continuing to make changes consistent with this goal, like new tests to reduce political content on Facebook based on research and feedback."

Facebook staffers in 2018 reportedly flagged the MSI change's impact on users, who appeared to be growing more hostile toward one another. The WSJ cited the example of a BuzzFeed article titled "21 Things That Almost All White People Are Guilty of Saying," which drew tens of thousands of comments and shares while the outlet's other content related to news, self-care and animals received fewer interactions.


"Our approach has had unhealthy side effects on important slices of public content, such as politics and news," a team of data scientists wrote in an internal memo obtained by the Journal. Another data scientist called it "an increasing liability" in a later memo.

Lars Backstrom, a Facebook vice president of engineering, said in an interview with the Journal that the algorithm posed a risk of promoting potentially harmful content.

"Like any optimization, there’s going to be some ways that it gets exploited or taken advantage of," he told the outlet. "That’s why we have an integrity team that is trying to track those down and figure out how to mitigate them as efficiently as possible."


After data scientists discussed the impact of the MSI change and possible fixes with Mark Zuckerberg, the Facebook CEO reportedly resisted some of the proposed changes, according to a 2020 memo reviewed by the Journal. In 2021, Facebook announced that it would start testing ways to "reduce the distribution of political content in News Feed."

Facebook and other social media sites, such as Twitter, have repeatedly been blamed for sowing divisions among users around the world, especially during elections and in times of political and religious conflict. Critics point in part to algorithms that show users the kind of content they enjoy interacting with so that they keep coming back to the platform.
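
For illustration, feed-ranking systems of this kind generally score each candidate post by its predicted engagement and show the highest-scoring posts first. The sketch below is a minimal, hypothetical example of such an engagement-weighted ranker; the Post class, the weights and the scoring formula are invented for illustration and are not Facebook's actual MSI formula.

    # Hypothetical illustration only: a toy engagement-weighted feed ranker.
    # The weights, field names and scoring formula are invented and do not
    # reflect Facebook's actual MSI ranking system.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_likes: float
        predicted_comments: float
        predicted_reshares: float

    # Invented weights: conversational signals (comments, reshares) count more
    # than passive ones (likes), mirroring the general idea of prioritizing
    # posts that "inspire interactions."
    WEIGHTS = {"likes": 1.0, "comments": 15.0, "reshares": 30.0}

    def engagement_score(post: Post) -> float:
        """Combine predicted interactions into a single ranking score."""
        return (WEIGHTS["likes"] * post.predicted_likes
                + WEIGHTS["comments"] * post.predicted_comments
                + WEIGHTS["reshares"] * post.predicted_reshares)

    def rank_feed(posts: list[Post]) -> list[Post]:
        """Order a candidate feed so the highest-scoring posts appear first."""
        return sorted(posts, key=engagement_score, reverse=True)

    if __name__ == "__main__":
        feed = [
            Post("Vacation photos", 120, 4, 1),
            Post("Provocative political take", 40, 60, 25),
        ]
        for post in rank_feed(feed):
            print(f"{engagement_score(post):8.1f}  {post.text}")

Because this toy example weights comments and reshares far more heavily than likes, a provocative post that draws heated discussion outranks a pleasant one that merely collects likes, which is the kind of dynamic the internal memos describe.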

The issue has gained traction in recent years as activists and lawmakers push social media sites to change their algorithms and business models in an effort to prevent further polarization.

FOX Business' Edward Lawrence contributed to this report.