Outrage is growing over Facebook’s (FB) study of user emotions on the social network.
The study, published in the Proceedings of the National Academy of Sciences, sought to determine how happy or sad status updates affected other Facebook “friends.” Researchers from Facebook, the University of California, San Francisco, and Cornell University manipulated the News Feeds of nearly 700,000 users to control the amount of positive or negative emotional content that appeared in the feed.
According to the researchers, many believe that happy posts on Facebook may be depressing to online friends – an effect they call “alone together.” But Facebook’s study found that when positive posts were reduced, friends posted fewer happy updates of their own and more negative ones. And when negative posts were limited, others were more likely to express positive emotions on Facebook.
While the study was in line with Facebook’s data use policy, privacy experts and even the study’s editor have criticized Facebook’s actions. Susan Fiske, the Princeton University professor of psychology who edited the study for PNAS, told The Atlantic that she was “a little creeped out” by the research experiment.
And many turned to Twitter to express their anger at the social network, including privacy expert Lauren Weinstein. Weinstein tweeted a cartoon featuring a stick figure holding a gun to its head with the text, “Facebook secretly experiments on users to try [sic] make them sad. What could go wrong?”
When asked about the study, Facebook directed FOXBusiness.com to a public post on the social network written by Adam Kramer, one of the study’s co-authors and a member of the Facebook Core Data Science Team. In the post, Kramer acknowledged the outrage over the study – but also justified the thinking behind Facebook’s manipulative actions.
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook,” wrote Kramer.
However, Kramer also seemed to downplay the potential negative effects of the study, pointing out that only 0.04% of Facebook users had their News Feeds manipulated.
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety,” wrote Kramer.
He added that Facebook would be updating internal review practices based on the backlash to this study, and that the social network’s practices had already changed dramatically since the study was conducted two years ago.