Facebook fires back after whistleblower claims

Although Facebook does not agree with Frances Haugen's claims, the social media giant agreed it's time for Congress to create standard rules for the internet

Following former Facebook product manager-turned-whistleblower Frances Haugen's shocking testimony on Tuesday to the Senate Commerce, Science, and Transportation Committee's Subcommittee on Consumer Protection, Product Safety, and Data Security, the social media giant is firing back.


Facebook Director of Policy Communications Lena Pietsch blasted Haugen, noting she "worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question." 

Although Pietsch said Facebook does not agree with the characterizations made by Haugen, the company did agree that it's time to create standard rules for the internet. 

"It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act," Pietsch concluded. 


During her testimony, Haugen argued that Facebook's products "harm children, stoke division, and weaken our democracy" and that the company's leadership "won’t make the necessary changes because they have put their astronomical profits before people."

She added that the documents she leaked to the Wall Street Journal show how Facebook "repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages." 

"I came forward because I believe that every human being deserves the dignity of the truth," she said. "As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good."

Former Facebook data scientist Frances Haugen speaks during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, on Capitol Hill, Tuesday, Oct. 5, 2021, in Washington. (AP Photo/Alex Brandon)

While at Facebook, Haugen said she observed a pattern in which the company's inability to retain employees resulted in "an implicit discouragement from having better detection systems."

"Facebook is stuck in a cycle where it struggles to hire. That causes it to understaff projects, which causes scandals, which then makes it harder to hire," she said. "I worked on the counterespionage team, and at any given time, our team could only handle a third of the cases we knew about. We knew that if we built even a basic detector, we would likely have even more cases."

She also expressed concerns that the way Facebook is currently operating poses a risk to national security.  

"My team directly worked on tracking Chinese participation on the platform, surveilling, say, Uyghur populations, in places around the world. You could actually find the Chinese based on them doing these kinds of things," Haugen explained. "We also saw active participation of, say, the Iran government doing espionage on other state actors. So this is definitely a thing that is happening. And I believe Facebook's consistent understaffing of the counter espionage information operations and counterterrorism teams is a national security issue."

While Haugen said she does not believe Facebook intends to intentionally promote "divisive, extreme, polarizing content," she argued that the company is "aware of the side effects of the choices they have made around amplification" and knows that algorithm-based rankings keep users on its site longer and, as a result, earn it more money.

She also emphasized that even though there is "no unilateral responsibility" at Facebook and that company decisions are heavily based on metrics, the buck stops with Facebook CEO Mark Zuckerberg and "there's no one currently holding him accountable but himself."


Going forward, Haugen called on lawmakers to reform Section 230 of the Communications Decency Act, which protects tech companies from liability related to what their users post, and exempt Facebook's algorithms and engagement-based ranking from its protections. 

"Companies have 100% control over their algorithms, and Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety," she said. 

She also recommended that Congress seek out Facebook's research related to the "addictiveness" of its product and what the company knows about parents' lack of knowledge of its platforms. In addition, she called on more whistleblowers to step forward to ensure the public has the information needed to make technologies like Facebook human-centric rather than computer-centric. 

While some lawmakers have called for breaking up Big Tech, Haugen argued such a move won't fix Facebook's fundamental problems.

"If you split Facebook and Instagram apart it's likely that most advertising dollars will go to Instagram and Facebook or continue to be this Frankenstein that is endangering lives around the world, only now there won't be money to fund it," Haugen said. "So I think oversight and regulatory oversight and finding solutions with Congress is going to be key, because these systems are going to continue to exist and be dangerous even if broken up."

For now, lawmakers have called on Facebook CEO Mark Zuckerberg to testify and have suggested more hearings related to Haugen's testimony could be scheduled in the future.