A former Facebook data scientist who became a whistleblower on her way out of the company last September is calling for more transparency and oversight of social media giants while arguing in favor of keeping the controversial Section 230 clause in place.
"It’s certainly true that social media is a natural monopoly because people look for social media networks that all their friends are on," Sophie Zhang told FOX Business Friday. "And that means that once there’s an established company, it’s really hard to introduce a competitor."
But unlike other "natural monopolies," like utilities companies, social media giants are not subject to tight government regulations, she said.
Still, just breaking them up is only part of the solution, she added.
As FOX Business previously reported, Zhang turned down a $64,000 severance package from Facebook in order to retain her ability to speak publicly about the company.
While humbly admitting she’s not a policy expert, Zhang says she has a few ideas for cleaning up coordinated inauthentic behavior, or CIB – activity that uses networks of fake identities to share, like or comment on posts, artificially boosting their engagement metrics and broadening their reach.
Zhang made major headlines in September 2020 when BuzzFeed News reported on a leaked memo she wrote for Facebook employees that alleged leaders in countries including Brazil, India, Honduras and Azerbaijan used networks of fake accounts to promote their agendas or drown out critics as the company looked away. In some cases, the governments didn’t even try to hide the activity.
"On the Internet, a single person can pretend to be a crowd," she said. "There’s no way to do that in real life – I don’t know how anyone can go out on the street and suddenly pretend to be 100 people."
And when people try to force a crowd to support something, that can have unexpected results for the organizers, she said, giving the example of Romania’s former Communist dictator Nicolae Ceaușescu.
"He gave a speech to 100,000 people in Bucharest who were bused in and given signs to support him," Zhang said. "And the crowd turned on him during the speech, and suddenly, what had [begun as] a display of strength for the Romanian Communist government became a national revolution."
Four days later, on Christmas in 1989, Ceaușescu and his wife were tried and executed, according to the Associated Press.
But on the Internet, rather than busing in people to act as supporters, fake accounts can simply be created from scratch in large numbers at the click of a button, each with a fake name and a stock photo – and that’s what Zhang investigated at Facebook.
Facebook has publicly touted its crackdown on this kind of activity, announcing major breakups of bot or troll networks, mass bans and group shutdowns.
Its most recent report on coordinated inauthentic behavior, published on July 8, noted the breakup of an operation in Mexico similar to those Zhang had found in India and Brazil – networks using fake interactions to promote politicians during election season.
Facebook said it removed more than 3,000 accounts, pages and groups connected to Worgcorp, a political strategist and public relations firm.
"They also created pages designed to look like user profiles — using false names and stock images — to comment in Spanish and amplify content about various candidates in the June 2021 election in Campeche," the report reads. "They primarily focused on the gubernatorial election in the state of Campeche, including promoting two opposing candidates for governor."
The size and reach of social media giants make them desirable targets for such operations, but Zhang said there are pros and cons to breaking up the monopolies.
"Because of the fact that Facebook owns Instagram, Instagram was able to benefit from my expertise while I was working at Facebook," she said. "When I found the [Azerbaijan] government’s troll farm that was harassing their opposition on Facebook, we [were] also able to immediately, without any delays, take down the operation on Instagram as well."
But a Honduran operation used both Facebook and Twitter, and Zhang said Twitter addressed the issue months after Facebook did.
"I’m certainly not suggesting that Facebook should own Twitter," she said. "But I’m [using] this as an example of how breaking up the companies without doing anything else can have unforeseen consequences."
She also offered three suggestions for how to combat CIB.
One is to enforce a separation between teams responsible for oversight and teams responsible for maintaining good relationships with foreign governments. If the same team handles both, it can be easy to let bad behavior slide.
Another suggestion is government regulations requiring social media giants to be transparent about issues like CIB.
"Right now, Facebook just gives an incomplete picture that’s in its own interests," she said.
And the third involves "penetration tests" in which outside agencies would attempt CIB attacks and then publicly grade how well the social media giants handled them.
These regulations could potentially be enforced by the U.S., European Union and other allies, she said.
"Unfortunately, the United Nations, I don’t think it’s a good way to go," she said. "The U.S. should cooperate with the European Union on this because they both have similar concerns, but currently they’re going at it very differently."
As for Section 230, the controversial liability protection in the 1996 Communications Decency Act that has been the subject of criticism from both Republicans and Democrats in recent years, she said she thinks it should stay.
"I think Section 230 certainly is an important part of the modern Internet, and repealing it would destroy the Internet as we know it," Zhang said. "I think that most people who discuss repealing Section 230 are motivated by wanting less content moderation by social media, but the actual reality is that it would increase the content moderation."
Section 230 protects Internet platforms from being held responsible for content shared to their sites by third-party users. Lawmakers on both sides of the aisle have proposed repealing or reforming it.
Simply doing away with it, however, could lead social media companies and other websites and apps that allow user-generated content to narrowly define what’s acceptable in order to avoid any potential liabilities.
But Zhang was quick to note her expertise is less on content moderation than on detecting organized attempts to use networks of fake entities to influence public opinion.