Facebook data scandal raises questions about too much privacy

The firestorm over Facebook Inc.'s handling of personal data raises a question for those pondering a regulatory response: Is there such a thing as too much privacy?

Recent scrutiny of data-analytics firm Cambridge Analytica has shown how questionable actors can abuse the power of networks that play an increasingly large role in society. Facebook says Cambridge Analytica violated its policies, a charge the firm denies. Cambridge Analytica, which counts Donald Trump's presidential campaign among its clients, crunched the data of 50 million Facebook profiles, claiming it could predict individual personality traits and make ads more effective.

Legislators, the Federal Trade Commission and other agencies are now considering rules to protect the privacy of users of social networks like Facebook. Those efforts remain in the early stages, but even tech companies privately say they expect some regulation down the road.

Yet some law-enforcement agencies, including the Federal Bureau of Investigation, and national-security advocates point to a tradeoff, noting that too much privacy can be as bad as too little. Bad actors take advantage of both extremes, abusing access to individuals on networks that are too open or freely conspiring on systems that are too closed.

Law-enforcement agencies rely on access to user data as an important tool for tracking criminals or preventing terrorist attacks. As such, they have long argued additional regulation may be harmful to national security.

Telegram is an example of a service that promises users near-total privacy. Offering end-to-end encryption for its secret chats, domiciled in a country out of reach of most subpoenas -- and very easy to use -- the app is among the top choices of people worried about snooping governments and malicious third parties. That reputation has been a double-edged sword.

Clinton Watts, a senior fellow at George Washington University's Center for Cyber and Homeland Security, said such apps are a big concern for law enforcement. "This is perfect for terrorist groups that want to network, propagate their message and recruit new members," he said.

Telegram is popular in countries like Iran, where it was instrumental in helping the population organize the wave of antigovernment protests that swept across the country in early January. But it also has become known as the app of choice for Islamic State and other extremist groups, after U.S.-based tech companies like Twitter Inc. began cooperating with government agencies, removing accounts and content that promoted violence.

Governments have little recourse. Iran blocked Telegram during the antigovernment protests earlier this year, and Russia is threatening to block it unless it turns over user data.

Mr. Watts, who previously worked as an FBI special agent on a counterterrorism task force, said law-enforcement agencies need to invest a lot more in human intelligence and undercover investigators to penetrate secure online spaces.

Some U.S. firms are already adapting to the prospect of new regulation and offering even greater security than Telegram. Signal, based in San Francisco, is emerging as one of the more successful examples. It says it deletes all user information once it is no longer needed for communication, leaving it with almost nothing to hand over in response to demands for users' personal data.

That would make Signal more secure than, for example, WhatsApp, the popular encrypted messaging service that Facebook bought in 2014, which stores metadata such as whom users are communicating with and when.

"When we receive a subpoena for user data," Signal founder Moxie Marlinspike posted on the company's website, we "have nothing to send back but a blank sheet of paper."

Observers warn the #deletefacebook movement will drive more users to these secure systems.

Telegram's founder, the Russian entrepreneur Pavel Durov, said the service recorded 200 million active users in March, a 70% increase from a year earlier. "We don't do deals with marketers, data miners or government agencies," he wrote in a post on Wednesday. "For us Telegram is an Idea: it is the idea that everyone on this planet has a right to be free."

Mr. Durov has relocated the company several times since leaving Russia, where it faces a court order to turn over encryption keys to the intelligence services. It is now based in the United Arab Emirates.

Telegram's terms are simple: no calls to violence, no pornography and no copyright infringement on public channels. The app can't take action on private channels because their content is encrypted and largely inaccessible even to the company. The Telegram press team didn't respond to repeated requests for comment, but the company says it closes hundreds of public channels promoting violence or extremist content every day.

Opportunities for terrorists to exploit secure networks to boost recruitment and spread propaganda were evident in the aftermath of Friday's attack in France, when 25-year-old Radouane Lakdim shot at police and took hostages at a small-town supermarket.

Islamic State supporters immediately rallied on Telegram channels, using the incident to call on others to take action and launching a public campaign on Twitter, according to SITE Intelligence Group, which monitors extremist activity online.

Now that U.S. firms are cooperating to an extent with government authorities, apps like Telegram fill a gap in the market, providing a platform where terrorists can radicalize followers and spur members to action, said Jesse Morton, a former al Qaeda recruiter who works as a coordinator at the Institute for Strategic Dialogue's Against Violent Extremism network.

"People that are more committed and pose a greater risk are still able to view generalized propaganda," Mr. Morton said. "It's a grooming process."