Apple plans to scan iPhone images for child porn, drawing praise and privacy concerns

Because the system flags only known images of child abuse, parents' photos of their child in the bath, etc., shouldn't trigger a warning

Apple revealed plans to start scanning iPhone photos in the U.S. for images of child pornography, a move that received praise from child protection groups but raised red flags over privacy and the potential for misuse. 

Through the tool "neuralMatch," photos will be digitally scanned for matches against known images of child abuse before they are uploaded to iCloud.
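
The mechanism described here compares a compact fingerprint of each photo against a database of fingerprints of known abuse images, rather than analyzing what a photo depicts. Below is a minimal sketch of that matching step, under stated assumptions: the real system reportedly uses a perceptual hash that tolerates resizing and re-encoding, while this example substitutes an ordinary cryptographic hash to stay self-contained, and every name in it is hypothetical.

```python
# Illustrative sketch of fingerprint matching against a database of known
# images. This is NOT Apple's implementation: the real system reportedly
# uses a perceptual hash ("neuralMatch") that survives resizing and
# re-encoding, whereas SHA-256 below matches only byte-identical files.
# All names and data here are hypothetical.

import hashlib

# Hypothetical database of fingerprints of known abuse images, shipped to
# the device as opaque hashes rather than as the images themselves.
KNOWN_FINGERPRINTS: set[str] = {
    "placeholder-fingerprint-1",  # real databases hold many thousands
}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: hex digest of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    """Check one photo against the known-image database before upload."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    # A photo not in the database produces no match.
    print(should_flag(b"example image bytes"))  # -> False
```

Because only fingerprints are compared, the device never needs to hold the abusive images themselves, and a novel photo that isn't already in the database produces no match at all.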

Flagged photos will be reviewed by a person; if the match is confirmed, the account will be locked and law enforcement notified.

"Apple’s expanded protection for children is a game-changer," John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children."

The company also plans to scan users' encrypted messages for sexually explicit content as a child safety measure, a plan that also alarmed privacy advocates.

Matthew Green, a top cryptography researcher at Johns Hopkins University, said the technology could be used by bad actors to send innocent people seemingly normal photos designed to trigger the system, or by corrupt governments for surveillance.

"What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’" Green said. "Does Apple say no? I hope they say no, but their technology won’t say no."

Tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan for child pornography in user files stored in its iCloud service, which is not as securely encrypted as its on-device data.

The Electronic Frontier Foundation, an online civil liberties advocacy organization, said the announcement was a "shocking about-face for users who have relied on the company’s leadership in privacy and security."

Apple was one of the first major companies to embrace "end-to-end" encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

The Center for Democracy and Technology said the system will destroy the company's promise of full encryption and warned it could mistake something like art or a meme for child porn.

Hany Farid, who invented PhotoDNA, which law enforcement uses to detect child porn online, said the benefits of the system would outweigh the risks. 

He argued that plenty of other programs designed to secure devices from various threats haven't seen "this type of mission creep."

Apple also announced that its messaging app will identify and blur sexually explicit photos on children’s phones and can warn parents through a text message. It said that its software would "intervene" when users try to search for topics related to child sexual abuse.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

The Associated Press contributed to this report.