An update posted Sept. 3 on the Apple website said that the company has decided "to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
The company stressed that the features in question are "intended to help protect children from predators." The features, which Apple says were developed in collaboration with child safety experts, aimed to give parents new communication tools to monitor their children's online chats, to limit the spread of child sexual abuse material (CSAM), and to provide guidance for parents and children who encounter "unsafe situations."
The second item raised concerns among advocacy groups over possible privacy issues: Apple planned to use a tool that would scan files before they are uploaded to the iCloud backup storage system, as well as a second, later tool that would scan encrypted messages for sexually explicit content.
Critics argued that hackers or governments could co-opt the tools as a "peephole," exposing personal, sensitive information.
After backtracking on its initial release plans, Apple has not said when, or whether, it will roll out the technology.
The intense backlash to the scanning technology was particularly bruising for a company that has made personal privacy a marketing mantra. Apple contends it is more trustworthy than other major technology companies such as Google and Facebook that vacuum up information about people's interests and location to help sell digital ads.
Apple CEO Tim Cook is known to repeat the catchphrase "Privacy is a fundamental human right."
Cindy Cohn, executive director of the Electronic Frontier Foundation, said, "If you are going to take a stand for people's privacy, you can't be scanning their phones."
Apple did not respond to FOX Business’ request for comment.
The Associated Press contributed to this report.