Tech company Apple has delayed its plan to implement technology that would scan people’s iPhones and iPads and report images of child sexual abuse and pornography. The rollout, announced in August as part of Apple’s expanded protections for children, has not been given a new release date.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of child sexual abuse material,” the company said in a statement Friday.
Apple’s plans came under sharp criticism from privacy advocates, who argued the technology could be repurposed by governments or other officials as a form of spyware and would be a steppingstone to broader violations of user and public privacy.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in its release.
The planned feature would have made direct updates to the operating systems of iPhones and iPads, allowing the software to detect explicit images involving a child that were stored in iCloud Photos or directly on devices. The system would suspend the accounts of users it flagged as having child sexual abuse content and would then report them to the National Center for Missing and Exploited Children.
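In broad strokes, systems of this kind compare hashes of a user’s images against a database of hashes of known abuse material, and flag an account only after enough matches accumulate. The sketch below illustrates that matching-and-threshold logic only; Apple’s described design used a perceptual hash (so re-encoded copies of an image still match), whereas a cryptographic hash stands in here, and every name and the threshold value are illustrative, not Apple’s actual parameters.

```python
import hashlib

# Illustrative threshold: a real deployment would require many matches
# before any human review, to keep false positives rare.
MATCH_THRESHOLD = 3

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash; returns a hex digest of the image bytes."""
    return hashlib.sha256(data).hexdigest()

def flagged_for_review(library: list[bytes], known_hashes: set[str]) -> bool:
    """Flag an account only once enough images match the known database."""
    matches = sum(1 for img in library if image_hash(img) in known_hashes)
    return matches >= MATCH_THRESHOLD

# Hypothetical database built from three known images:
known = {image_hash(b"img-%d" % i) for i in range(3)}
library = [b"img-0", b"img-1", b"img-2", b"harmless-photo"]
print(flagged_for_review(library, known))  # True: three matches reach the threshold
```

The threshold is the key privacy design choice critics focused on: a single match reveals nothing, but the operator of the hash database controls what the system searches for.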
The update was part of a broader set of features from Apple aimed at keeping child users safe. In addition to the scanning feature, the company also promoted a communication safeguard for underage users, which would blur explicit images sent to or received by a child and notify parents if their child viewed them.
Famed former National Security Agency contractor and surveillance whistleblower Edward Snowden wrote in his newsletter that Apple’s potential update “will permanently redefine what belongs to you, and what belongs to them.”
The Surveillance Technology Oversight Project, a New York-based anti-surveillance group, said in a statement Friday that while the pause of the scanning technology was welcome, it still did not go far enough to protect users’ privacy from Apple.
“If you’re building a system to search, you can set it to search for anything,” Executive Director Albert Fox Cahn told CBS News. “When you look at something as grotesque as child abuse, we instinctively want to do anything we can to stop it. In the process, we can easily create systems that exceed their mandates.”
“Even worse, they’ve created a model that can be easily hijacked by any foreign government whose jurisdiction Apple operates in to search for political materials or religious tracts or anything else they want to target,” he said.
Apple has not announced when the new technology will roll out but said it plans to spend the next several months incorporating additional feedback into the system.