The pushback against Apple’s plan to scan iPhone photos for child exploitation images was swift and apparently effective.
Apple said Friday that it is delaying the feature’s rollout.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” a September 3 update at the top of the company’s original announcement reads. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user’s photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
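To make the mechanism concrete, here is a deliberately simplified sketch of checking a photo against a database of known image hashes. It is an illustration under loose assumptions, not Apple’s implementation: the real design used a perceptual hash (“NeuralHash”) plus cryptographic machinery such as private set intersection, while this toy compares plain SHA-256 digests, and the digest set and function name below are hypothetical.

```python
# Hypothetical, simplified sketch of on-device hash matching.
# Not Apple's design: the real system used a perceptual hash (NeuralHash)
# and private set intersection; this toy compares exact SHA-256 digests.
import hashlib

# Hypothetical stand-in for the database of known digests.
# (The single entry below is just the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_database(photo_bytes: bytes) -> bool:
    """Hash the photo locally, then check the digest against the known set."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Demo: only an exact byte-for-byte duplicate matches here, which is why
# real systems use perceptual hashes that survive resizing and re-encoding.
print(matches_known_database(b"test"))         # True
print(matches_known_database(b"other bytes"))  # False
```

In Apple’s published design, a match on the device would not have been directly visible to anyone; results were wrapped in encrypted “safety vouchers” that Apple could decrypt only after an account crossed a threshold number of matches, at which point human review kicked in.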
The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to view photos users hadn’t even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. The Electronic Frontier Foundation also criticized the capability as a “backdoor” that could eventually give law enforcement or other government agencies access to an individual’s device.
“Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the EFF wrote in an August blog post.
Experts who had criticized the move were generally pleased with the decision to do more research.
Others said the company should go further to protect users’ privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.
While other tech companies, including Google and Microsoft, already scan photos uploaded to their cloud servers, Apple’s plan drew particular scrutiny because the scanning would have happened on users’ devices themselves.