Apple stalls monitoring of images on personal devices

Cantillon: Privacy issues appear to take priority over child protection for iPhone giant

It is unclear if Apple can improve features enough to keep both child safety campaigners and privacy advocates happy.

The news that Apple has delayed implementing new systems designed to pick up child sex abuse materials on the devices of its users is not entirely surprising.

When the company announced last month that it planned new safety features for its iPhone, iPad and macOS software that would pick up child sexual abuse material, the backlash was swift and brutal. Apple was accused of selling out on its previous assertions that “what happens on your iPhone stays on your iPhone”.

The concept itself was intended to do good. Apple had proposed matching images being uploaded from a user’s iPhone to their iCloud photo library against a database of known child sexual abuse images. A separate feature aimed at younger users would screen iMessages for explicit photographs.

Potential overreach

While the measures were welcomed by child protection groups in the US, the potential for overreach raised concerns among privacy experts. The new system put users at risk from totalitarian governments, opponents said, and provided a backdoor of sorts into Apple’s encryption.


For a company that had prided itself on its approach to protecting user privacy, being on the other side of the fence must have been an unsettling experience. And so last week, Apple rowed back on the plans, saying it would delay their introduction and take time “to collect input and make improvements”.

Where to from here? It is unclear if Apple can improve those features enough to keep both child safety campaigners and privacy advocates happy.

Scanning for child abuse images is not new. Tech companies have been doing it for more than a decade. But that scanning usually takes place in the cloud; Apple was moving it to the device.

That approach may have worked for Apple in the past. But the idea that the company was reaching into people’s devices was a step too far, even in the name of child protection.