
Apple’s proposed surveillance tech for iPhones aimed at child protection was seriously flawed


A report by some of the best-known names in computer security warns that CSS would introduce unacceptably serious security and privacy risks. Photograph: Getty

Apple’s announcement in August that the iPhones of people in the US would be scanned automatically for images of child sexual abuse provoked controversy from the start.

Apple’s proposed system, which uses a technique called client-side scanning (CSS), has become a telling example of an unsound idea which initially might look fine, even laudable.

The CSS proposal was praised and welcomed by many child protection organisations. However, rapid pushback from computing experts, privacy organisations and iPhone owners caused Apple to halt the effort, which remains on hold.


Apple’s system would work by scanning every image added to a person’s iPhone (the device “client”). For this to happen, the system would be integrated into every US iPhone as part of iOS, the operating system.


All images would be compared against a library of known child abuse images already in circulation, and any match would be flagged for review by Apple employees. If a match was verified, the person’s Apple account would be disabled and Apple would notify the National Center for Missing and Exploited Children.
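In rough outline, the matching step works like a database lookup: the device computes a fingerprint (hash) of each image and checks it against the library of fingerprints of known material. The sketch below is illustrative only. Apple’s actual design used NeuralHash, a perceptual hash intended to survive resizing and re-encoding, combined with cryptographic protocols to hide the database; here a plain SHA-256 digest and a hypothetical hash set stand in for both.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited images.
# (This example contains only the SHA-256 digest of the empty byte
# string, so the demo below can trigger a match without real data.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; real systems do not use SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_on_device(image_bytes: bytes) -> bool:
    """Client-side check: flag the image if its fingerprint is known."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

The essential point for the privacy debate is that this check runs on the user’s own device, against a database the user cannot inspect.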

Many were taken aback that Apple, of all companies, would introduce a plan for blanket iPhone on-device surveillance, given that it’s a company that has made protecting user privacy a marketing point (although that claim has been variously disputed).

Apple dominates the US smartphone market, holding 64 per cent at the end of 2020, according to Counterpoint Research. That means a majority of US smartphone users would have always-on phone surveillance occurring in the background.

Detailed report

Now, a new report by some of the best-known names in computer security and cryptography warns that CSS would introduce unacceptably serious security and privacy risks.

Entitled Bugs in our Pockets: The Risks of Client-Side Scanning, the detailed, 46-page report explains the many problems with CSS. This particular study is difficult for anyone in government, security or law enforcement to dismiss or ignore.

Released by over a dozen global experts in security and cryptography, the report calls CSS an unwarranted, ineffective and risky invasion of privacy and security that opens up perilous security holes and is inherently “dangerous”.

Security and privacy concerns were raised from the moment Apple announced the initiative – a backlash that, for the general public, served as an introduction to the concept of CSS – but no previous analysis has taken such a deep, comprehensive and, most importantly, authoritative dive into the issues and concerns it raises.


Among the authors are creators of well-known and widely used security algorithms and concepts that have protected billions of devices and computing systems over the past few decades: Hal Abelson, Ross Anderson, Steven M Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G Neumann, Ronald L Rivest, Jeffrey I Schiller, Bruce Schneier, Vanessa Teague, and Carmela Troncoso.

“CSS neither guarantees efficacious crime prevention nor prevents surveillance. Indeed, the effect is the opposite,” the authors state in their introduction. “CSS by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic.”

The paper goes on to detail the “multiple ways in which client-side scanning can fail, can be evaded, and can be abused.”
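One simple illustration of the evasion problem: exact-match fingerprinting is defeated by any change to the file, while the perceptual hashes real systems use instead (such as Apple’s NeuralHash) have themselves been shown to admit evasion and adversarial collisions. The sketch below, using a plain SHA-256 digest as a hypothetical stand-in, shows how a trivial one-byte modification slips past an exact-match check.

```python
import hashlib

def matches_database(image_bytes: bytes, known_hashes: set) -> bool:
    """Exact-match check against a set of known fingerprints."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# Hypothetical flagged image and the database built from it.
original = b"...bytes of a known flagged picture..."
database = {hashlib.sha256(original).hexdigest()}

# The unmodified file is flagged, but appending a single byte
# produces a completely different digest and evades the check.
unmodified_flagged = matches_database(original, database)
modified_flagged = matches_database(original + b"\x00", database)
```

This fragility is one reason deployed systems use perceptual rather than cryptographic hashes; the report's argument is that the perceptual alternatives bring their own failure modes.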

EU objections

One author, Ross Anderson, professor of security engineering at the University of Cambridge, argues in a blog post that allowing the introduction of CSS onto personal devices will inevitably end up being challenged and disallowed at the highest level in the EU.

The European Court of Justice has already struck down the use of indiscriminate mass technological surveillance when it invalidated the EU Data Retention Directive in 2014, via a case brought by Digital Rights Ireland.

The huge risk is that, if allowed, CSS will inevitably become a slippery slope towards further uses that endanger the lives of the vulnerable, as well as democratic systems.

Anderson writes: “If device vendors are compelled to install remote surveillance, the demands will start to roll in. Who could possibly be so cold-hearted as to argue against the system being extended to search for missing children?”

But, he adds, "Then [Chinese] president Xi [Jinping] will want to know who has photos of the Dalai Lama, or of men standing in front of tanks; and copyright lawyers will get court orders blocking whatever they claim infringes their clients' rights. Our phones, which have grown into extensions of our intimate private space, will be ours no more; they will be private no more; and we will all be less secure."

Concerns about CSS go well beyond Apple's version of the software. A much larger worry is a looming official push for expansion of CSS use globally. The EU has considered allowing both CSS and server-side scanning of texts and video as well as photos, and for other uses beyond child pornography, according to past discussions and a leaked EU report.

That’s why this timely security report is so important. It’s a well-evidenced call for sanity and caution that officials, and society, must heed.