UK health authority broke law in giving patient data to Google DeepMind

Information Commissioner’s Office seeks lessons for wider health service after inquiry into handover of 1.6m records


A health authority in the UK broke the law by handing over the sensitive medical data of about 1.6 million patients to artificial intelligence company Google DeepMind as part of a clinical safety trial.

The Information Commissioner’s Office on Monday said the Royal Free NHS Foundation Trust had not complied with the Data Protection Act when it turned over the sensitive medical data.

Elizabeth Denham, the commissioner, said the trust had been asked to sign an undertaking committing it to changes to ensure it was acting in accordance with the law. Her office, which has the power to impose fines but did not do so in this case, said it would be working with the trust to make sure that happened.

In a blog post, Ms Denham asked what lessons could be learned from the case as organisations “increasingly look to unlock the huge potential that creative uses of data can have for patient care”.

Records

Under a 2015 agreement, the trust handed over about 1.6 million records, including information on individuals who had presented for treatment or tests in the previous five years, as well as electronic patient radiology records.

The purpose was to carry out clinical safety testing as part of the development of a new clinical detection and diagnosis application for acute kidney injury at the trust.

The platform was formalised into a mobile application known as Streams, which moved to live deployment in February this year and is now in active use by Royal Free clinicians.

The app sends an alert to a clinician’s smartphone if a patient’s condition deteriorates and allows them to view the patient’s medical records.

It has been reported that DeepMind was allowing the Royal Free to use the patient monitoring smartphone app for free, but that it would have to pay a “service fee” if DeepMind ended up providing more than £15,000 in “support” per month.

Trial

This would likely involve computing power and DeepMind staff costs. The Royal Free told Business Insider last month it had not yet paid DeepMind any fees.

Ms Denham said it was welcome that the trial appeared to have been positive.

“But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights,” she wrote.

In her findings, the commissioner said the Royal Free had not demonstrated to her satisfaction a valid legal condition for processing the sensitive personal data during the clinical safety testing phase.

DeepMind also processed the 1.6 million partial patient records for the purpose of clinical safety testing without those patients having been informed.

Ms Denham also said the processing of patient records by DeepMind “significantly differs” from what the patients might reasonably have expected to happen to their data when presenting at the Royal Free for treatment.

Legislation

She also found that the mechanisms to inform those patients that their data would be used in the clinical safety testing of the Streams application were “inadequate”.

The processing was neither fair nor transparent and was not lawful under data protection legislation, the commissioner found.

DeepMind was founded in London in 2010 and was acquired by Google in 2014.

In a statement, the company said: “We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole.

“We got that wrong, and we need to do better.”