Google-Owned Tech Firm May Have 'Inappropriate Access' To 1.6m NHS Patient Files

A tech firm may have received access to 1.6 million NHS patient files on an inappropriate legal basis, according to a letter sent by the National Data Guardian.

DeepMind, which is owned by Alphabet, the parent company of Google, received the data to test a smartphone app called Streams. The app, intended solely for use by healthcare professionals, can identify patients at risk of acute kidney injury (AKI) and alert doctors.

A letter, obtained by Sky News, from the National Data Guardian (NDG) Dame Fiona Caldicott to Professor Stephen Powis, the medical director of the Royal Free Hospital in London, raises questions about the legal basis behind the transfer of information.

She explained that the development of new technology "cannot be regarded as direct care", adding: "Implied consent is only an appropriate legal basis for the disclosure of identifiable data for the purposes of direct care if it aligns with people's reasonable expectations.

"When I wrote to you in December, I said I did not believe that when the patient data was shared with Google DeepMind, implied consent for direct care was an appropriate legal basis."

Dame Fiona's letter adds: "It is my view and that of my panel that the purpose for the transfer of 1.6 million identifiable patient records to Google DeepMind was for the testing of this Streams application, and not for the provision of direct care to patients.

"My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose."

The Information Commissioner's Office (ICO) is investigating the transfer of data.

An ICO spokeswoman said the probe is "close to conclusion", adding: "We continue to work with the National Data Guardian and have been in regular contact with the Royal Free and DeepMind who have provided information about the development of the Streams app.

"This has been subject to detailed review as part of our investigation. It's the responsibility of businesses and organisations to comply with data protection law."

A spokesman for the Royal Free said: "The Streams app was built in close collaboration with clinicians to help prevent unnecessary deaths by alerting them to patients in need in a matter of seconds. It is now in use at the Royal Free, and is helping clinicians provide better, faster care to our patients. Nurses report that it is saving them hours each day.

"We took a safety-first approach in testing Streams using real data. This was to check that the app was presenting patient information accurately and safely before being deployed in a live patient setting.

"Real patient data is routinely used in the NHS to check new systems are working properly before turning them fully live. No responsible hospital would ever deploy a system that hadn't been thoroughly tested. The NHS remained in full control of all patient data throughout.

"This project, designed to help prevent unnecessary deaths using new technology, is one of the first of its kind in the NHS and there are always lessons we can learn from pioneering work.

"We take seriously the conclusions of the National Data Guardian, and are pleased that they have asked the Department of Health to look closely at the regulatory framework and guidance provided to organisations taking forward this type of innovation, which is essential to the future of the NHS.

"We are proud of the work we have done with DeepMind and will continue to be bold and brave for the benefit of our patients."

Commenting on the news, Phil Booth, co-ordinator of patient data protection group medConfidential, said: "Every flow of patient data in and around the NHS must be safe, consensual and transparent.

"Patients should know how their data is used, including for possible improvements to care using new digital tools. Such gross disregard of medical ethics by commercial interests - whose vision of 'patient care' reaches little further than their business plan - must never be repeated."

A spokesman for DeepMind Health said: "Nurses and doctors have told us that Streams is already speeding up urgent care at the Royal Free and saving hours every day.

"The data used to provide the app has always been strictly controlled by the Royal Free and has never been used for commercial purposes or combined with Google products, services or ads - and never will be.

"Clinicians at the Royal Free put patient safety first by testing Streams with the full set of data before using it to treat patients. Safety testing is essential across the NHS, and no hospital would turn a new service live without testing it first.

"We're glad the NDG has said that further guidance would be useful to organisations which are undertaking work to test new technologies.

"We also recognise that there needs to be much more public engagement and discussion about new technology in the NHS. We want to become one of the most transparent companies working in NHS IT, appointing a panel of Independent Reviewers, embarking on a major patient involvement strategy, and starting a ground-breaking project called Verifiable Data Audit.

"We believe that these steps are helping to set a new standard of transparency across the health system."
