Letter in response to Google DeepMind and healthcare in an age of algorithms

  • Dominic King
  • Alan Karthikesalingam
  • Cían Hughes
  • Hugh Montgomery
  • Rosalind Raine
  • Geraint Rees
  • On behalf of the DeepMind Health Team
Editorial

Clinical treatment has always been based on the intelligent use and analysis of patient data, but many of the technologies that help us make sense of complex data every day - from smartphones to machine learning tools - have yet to find widespread use in health systems such as the English NHS. Important questions remain about the right way to deploy data-driven innovations to improve patient care, whilst maintaining public trust in the use and security of patients’ sensitive health information. The article ‘Google DeepMind and healthcare in an age of algorithms’ (doi:10.1007/s12553-017-0179-1) discussed a now-superseded 2015 agreement between the Royal Free London NHS Foundation Trust (RFL) and DeepMind Health (DMH), using it to draw conclusions about these wider issues and criticising both parties in strong terms throughout. At the time of publication, both RFL and DMH expressed concern that the article contained factual errors, which risked causing confusion both about the substance of the 2015 agreement and, by extension, about the broader issues raised.

Since then, the situation has, to an extent, moved on. On 3 July 2017 the Information Commissioner’s Office (ICO) concluded its investigation into RFL, focusing on the Trust’s clinical safety testing of DeepMind’s clinical app Streams in late 2015 and 2016, testing intended to ensure that the service could be deployed safely at the hospital. The ICO was not satisfied that there was a legal basis for this use of patient data in testing, and raised concerns about how much patients knew about this work. While the ICO’s investigation was into RFL, both RFL and DMH have accepted these criticisms, and each has taken steps to improve the legal foundations and transparency of future work. There is no doubt that mistakes were made, and lessons must be learned.

While the ICO's ruling confirmed several points made by Dr Powles and Mr Hodson in their article, it did not support others, and in some cases came to completely different conclusions. This case, like others before it, may well serve as a reference point for NHS organisations, innovators and researchers for years to come, and the right lessons will be learned only if the facts are correctly understood.

At the suggestion of the article’s authors and this journal’s editors, we have therefore taken the opportunity to outline our points of disagreement with the original article. In our detailed response1, we listed a total of 37 factual inaccuracies and unevidenced statements. For the purposes of brevity, we will only address some of the more significant points here.

DMH’s mobile-based clinical app, Streams, was built to alert clinicians, using an existing NHS algorithm, when routine blood tests suggest the presence of Acute Kidney Injury (AKI). Streams was developed and built using synthetic data. Like the myriad other clinical software systems designed to improve patient care, Streams processes patient data securely, with clinician-only access to the output. Only data that are already available to clinicians (in existing electronic systems or paper notes) can be accessed by them via Streams. In health systems around the world, services of this kind are generally commissioned from third-party providers by healthcare organisations, through service contracts and data processing agreements.

However, the authors suggest that DMH cannot simply be implementing clinical software. They surmise that illicit and illegal data mining or research must be taking place at DMH on patient data, with or without RFL’s knowledge. They state this could give ‘DeepMind a lead advantage in developing new algorithmic tools on... publicly-generated datasets’. They argue that DMH is not a data processor but a co-controller, claiming that it seemed ‘DeepMind assum[ed] the role of a joint data controller,’ which could perform unknown commercial ‘repurposing of Trust-wide Royal Free data.’ This core allegation, on which most of their other claims rest, is unevidenced and untrue. In fact, in the letter concluding its year-long investigation, the ICO found that RFL ‘has retained its data controller responsibilities throughout [the ICO’s] investigation, and continues to do so.’

The authors also state that RFL ‘contracted with DeepMind to analyse complex data and come up with solutions by applying DeepMind’s own expertise in analysis to an extent that Royal Free (sic) cannot begin to do’ including ‘developing software using patient data’. They assert that DMH could secretly use RFL patient data for aims other than direct care. Again, this is not true. DMH are not using any RFL patient data in the development of software or algorithms, nor are we processing RFL data using machine learning. This would not be covered by the current agreements and would require separate contractual agreements and governance approvals.

Indeed, such use of RFL data would be against the law, given the controller-processor relationship. DMH, the processor, can only process patient data as directed by RFL, the data controller. Whilst we agree that the 2015 Information Sharing Agreement (ISA) could be, and has been, improved upon, it contained explicit restrictions on how DMH could process data, stating ‘[t]he processor will act in accordance with the Data Controller’s instructions … and will only use the personal data to provide the services under this Agreement.’ DMH has been very open about its ambitions to use technology, machine learning, and AI to enhance healthcare and patient safety, but, as a data processor, it cannot repurpose RFL patient data. Nor is it a ‘first mover’ in healthcare technology, which long predates DeepMind’s founding in 2010.

More damagingly, the article asserts that ‘[f]or millions of patients in the Royal Free’s North London catchment, [Google] now has the potential to know even more.’ The suggestion that ‘Google’ can access data processed for the Streams application is frightening for the public, and has no basis in fact. Such action would, indeed, be illegal under UK law. No RFL patient data were, are or will ever be connected to Google accounts or services, or used for any commercial purposes. Data are fully encrypted, stored in a high-security facility in England, and separated at all times from any other systems. They are accessed only by those individuals at DeepMind who need access as part of the provision of Streams and who have been through the necessary information governance approvals and training. All data are deleted entirely from our systems when no longer being used. The authors also claim that the public ‘have no power to find out what Google and DeepMind are really doing with NHS patient data, nor the extent of Royal Free’s meaningful control over what Google and DeepMind are doing’. This is not the case: data access can be verified through audit, and is regulated by the ICO.

Finally, the authors - neither of whom is a clinician - make a series of medical claims when discussing the necessity of the data processed for Streams. They assert that the scope of data sharing exceeds its clinical utility, and conclude that the data are therefore being exploited for other purposes, giving examples of diagnoses that they argue are not relevant to AKI, such as ‘setting broken bones.’ In fact, AKI is a recognised complication of both trauma and elective orthopaedic procedures, making broken bones clinically relevant.2 Further, a past history of broken bones may in itself help point to disease states which can predispose to, or be associated with, AKI (such as myeloma or renal osteodystrophy, respectively). Standard practice also requires that clinicians treating a patient with AKI (or any other medical emergency) have access to the full medical history. Such data are routinely accessed from existing electronic and written patient records or through direct communication with patients.

This is the key issue: DMH’s work with RFL is not an ‘unfettered’ data grab of the sort the authors allege. AKI is a serious condition that is both common and potentially deadly, and whose diagnosis and management are recognised to be falling short. Streams, which alerts clinicians to the potential presence of AKI and rapidly presents existing data in a form that helps in disease management, is focused on saving lives and ensuring the right care reaches patients more quickly. Whilst we at DMH wholeheartedly acknowledge our own shortcomings, and have redoubled our efforts to improve our legal frameworks and public transparency, there is no evidence that the misuses of data alleged by Dr Powles and Mr Hodson exist. They do not.

However, we do agree with the authors on one critical point: ‘Public health services such as the... NHS are deeply complex systems’. Data processing is one of the most complicated topics within this system, and efforts appear to be underway to provide updated guidance and clarity in this area. In a letter from the National Data Guardian (NDG) to RFL, dated 20th February 2017, Dame Fiona Caldicott wrote that ‘further guidance’ on patient data would be useful, and that the Department of Health was looking closely at this. The recent government response to the NDG’s ‘Review of Data Security, Consent and Opt-Outs’3 notes that NHS Digital and NHS England are ‘develop[ing] a framework… to support the system in sharing data for direct care’. Given the potential for data to improve patient care, this is a vital topic for debate within the NHS, as well as with patients, clinicians, academics, policymakers, activists and beyond. We support this debate and welcome the scrutiny of our work that it brings. It is in the interests of the NHS and all who depend on it that both the challenges and opportunities of data use are identified, debated with precision, and ultimately settled in the public interest.

Dr Dominic King BSc MBChB MEd PhD MRCS

Senior Clinician Scientist and Clinical Lead, DeepMind Health

Dr Alan Karthikesalingam MA MBBChir MSc PhD MRCS

Senior Clinician Scientist, DeepMind Health

Dr Cían Hughes MB ChB MSc MRCS

Senior Clinician Scientist, DeepMind Health

Professor Hugh Montgomery, MBBS BSc FRCP MD

Clinical Advisor, DeepMind Health

Professor Rosalind Raine MBBS, BSc, MSc, PhD, FFPH

Clinical Advisor, DeepMind Health

Professor Geraint Rees MA MB BCh PhD MRCP FMedSci

Senior Scientific Advisor, DeepMind Health

On behalf of the DeepMind Health Team

Thursday, 27th July 2017

Footnotes

  1. See addendum.

  2. AKI is well known to be associated with acute orthopaedic issues, not limited to trauma and bone and joint infections; 25% of elderly patients with a hip fracture develop AKI (www.bmcnephrol.biomedcentral.com/articles/10.1186/s12882-017-0437-5).

  3.

Notes

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Copyright information

© IUPESM and Springer-Verlag GmbH Germany, part of Springer Nature 2018

