We thank DeepMind’s staff and advisors for responding to our article, Google DeepMind and Healthcare in an Age of Algorithms,Footnote 1 on the public record. The response criticizes our work and defends DeepMind’s. We are grateful to the journal’s editorial board for inviting us to reply.

A point at the outset. Our investigation and criticism of DeepMind is not motivated by any desire to see public health institutions bereft of advanced digital services. Quite the opposite. We are technology optimists, and truly believe that new technology can help humans solve real problems. We share the collective goal of saving lives and ensuring the right care reaches patients. At the same time, we are concerned that if long-held principles are ignored or discarded in the process, the technology’s promise will turn sour.

There is a pressing need for open, fact-based, public discussion of technology companies entering into the provision of public services. The case study at the heart of our article is a particularly illuminating example. Google DeepMind’s first deal with the British National Health Service, involving a gift of at least 1.6 million detailed, non-anonymized health records from every patient in London’s Royal Free hospital trust in order for DeepMind to offer a smartphone app for kidney injury alerts, has been and remains troubling. We think this assessment extends to DeepMind’s response.

We must make a number of clarifications. DeepMind’s letter addresses arguments that we never make, and its continual assertions of “factual inaccuracies” and “unevidenced statements” are nothing of the kind. For example, readers will find no claims in our paper that DeepMind engaged in “illicit and illegal data mining,” “secret” use of patient data, “exploitation for other purposes,” or “misuses of data.” There is no need for us to imply such wrongdoing when there are more immediate and demonstrable concerns. DeepMind’s response diverts attention from the real challenges we raise: (1) the continuing absence of a valid legal basis for processing every Royal Free patient’s data from November 2015 to at least January 2017, when deployment of the clinical app Streams commenced, offering direct care to some proportion of patients being monitored for kidney injury; (2) the total inadequacy of contractual and institutional protections against the possibility of misuse; and (3) broader issues about value and power.

Similarly, we have made no assertions that DeepMind’s parent company Google/Alphabet will access, or is accessing, British patient data. Rather, we exposed the absence of contractual precautions against that possibility, as well as the transparency paradox that precludes independent voices such as ours from scrutinizing corporate data arrangements, particularly those in which Google/Alphabet has ultimate control.Footnote 2 A complete explanation of the relationship between DeepMind and its parent—including the nature and limits of flows of intellectual property, data, algorithms, and finances—as well as sister companies, must be forthcoming if DeepMind is to continue providing services to the public sector, in health, energy, or any other domain.

In its letter, DeepMind seeks to diminish the value of our work for its focus on a “now-superseded 2015 agreement,” and by suggesting that a July 2017 ruling by Britain’s top regulator, the Information Commissioner’s Office (ICO), in part “came to completely different conclusions.” We expressly cabined our research to focus only on the period July 2015 to October 2016, precisely to combat the asynchronicity between privatized technological progress and public mechanisms of redress. The premature revocation of the 2015 agreement halfway through its term can only be read in our favor, given it involved a realization by the parties that they were collaborating on an untenable, possibly unlawful, foundation. As for the ICO ruling, there is only one point of difference and it is inconclusive. We questioned the self-designation of DeepMind as a mere “data processor” and said it was arguably a joint “data controller,” offering evidence and reasons in support. Similar arguments have been made by many data protection professionals, including the Chair of the National Association of Data Protection and Freedom of Information Officers.Footnote 3 The National Data Guardian has also cast doubt on the arrangement, stating “the contract appears to contain elements of a data sharing agreement (i.e. Data Controller to Data Controller), and therefore does not provide all of the necessary controls for the sharing of this data.”Footnote 4 The ICO, by contrast, simply took as an accepted fact that DeepMind’s self-designation was true. It made no positive finding and provided no evidence to support this classification, which remains in tension with the definition of processors/controllers under UK law.

We appreciate that neither of us is a clinician. Nor are the DeepMind letter’s authors legal or policy experts. As affirmed in countless reports on the subject, it is vital that a diverse range of voices be heard and heeded as new digital technologies are introduced to the NHS, by DeepMind and others. Indeed, since publishing our article, numerous clinical professionals have endorsed our work and expressed their shared concerns. They believe, like us, that it is perfectly possible both to respect individual rights and to enable promising technologies. Despite DeepMind’s continued assertions that it wishes to be one of the most open digital companies, it has continuously prioritized internal views and those of self-selected advisors, while seeking to discredit external opinions such as our own. We hope this will change.

Readers will find that DeepMind’s response continually returns to the touchstone of clinical will. We accept that Royal Free clinicians are well placed to understand what services will benefit their patients. But there is a regulatory regime in play, and it does not justify any scale of data transfer, to any third party, merely because such transfer is desired by clinicians—the vaguest possible term for hospital staff. Patients, and patient rights, must be respected. This is the overriding message of our article and of the ICO ruling. An activity does not become direct care simply because clinicians say so. Similarly, one is not a data processor by intent alone. Such questions are crucial matters of substance and circumstance. We explore them as such in our paper, and are disappointed that they have received so little examination in return.

A final clarification is that, despite the length and tone of DeepMind’s letter and appendix, with its formidable-looking 47 points of disagreement, the arguments in fact address only a very small percentage of our article. DeepMind’s points involve considerable repetition and internal inconsistencies, and consistently ignore context in favor of oblique semantic arguments. We disentangle these issues, point-by-point, in Appendix 1, seeking where possible to elevate the conversation constructively.