Back in February, I recounted the tale of DeepMind, a standout AI startup acquired by Google a few years ago. In the story, I noted that DeepMind had announced that it would be working with the Royal Free London NHS Foundation Trust, which oversees three hospitals, to test out its healthcare app.
DeepMind’s healthcare app, Streams, is designed to help providers push patient status updates to the physicians and nurses working with them. Under the terms of the deal, which was to span five years, DeepMind was supposed to gain access to 1.6 million patient records managed by the hospitals.
Now, the agreement seems to have collapsed under regulatory scrutiny. The UK’s data protection watchdog has ruled that DeepMind’s deal with the Trust “failed to comply with data protection law,” according to a story in Business Insider. The watchdog, known as the Information Commissioner’s Office (ICO), has spent a year investigating the deal, BI reports.
As it turns out, the agreement empowered the Trust hospitals to share the data without the patients’ prior knowledge, something that presumably wouldn’t fly in the U.S. either. The data, intended for use in developing the Streams app’s kidney monitoring technology, included information on whether people were HIV-positive, along with details of drug overdoses and abortions.
In their defense, DeepMind and the Royal Free Trust argued that patients had provided “implied consent” for such data sharing, given that the app was delivering “direct care” to patients using it. (Nice try. Got any other bridges you wanna sell?) Not surprisingly, that didn’t satisfy the ICO, which found several other shortcomings in how the data was handled as well.
While the ICO has concluded that the DeepMind/Royal Free Trust deal was illegal, it doesn’t plan to sanction either party, despite having the power to hand out fines of up to £500,000, BI said. But DeepMind, which set up its own independent review panel last year to oversee its data sharing agreements, privacy and security measures, and product roadmaps, is taking a closer look at this deal. Way to self-police, guys! (Or maybe not.)
Not to be provincial, but what worries me about this is less the politics of UK patient protection laws and more the potential for Google subsidiaries to engage in other questionable data sharing activities. DeepMind has always said that it does not share patient data with its corporate parent, and while this might be true now, Google could do incalculable harm to patient privacy if it doesn’t maintain this firewall.
Hey, just consider that even for an entity the size of Google, healthcare data is an incredibly valuable asset. Reportedly, even street-level data thieves pay ten times as much for healthcare data as they do for, say, credit card numbers. It’s hard to even imagine what an entity the size of Google could do with such data if crunched in incredibly advanced ways. Let’s just say I don’t want to find out.
Unfortunately, as far as I know, U.S. law hasn’t caught up with the idea of crime-by-analytics, which could be an issue even when an entity has legal possession of healthcare data. But I hope it does soon. The amount of harm this kind of data manipulation could do is immense.