Over the last couple of years, Google has offered intriguing glimpses at what it plans to do in the healthcare space, but has made few concrete product announcements.
The big G has apparently ruled out building its own EHR, at least according to Toby Cosgrove, MD, an executive advisor to Google and former president and CEO of the Cleveland Clinic. To my knowledge, though, virtually every other potential health IT option is still on the table.
That includes partnering with Fitbit on wearables data options (and then acquiring Fitbit), working with a Stanford University researcher to pilot-test “digital scribe” technology, moving ahead with healthcare projects based on its DeepMind AI platform, and a great deal more.
Despite this flurry of activity, however, it’s never been clear to me what Google’s larger strategies are in healthcare. While I know this is unlikely to be the case, on the surface Google’s entire healthcare effort comes across (to me at least) as a giant R&D effort with open-ended goals. Don’t get me wrong, even if Google has been in a comparatively loose experimental mode when it comes to healthcare technology, there’s always a method to its madness. It’s just that it hasn’t been making big, high-profile incursions into standard niches in the health IT arena.
This month, though, in perhaps its least speculative move to date, Google has announced plans to work with healthcare AI vendor care.ai to roll out a platform supporting autonomous patient monitoring. The new technology will leverage the Edge TPU (tensor processing unit) made by its Coral subsidiary, part of a suite of products designed to speed up neural networks on embedded devices. Together, the two companies will use neural networks to power a jointly-created “self-aware room.”
These self-aware rooms will monitor patients and send context-based smart notifications to staff caring for those patients. According to care.ai, part of what will make this possible is that its software runs deep neural networks at the network edge, making it capable of responding extremely quickly.
According to a story appearing in Becker’s Hospital Review, each of these self-aware rooms comes equipped with an AI sensor combining care.ai’s machine learning platform with a library of human behavioral data. Toss in the Edge TPU, which allows the room’s sensor to monitor patient behavior and send out predictive alerts, and you’ve really got something.
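To make the idea of "context-based smart notifications" a bit more concrete, here's a minimal sketch of the kind of alerting layer that might sit on top of an edge model's per-frame output. To be clear, none of these names, events, or thresholds come from care.ai or Google; this is purely an illustration of filtering raw model predictions into alerts worth paging a nurse about.

```python
# Hypothetical sketch of a context-based alerting layer for an edge-monitored
# patient room. Event names, thresholds, and the FrameInference shape are all
# invented for illustration -- not care.ai's or Google's actual design.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FrameInference:
    """Hypothetical per-frame output from an edge vision model."""
    event: str          # e.g. "fall", "bed_exit_attempt", or "normal"
    confidence: float   # model confidence, 0.0 to 1.0

def smart_notification(frames: list[FrameInference],
                       min_confidence: float = 0.8,
                       min_consecutive: int = 3) -> Optional[str]:
    """Raise an alert only when the same high-confidence event persists
    across several consecutive frames, filtering out one-frame noise."""
    streak_event, streak_len = None, 0
    for f in frames:
        if f.confidence >= min_confidence and f.event != "normal":
            if f.event == streak_event:
                streak_len += 1
            else:
                streak_event, streak_len = f.event, 1
            if streak_len >= min_consecutive:
                return f"ALERT: {streak_event}"
        else:
            streak_event, streak_len = None, 0
    return None
```

The design intent, as I read it, is exactly this sort of thing: the heavy neural-network inference runs on-device (hence the Edge TPU), and only the distilled, context-filtered notification ever needs to leave the room.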
Becker’s reports that the partners have launched pilot programs trialing the self-aware rooms with several U.S. hospitals and health systems. I don’t know about you, but it seems a foregone conclusion that the hospitals and health systems will learn an enormous amount from these exercises even if they never put one of the self-aware rooms into production.
Again, it’s worth bearing in mind that Google and care.ai are at a tentative stage, much like a related effort by Microsoft and Nuance leveraging “ambient clinical intelligence” to create documentation for physicians. But in both cases, we may be looking at the beginnings of some game-changing uses of AI in healthcare.