Wow. Nuance is on a roll. While I often write about companies that are seeing some success, I’ve seldom seen a vendor popping up in so many places at once. The latest deal involves the deepening of a relationship with Providence, a huge health system with 51 hospitals in seven states.
Under the terms of the new agreement, Providence will expand its use of Nuance’s cloud services and integrate clinical intelligence into its tech infrastructure. The partners will also develop improved revenue cycle technology.
In practice, this means Providence will leverage Nuance’s voice-enabled platform, which uses ambient sensing technology to listen to clinician-patient conversations. The platform will also offer workflow and knowledge automation designed to help providers make better use of Providence’s Epic EHR.
Providence is building these services on its deployment of Nuance Dragon Medical One, which includes the Nuance Dragon Ambient Experience (DAX). As we’ve mentioned previously, Microsoft has integrated the Ambient Experience into Microsoft Teams.
Nuance DAX combines the company’s conversational AI technology with Microsoft Azure to capture and put into context the discussion taking place during a physician encounter. The system then documents care automatically.
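To make the capture-then-document flow concrete, here is a minimal sketch of an ambient documentation pipeline. It assumes speech-to-text has already produced a speaker-labeled transcript; the keyword lists, function names, and note format are all invented for illustration and are not Nuance's actual API or method.

```python
# Illustrative sketch only: a toy ambient-documentation pipeline that turns a
# speaker-labeled transcript into a draft note for clinician review.
# Term lists and note structure are hypothetical, not Nuance DAX internals.

SYMPTOM_TERMS = {"headache", "fatigue", "cough", "fever"}
MEDICATION_TERMS = {"ibuprofen", "lisinopril", "metformin"}

def extract_findings(transcript):
    """Pull symptom and medication mentions from (speaker, utterance) pairs."""
    symptoms, medications = [], []
    for speaker, utterance in transcript:
        words = {w.strip(".,?!").lower() for w in utterance.split()}
        symptoms += sorted(words & SYMPTOM_TERMS)
        medications += sorted(words & MEDICATION_TERMS)
    return {"symptoms": symptoms, "medications": medications}

def draft_note(findings):
    """Assemble a plain-text draft note; a clinician still signs off."""
    lines = ["DRAFT ENCOUNTER NOTE (requires clinician sign-off)"]
    lines.append("Reported symptoms: " + (", ".join(findings["symptoms"]) or "none recorded"))
    lines.append("Medications discussed: " + (", ".join(findings["medications"]) or "none recorded"))
    return "\n".join(lines)

transcript = [
    ("patient", "I've had a headache and some fatigue all week."),
    ("clinician", "Are you still taking the ibuprofen for it?"),
]
note = draft_note(extract_findings(transcript))
print(note)
```

A production system would replace the keyword matching with trained clinical NLP models and write structured data into the EHR rather than plain text, but the shape of the pipeline (capture, extract, draft, review) is the same.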
In making these deals, Nuance and its partners are pulling alongside giants such as Google, which continues to push into this space.
A year ago, Google announced plans to work with healthcare AI vendor care.ai to roll out a platform supporting autonomous patient monitoring. The platform, which will make use of the Edge tensor processing unit made by Google’s Coral subsidiary, is part of a suite of products designed to speed up neural networks on embedded devices.
Together, care.ai and Google are using neural networks to support a jointly-created “self-aware room.” As with the Nuance offering described above, these self-aware rooms will monitor patients, but in this case, the context-based smart notifications they generate will be sent to staff caring for those patients.
These ideas have been under development for a while, though they remained in the testing phase until recently. For example, in 2018 Stanford Medicine released a paper looking at the road to the “ideal” EHR. In the paper, Stanford researchers described how automated physicians’ assistants would “listen” to interactions between doctors and patients and analyze what was being said. The system would then combine the discussion with verbal cues from clinicians, eventually recording all relevant information from the physical exam.
In Stanford’s vision of the future, the assistant would use AI capabilities to synthesize medical literature, the patient’s history and relevant histories of other patients in anonymized, aggregated form. The system would then provide different possible diagnoses for the clinician to address, taking patient characteristics such as lifestyle, medication history and genetic makeup into account.
To date, the efforts by Nuance, Microsoft and their provider partners don’t seem to be headed in the direction of providing differential diagnoses, and it’s hard to tell whether this will become a feature of these efforts.
Also working in this space are the likes of Saykara and Speke, which we’ve covered before, along with 3M/M*Modal, Amazon, and others. The ambient clinical voice space is one of the most interesting in health IT today.
In any event, Nuance sure seems to be making ambient clinical intelligence its own. I will be keeping my eye on its next moves in this space.