Microsoft, Nuance Plan Tech Which Creates Documentation For Physicians

For a while now, since AI applications began to hit the mainstream, there’s been talk of the day when such tools might take over patient documentation and free up physicians to actually look at their patients. Now, Microsoft has come out with an announcement which, while characteristically light on details, demonstrates that the company believes it can solve this problem.

Researchers have been laying out blueprints for AI-driven, hands-free documentation for a while. One recent example comes from Stanford Medicine, which touched on similar issues in a paper outlining its vision for the future of EHRs.

The paper predicts that in the future, an automated physician’s assistant would “listen” to interactions between doctor and patient and analyze what was said, then record all relevant information.

In the future, it says, the assistant will use AI capabilities to synthesize medical literature, the patient’s history and relevant histories of other patients.  It will then list possible diagnoses for the physician to address, taking into account patient characteristics such as lifestyle, medication history and genetic makeup.

Another relevant example comes from Allscripts, which is building a new voice-enabled, AI-based EHR in partnership with health system Northwell Health. While they’re not shooting for hands-free records generation, the partners clearly also see voice recognition and AI as critical components of the new system.

Microsoft, for its part, has announced an initiative that could potentially move us further in this direction. Working with speech recognition software vendor Nuance, whose Dragon Medical platform is in wide use among physicians, Microsoft is working to deploy “ambient clinical intelligence” (ACI) technologies that will write clinical documentation on behalf of doctors.

ACI listens to clinical conversations doctors have with patients, integrates this content with other information from the EHR and generates a medical summary automatically. Project EmpowerMD is already testing out this approach in partnership with Nuance, basing its platform on Microsoft’s Azure cloud technology.
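To make the described pipeline concrete, here is a minimal, purely illustrative sketch of the three stages the article attributes to ACI: take a transcript of the visit, pull out clinically relevant statements, and merge them with EHR data into a draft note. Every name and the keyword-matching "NLP" below are toy assumptions of mine, not anything from the Microsoft/Nuance announcement.

```python
# Hypothetical sketch of an ambient-documentation pipeline as the article
# describes it. All names and logic are illustrative toys, not real
# Microsoft, Nuance, or EmpowerMD APIs.

from dataclasses import dataclass


@dataclass
class Turn:
    speaker: str  # "doctor" or "patient"
    text: str


def extract_findings(transcript):
    """Toy stand-in for NLP: flag turns containing clinical keywords."""
    keywords = ("pain", "allergic", "medication", "mg")
    return [t.text for t in transcript
            if any(k in t.text.lower() for k in keywords)]


def draft_note(transcript, ehr_record):
    """Merge extracted findings with EHR context into a draft summary."""
    findings = extract_findings(transcript)
    lines = [f"Patient: {ehr_record['name']} "
             f"(history: {ehr_record['history']})"]
    lines += [f"- {f}" for f in findings]
    return "\n".join(lines)


transcript = [
    Turn("patient", "I've had chest pain since Tuesday."),
    Turn("doctor", "Any reaction to the new medication?"),
    Turn("patient", "No, but I'm allergic to sulfa."),
]
ehr = {"name": "Jane Doe", "history": "hypertension"}
print(draft_note(transcript, ehr))
```

Even this toy makes the commenter's later point visible: everything hinges on the extraction step, and a keyword list (or a far fancier model) can silently drop or invert a critical statement like an allergy.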

In its announcement, Microsoft says it will be working with EHR vendors to develop ACI offerings. In partnership with Nuance, the software giant says it expects to roll out this technology to a subset of physician specialties early next year. The new technology should include voice biometrics, text-to-speech and natural language processing.  As part of the deal, Nuance will be migrating the bulk of its on-premises internal infrastructure and hosted products to Azure.

To me, it’s rather predictable that Microsoft and Nuance aren’t rolling out a fully formed product, but rather tossing out a few tantalizing details in an effort to show they’re hip to where all of this is headed. The truth is, nobody really knows how to do smart documentation yet.

Still, it’s likely that we’ll see at least an early-adopter version of ACI come out of this agreement more or less on schedule. While it may very well be underwhelming, I’m still eager to see how the partners’ first release works.

About the author

Anne Zieger

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.

1 Comment

  • As a sometime editor of voice recognition (by Nuance) medical reports, I, for one, do not want software ‘listening’ to the interaction between me and my physician and determining what the pertinent information is to be documented, because, basically, I want to live. What makes anyone think that a human ear should not be an important part of medical documentation? I suppose you think that doctors don’t make any mistakes, but you would be dead wrong. Drug names are mispronounced, frequently made grammatical errors change the context of the sentences, measurements are not expressed correctly, and VR has a propensity for changing positives to negatives (the patient is allergic to sulfa frequently gets changed to “NOT ALLERGIC” for some unknown reason) – all of these errors occur even after voice recognition has been ‘listening’ to the same doctor for many years. And even if this somehow improves with this brand new artificial intelligence, how are you going to know if it can be trusted? I’ll bet you this Microsoft/Nuance alliance will not have 1 medical transcriptionist on its staff.

    Yes, I do this for a living – well, just barely a living – and I am near the end of my tenure, thankfully. But as I get older, I will probably be a patient much more often, and really, that is who we are documenting this potentially life and death information for. If Microsoft wants to work on a medical problem, how about interoperability? That was supposed to be the thing that brought down costs that has not been successfully addressed.

    Sometimes I think that the ‘real’ artificial intelligence is possessed only by those who think that artificial intelligence can actually replace a human. And that the more it is touted and promoted, the farther apart human beings are going to be.
