Three Ways A.I. Is Improving EHR Performance

The following is a guest blog post by Wayne Crandall, President & CEO of NoteSwift.

Let’s be honest – we can all see the immense promise of a healthcare world fully connected through EHR systems, but the first (and even second) generations of EHR software have not yet delivered on that promise. Many practices have had the opposite experience; instead of the promised benefits, their EHR systems have been frustrating, cumbersome, and difficult to manage.

But don’t give up! Advancements in artificial intelligence are rapidly improving the ability of EHRs to become more accurate, more user-friendly, and more connected.

Here are three major ways artificial intelligence is improving the world of EHR:

1. Advances in voice dictation for note entry

Voice dictation is already a valuable tool that helps many doctors spend less time on EHR entry – in fact, 62% of doctors already use voice dictation software to assist with their EHR entry, and nearly another 20% plan to add voice dictation to their workflow in the coming year. What’s really exciting is that advances in Natural Language Processing (NLP) continue to make voice dictation more accurate and more useful for physicians and staff, and that ongoing progress will keep improving EHR entry.
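To make the idea concrete, here is a minimal sketch of how a recorded dictation could be transcribed into plain note text using the open-source SpeechRecognition Python package. This is only an illustration of speech-to-text in general, not the dictation engine behind any particular EHR, and the file name and "note field" usage are hypothetical.

```python
# Minimal sketch: transcribe a recorded dictation into plain note text.
# Uses the open-source SpeechRecognition package; the EHR "note field"
# idea below is illustrative, not any specific vendor's API.
import speech_recognition as sr

def dictate_note(audio_path: str) -> str:
    """Transcribe a recorded dictation file into free-text note content."""
    recognizer = sr.Recognizer()
    with sr.AudioFile(audio_path) as source:
        audio = recognizer.record(source)  # read the entire recording
    # Google's free web recognizer; a practice would swap in whatever
    # engine it licenses.
    return recognizer.recognize_google(audio)

if __name__ == "__main__":
    note_text = dictate_note("visit_dictation.wav")  # hypothetical file
    print(note_text)  # this text would populate the EHR's narrative field
```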

2. Structured Data Elements

Long the Achilles’ heel of EHR entry, structured data elements are the data format that allows practices to use their EHRs to achieve Meaningful Use and participate in health information exchange (HIE). But most EHRs don’t enforce consistent structured data formatting, and those that do offer some kind of structure are often not compatible with other EHR systems. According to many researchers, unstructured data causes much of the adoption and transition pain practices feel around EHR use. Many practices simply dump old data into their new EHRs, which both fails to meet MU requirements and reduces the physician’s ability to actually use the data to improve patient care.

Thankfully, solutions such as Samantha from NoteSwift are using artificial intelligence to turn dictated patient narratives into structured data at the time of entry! The physician simply dictates the narrative, and Samantha automatically parses it for structured data and enters it across the entire EHR. This streamlined process promises to save the physician time and dramatically improve the quality of the data being entered into the EHR, which leads to better HIE compatibility and better care.
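As a rough illustration of what “parsing a narrative into structured data” means, the sketch below pulls a couple of structured elements out of free-text dictation with simple pattern matching. It is not NoteSwift’s implementation – production systems use full NLP pipelines – and the field names and patterns are assumptions made for the example.

```python
# Illustrative sketch of turning a dictated narrative into structured
# data elements. NOT a vendor implementation; real systems use full
# NLP pipelines. Field names and patterns are assumptions.
import re

def parse_narrative(narrative: str) -> dict:
    """Extract a few structured elements from free-text dictation."""
    structured = {}

    # Blood pressure, e.g. "blood pressure is 142/90"
    bp = re.search(r"blood pressure (?:is |of )?(\d{2,3})/(\d{2,3})", narrative, re.I)
    if bp:
        structured["vitals.bp_systolic"] = int(bp.group(1))
        structured["vitals.bp_diastolic"] = int(bp.group(2))

    # Medication order, e.g. "start the patient on lisinopril 10 mg"
    med = re.search(r"start (?:the patient on )?(\w+) (\d+) ?mg", narrative, re.I)
    if med:
        structured["orders.medication"] = med.group(1)
        structured["orders.dose_mg"] = int(med.group(2))

    return structured

narrative = "Blood pressure is 142/90. Start the patient on lisinopril 10 mg daily."
print(parse_narrative(narrative))
# {'vitals.bp_systolic': 142, 'vitals.bp_diastolic': 90,
#  'orders.medication': 'lisinopril', 'orders.dose_mg': 10}
```

The point of the sketch is simply that the same sentence a physician dictates can populate discrete, exchangeable fields instead of sitting in the chart as a blob of text.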

3. Heuristic learning and clinician review

A recent study noted that there is currently a 7.4% error rate in voice dictation data entry when not supported by software optimization and clinical review tools. Thankfully, artificial intelligence solutions are rapidly improving the accuracy of dictation while also offering clinicians more robust tools for quickly and effectively reviewing and approving EHR entries.

For example, Samantha from NoteSwift features AI-powered heuristic learning technology that improves the clarity and accuracy of EHR entries every time a physician uses the solution. So the more you use it, the better it performs! In addition, because Samantha parses every entered narrative into structured data for the EHR, it can offer clinicians a quick and powerful tool for reviewing and approving the structured data entries it applies across the record.
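One toy way to picture this kind of heuristic learning is a system that remembers the corrections clinicians make during review and applies them to future entries. The sketch below is a deliberate simplification, not the actual learning technology in Samantha or any other product; the class and phrases are invented for illustration.

```python
# Toy sketch of learning from clinician review: remember each correction
# a reviewer makes and apply the most common fix to future transcriptions.
# A simplification for illustration, not any product's actual algorithm.
from collections import Counter

class CorrectionMemory:
    def __init__(self):
        # maps a misrecognized phrase to a tally of clinician corrections
        self.corrections = {}

    def record(self, heard: str, corrected: str) -> None:
        """Store what the engine produced vs. what the clinician approved."""
        self.corrections.setdefault(heard, Counter())[corrected] += 1

    def apply(self, text: str) -> str:
        """Replace known misrecognitions with their most frequent fix."""
        for heard, fixes in self.corrections.items():
            best, _ = fixes.most_common(1)[0]
            text = text.replace(heard, best)
        return text

memory = CorrectionMemory()
memory.record("hyper tension", "hypertension")  # a clinician fixed this once
print(memory.apply("Assessment: hyper tension, stable."))
# Assessment: hypertension, stable.
```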

EHR companies are continually working to improve their products, and advancements in artificial intelligence in solutions like Samantha offer major opportunities for practices to make their EHR experience more accurate and less burdensome than ever before.

About Wayne Crandall
Wayne Crandall’s career in technology spans sales, marketing, product management, strategic development and operations. Wayne was a co-founder, executive officer, and senior vice president of sales, marketing and business development at Nuance Communications and was responsible for growing the company to over $120M following the acquisition of Dragon and SpeechWorks.

Prior to joining the NoteSwift team, Wayne was President and CEO of CYA Technologies, and then took over as President of enChoice, which specialized in ECM systems and services, when it purchased CYA.

Wayne joined NoteSwift, Inc. at its inception, working with founder Dr. Chris Russell to build the team from the ground up. Wayne has continued to guide the company’s growth and evolution, resulting in the development of the industry’s first AI-powered EHR Virtual Assistant, Samantha™.

NoteSwift is the leading provider of EHR Virtual Assistants and a proud sponsor of Healthcare Scene.


1 Comment

  • Just wondering, Mr. Crandall, if you have ever really taken a look at the ‘raw material’ that these voice recognition engines produce and if you did, would you be able to discern the errors it makes? I would hazard a guess that the 7.4% rate that you are touting is not correct. And if you are talking about systems that are not supported by HUMAN BEINGS, i.e., editors who are paid peanuts for listening to and fixing these errors, the rate must be even higher. And let’s not forget that the doctors themselves make errors, misspeak, etc., in their notes so when Samantha disseminates this information as structured data into the EMR, she may be compounding the error many times over. Why is it that people’s actual – as opposed to artificial – intelligence is being devalued? I think this blatant disregard is dangerous. Medical records are not just black and white. They have subtleties and nuances – something not picked up by this mechanical ear you deem to be intelligent.
