A few months ago, in a move that hasn’t gotten a lot of attention, the AMA and MedStar Health made an interesting play. The physicians’ group and the health system released a joint framework designed to rank EMR usability, then used that framework to score a number of widely-implemented systems.
What makes these scores interesting is not that they’re just another set of rankings — those are pretty much everywhere — but that the researchers focused on EMR usability. As any clinician will tell you (and many have told me), despite years of evolution, EMRs are still a pain in the butt to use. And clearly, market forces are doing little to change this. Looking at where widely-used systems rate on usability is a refreshing look at a neglected issue.
To score the EMRs, researchers dug into EMR vendor testing reports from ONC. This makes sense. After all, though the agency doesn’t use this data for certification, the ONC does require EMR vendors to report on user-centered design processes they used for eight capabilities.
And while the ONC doesn’t base EMR certifications on usability, my gut feeling is that the data source is pretty reliable. I would tend to believe that given they’re talking to a certifying authority, vendors are less likely to fudge these reports than any they’d prepare for potential customers.
According to the partners, Allscripts and McKesson were the highest-scoring EMR vendors, gaining 15 out of 15 points. eClinicalWorks was the lowest-scoring EMR, getting only 5 of 15 possible points. In-betweeners included Cerner and MEDITECH, which got 13 points each, and Epic, which got 9 points.
And here are the criteria for the rankings:
- User Centered Design Process: EMRs were rated on whether they had a user-centered design process, how many participants took part (15+ was best) and whether test participants had a clinical background.
- Summative Testing Methodology: These ratings focused on how detailed the use cases relied upon by the testing were and whether usability measures focused on appropriate factors (effectiveness, efficiency and satisfaction).
- Summative Testing Results: These measures focused on whether success rates for first-time users were 80% or more, and on how substantive the descriptions of areas for improvement were.
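To make the rubric above concrete, here is a minimal sketch of how a 15-point tally might work. The 15-point total and the three criterion groups come from the framework as described; the per-criterion point values, field names, and thresholds within each group are my own assumptions for illustration, since the article doesn’t spell out the exact breakdown.

```python
# Hypothetical scoring sketch for the AMA/MedStar-style rubric.
# Only the 15-point total and the three criterion groups come from
# the article; per-item point values and field names are assumed.

def score_emr(report: dict) -> int:
    """Tally a 0-15 usability score from a vendor's usability test report."""
    points = 0

    # 1. User-centered design process (assumed 5 points)
    if report.get("has_ucd_process"):
        points += 2
    if report.get("participant_count", 0) >= 15:  # 15+ participants rated best
        points += 2
    if report.get("participants_clinical"):       # clinical background
        points += 1

    # 2. Summative testing methodology (assumed 5 points)
    if report.get("detailed_use_cases"):
        points += 2
    if {"effectiveness", "efficiency", "satisfaction"} <= set(report.get("measures", [])):
        points += 3

    # 3. Summative testing results (assumed 5 points)
    if report.get("first_time_success_rate", 0.0) >= 0.80:
        points += 3
    if report.get("substantive_improvement_notes"):
        points += 2

    return points

# A report that meets every criterion scores the full 15 points.
strong_report = {
    "has_ucd_process": True,
    "participant_count": 20,
    "participants_clinical": True,
    "detailed_use_cases": True,
    "measures": ["effectiveness", "efficiency", "satisfaction"],
    "first_time_success_rate": 0.92,
    "substantive_improvement_notes": True,
}
print(score_emr(strong_report))  # → 15
```

The point of the sketch is that each criterion is a simple pass/fail check against the vendor’s own testing report, which is what makes the framework easy for other health systems to apply.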
Given the spotty results across the population of EMRs tested, it seems clear that usability hasn’t been a core concern of most vendors. (Yes, I know, some of you are saying, “Boy howdy, we knew that already!”)
Perhaps more importantly, though, it can be inferred that usability hasn’t been a priority for the health systems and practices investing in these products. After all, some of the so-so ratings, such as that for the Epic product, come from companies that have been in the market forever and have had the time to iterate a mature, usable product. If health systems were demanding that EMRs be easy to use, the scores would probably be higher.
Frankly, I can’t for the life of me understand why an organization would invest hundreds of millions (or even a billion) dollars in an EMR without being sure that clinicians can actually use it. After all, a good EMR experience can be very attractive to potential recruits as well as current clinicians. In fact, a study from early last year found that 79% of RNs see the hospital’s EMR as one of the top 3 considerations in choosing where to work.
Maybe it’s an artifact of a prior era. In the past, perhaps the health systems investing in less-usable EMRs were just making the best of a shoddy situation. But I don’t think that excuse plays anymore. I believe more providers need to adopt frameworks like this one, and apply them rigorously.
Look, I know that EMR investment is a complex dance. And obviously, notions of usability will continue to evolve as EMRs evolve — so perhaps it can’t be the top priority for every buyer. But it’s more than time for health organizations to take usability seriously.