For Hospitals, EMRs Increase Costs For Several Years

In theory, the push for EMR installation is all about money, namely saving it.  In the short term, the hope is that hospitals will improve patient care and reduce the incidence of adverse events, but ultimately public policy types are paying for EMRs because they believe EMRs will squeeze costs out of the healthcare system.

However, some recent research from the National Bureau of Economic Research has poked a sharp stick in this thinking.  Researchers there have concluded that EMR adoption drives up costs at hospitals for years, regardless of the resources hospitals have at their disposal (though location matters a lot, as we'll see below).

The NBER study set out to measure the extent to which costs rose at U.S. hospitals between 1996 and 2009, and the relationship between this increase and hospital EMR adoption. The researchers found that across all hospitals, EMR adoption is initially associated with a rise in costs.

However, once the dust settles, big differences emerge between urban and rural hospitals. Hospitals situated close to IT organizations, usually those in urban areas, saw a decrease in costs after three years. Meanwhile, hospitals in “unfavorable” conditions, typically those in rural areas, saw a “sharp” increase in costs even after six years of operating their EMRs.

It is worth noting that cost savings from improved outcomes don’t seem to be figured into this analysis.  Nor does it seem to include any offsetting income from the marketing/PR value of a strongly branded EMR, which I believe has been too little studied.  Finally, the study’s results might be quite different if it covered, say, 2006 through 2012, as EMRs continue to mature and hopefully cost less to operate.

Still, these are troubling pieces of data. What cost trends have you seen at your hospital?  Do you think the NBER study is on target?

About the author

Anne Zieger

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.


  • Studies like this remind me of when the late George Carlin would do sports: “Here is a partial score, Princeton 86.”

    From the abstract, it seems that this is a partial score as well, costs only.

    These were all the rage when PCs first came into the work world. They would add up the costs of acquisition, training, maintenance, etc. However, it was a rare study that looked at the cost of the system being replaced, or attempted to quantify the ability to do things that could not be done under the old system.

  • What EMR solution on the market today decreases in cost as it matures? The expense of upgrading every 12-18 months alone drives the total up. A recent Fitch report noted that CFOs have not calculated an accurate TCO for EMRs beyond the first five years of ownership.

  • I’m curious as to whether anyone has analyzed the choice of EHR against the type of hospital. For instance, a small, rural hospital, IF it can get good Internet access (Federal grants help a lot of rural institutions, including colleges and perhaps hospitals), might do best with a cloud-based system. Yes, it would still need a shiny new network and PCs, but it wouldn’t need the large IT staff and computer rooms that would otherwise be required.
