In theory, the push for EMR installation is all about money: saving it, that is. In the short term, the hope is that hospitals will improve patient care and reduce the incidence of adverse events, but ultimately public policy makers are paying for EMRs because they believe EMRs will squeeze costs out of the healthcare system.
That being said, some recent research from the National Bureau of Economic Research has poked a sharp stick in this thinking. Researchers there have concluded that EMR adoption drives up costs at hospitals for years, regardless of the resources those hospitals have at their disposal (though location matters a lot, as we'll see below).
The NBER study set out to measure the extent to which costs rose at U.S. hospitals between 1996 and 2009, and the relationship between that increase and hospital EMR adoption. Across all hospitals, the researchers found that EMR adoption is initially associated with a rise in costs.
However, once the dust settles, big differences emerge between urban and rural hospitals. Hospitals situated close to IT organizations, usually those in urban areas, saw a decrease in costs after three years. Meanwhile, hospitals in “unfavorable” conditions, typically those in rural areas, saw a “sharp” increase in costs even after six years of operating their EMRs.
It is worth noting that cost savings from improved outcomes don't seem to be figured into this analysis. Nor does it seem to include any offsetting income from the marketing/PR value of a strongly branded EMR, which I believe has been too little studied. Finally, the study's results might be quite different if it covered, say, 2006 through 2012, as EMRs continue to mature and, one hopes, cost less to operate.
Still, these are troubling pieces of data. What cost trends have you seen at your hospital? Do you think the NBER study is on target?