The other day, I read an interesting piece about the University of Vermont Medical Center’s plans to create an integrated EMR connecting its four network hospitals. The article noted that unlike its peers in some other states, UVMC was required to file a Certificate of Need (CON) application with the state before proceeding with the work. And that struck me as deserving some analysis.
According to a story appearing in Healthcare Informatics, UVMC plans to invest an initial $112.4 million in the project, which includes an upgrade to informatics, billing and scheduling systems used by UVMC and network facilities Central Vermont Medical Center, Champlain Valley Physicians Hospital and Elizabethtown Community Hospital. The total costs of implementing and operating the integrated system should hit $151.6 million over the first six years. (For all of you vendor-watchers, UVMC is an Epic shop.)
In its CON application, UVMC noted that some of the systems maintained by network hospitals are 20 years old and in dire need of replacement. It also asserted that if the four hospitals made upgrades independently rather than in concert, it would cost $200 million and still leave the facilities without a connection to each other.
Given the broad outline provided in the article, these numbers seem reasonable, perhaps even modest given what execs are trying to accomplish. And that would be all most hospital executives would need to win the approval of their board and steam ahead with the project, particularly if they were gunning for value-based contracts.
But of course, the fact that the numbers look sound doesn’t mean such investments aren’t risky, or don’t stand a chance of triggering a financial meltdown. There are countless examples of health systems whose health IT projects have caused major financial problems (like this and this), operational problems (particularly in this case) or forced difficult tradeoffs (such as this). And their health IT decisions can have a major impact on the rest of the marketplace, which sometimes bears the indirect costs of any mistakes they make.
Given these concerns, I think there’s an argument to be made for requiring hospitals to get CONs for major health IT investments. If there’s any case to be made for CON programs at all, I can’t see why it wouldn’t apply here. After all, the idea behind them is to look at the big picture rather than the incremental successes of one organization. If investment in, say, MRIs can increase costs needlessly, the big bucks dropped on health IT systems certainly could.
Part of the reason I sympathize with these requirements is that I believe healthcare IS fundamentally different from any other industry, and that, as a public good, it should face oversight that other industries do not. Simply put, healthcare costs are everybody’s costs, and that’s unique.
What’s more, I’m all too familiar with the bubble in which hospital execs and board members often live. Because they are compelled to generate the maximum profit (or excess) they can, there’s little room for analyzing how such investments impact their communities over the long term. Yes, the trend toward ACOs and population health may mitigate this effect to some degree, but probably not enough.
Of course, there are lots of arguments against CONs, and ultimately against government intervention in the marketplace generally. If nothing else, it’s obvious that CON board members aren’t necessarily impartial arbiters of truth. (I once knew a consultant who pushed CONs through for a healthcare chain, who said that whichever competitor presented the last, not the best, statistics to the room almost always won.)
Regardless, I’d be interested in studying the results of health IT CON requirements in five or ten years to see whether they had any measurable impact on healthcare competition and costs. We’d learn a lot about health IT market dynamics, don’t you think?