Digitizing medical records can be a thorny issue for many providers. Supporters of implementation argue it will reduce healthcare costs, while opponents protest that it increases them. Weighing in on the debate, a business school's study findings provide some ammunition for both sides.
The study, conducted by the Kellogg School of Management at Northwestern University in Chicago, collected data from 4,231 hospitals and showed that a provider's location can be a significant factor in implementation costs.
Healthcare IT News highlighted some of the study’s findings:
“The study found that, for many hospitals adopting EMRs between 1996 and 2009, costs actually increased for the institution. However, if the hospital was located in a ‘strong IT location,’ such as an urban region, costs sharply declined after a one-year adoption time period. These costs typically fell below what hospitals were paying for IT services before EMR adoption.”
The study showed that three years after implementation, providers in strong IT locations saw expenditures fall by more than 3 percent with a basic EMR system and by 2.2 percent with an advanced EMR system. Hospitals situated farther from IT-dense areas, on the other hand, showed more limited cost reductions after three years, the article said.
The expertise of a provider's IT team is also a critical factor. Software costs could be reduced if provider staff are sufficiently knowledgeable about EMR systems and comfortable using advanced clinical software.
The cost issue for EMR is understandably a contentious one, particularly for providers with limited budgets facing tough decisions on resource allocation. But providers who delay EMR implementation too long face a different set of costs in the form of punitive Medicare reimbursement reductions set down by the Affordable Care Act.