
Better Data Quality Means a Better Future for Public Health


Quality health data is essential to quality healthcare. Providers can’t effectively treat patients without timely access to thorough and accurate data about each patient’s medical condition and history.

Likewise, quality data is critical to public health measures. Without quality health data, it’s virtually impossible for public health officials to develop and implement effective policies to prevent the spread of disease.

That scenario played out during the early days of the Covid-19 pandemic in 2020, when the U.S. response to the spread of the virus “was hampered by slow and inconsistent reporting of critical public health data,” according to the O’Neill Institute for National and Global Health Law at Georgetown University. Two years later, a lack of data on the monkeypox outbreak hindered efforts by the Centers for Disease Control and Prevention (CDC) to predict the virus’s path and impact.

For public health interventions to be effective, “data must be reliable: accurate, complete and timely; hence, the value of high-quality data,” writes the CDC. “In fact, not only are high-quality data beneficial, but poor-quality data might at times be worse than having no data at all, since they can lead to misguided decisions strengthened by the illusion of doing the right thing based on evidence.”

An emerging data infrastructure

Electronic health records (EHRs), health information exchanges (HIEs) and other public health information systems provide vital health data about specific populations and can be leveraged to inform public health priorities and guide policy. Public health data is exchanged among local, state and federal entities, including health registries, public health programs, research organizations, Medicare, Medicaid, corrections, vital records, and emergency management.


Unfortunately, persistent data-quality issues beginning at the provider level continue to undermine population health. These include:

  • Inaccurate and incomplete patient records. Incorrect data, such as misspelled names or wrong contact information, is commonly introduced into patient records during intake at a provider facility or over the phone. Data fields are also frequently left blank because a patient may not have the requested information, such as the name of a specialist or whether they’ve had a specific lab test.
  • Duplicate patient records. When patient records are inaccurate or incomplete, it is difficult to match them to the right patient. Consequently, intake workers often resort to creating a new record for a patient rather than spending valuable time searching for the existing records. This results in multiple records for individual patients, none of which is comprehensive (see the brief matching sketch after this list).
  • Inconsistent data stored in disparate systems across different institutions. While healthcare interoperability has come a long way over the past decade, too much patient and population data remains siloed and thus inaccessible to all stakeholders – frequently within the same organization, such as when a hospital lab struggles to share digital data with a clinical department.
  • Outdated reporting for health equity and social determinants of health (SDoH) measures. There is a growing need for updated reporting to guide health equity and SDoH initiatives, which are increasingly recognized as necessary to improve patient outcomes for vulnerable populations and reduce healthcare costs.
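
To make the duplicate-record problem concrete, the sketch below shows how two intake entries for the same person can look “different” to an exact-match system, and how basic normalization and fuzzy comparison can flag the pair for review. It is a minimal illustration only: the patient details, field names and similarity threshold are hypothetical, and real matching systems are considerably more sophisticated.

```python
# Hedged sketch: how small intake discrepancies make one patient look like two,
# and how normalization plus fuzzy comparison can flag the pair for review.
# The records, field names and 0.85 threshold are hypothetical.
from difflib import SequenceMatcher

def normalize(record):
    # Lowercase and strip punctuation/spacing so trivial typos don't block a match.
    return {field: "".join(ch for ch in str(value).lower() if ch.isalnum())
            for field, value in record.items()}

def similarity(a, b):
    # Average per-field string similarity between two normalized records.
    fields = a.keys() & b.keys()
    return sum(SequenceMatcher(None, a[f], b[f]).ratio() for f in fields) / len(fields)

entered_at_front_desk = {"first": "Katherine", "last": "O'Brien",
                         "dob": "1984-03-07", "phone": "555-210-4477"}
entered_over_the_phone = {"first": "Kathrine", "last": "OBrien",
                          "dob": "1984-03-07", "phone": "(555) 210 4477"}

score = similarity(normalize(entered_at_front_desk), normalize(entered_over_the_phone))
if score > 0.85:  # likely the same person entered twice at intake
    print(f"Possible duplicate (similarity {score:.2f}) - route to a human for review")
```

Even this crude comparison makes the point that many duplicates trace back to small typos at intake rather than to genuinely different patients.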

Toward better health data standards

In addition to its negative impact on patient and population health, low-quality data costs healthcare organizations by decreasing operational efficiency. Healthcare workers spend unnecessary time searching for, correcting, or creating what often become duplicate patient records, while providers may reorder tests that, unknown to them, have already been performed.

Further, patient data riddled with duplicate records, inaccuracies and omissions is of less value to health organizations as an asset that could be monetized through its sale to research organizations, pharmaceutical and technology companies, and other buyers.

Improving the quality of data for use in public health will require the implementation of and adherence to data and interoperability standards. The good news is that a number of initiatives aim to improve how health data is collected and shared, including programs from the CDC, the Agency for Healthcare Research and Quality, and the Regenstrief Institute.

It’s also important that healthcare organizations take proactive steps to ensure they are collecting and exchanging quality data. Applying standards of practice for data collection, for example, would make intake workers accountable for what they put into EHRs.

Healthcare organizations can accelerate their efforts to improve patient and population data quality by replacing or supplementing outdated, disparate systems that do not communicate effectively across platforms. Modern technology solutions can eliminate low-quality data such as duplicate patient records and overlays, which occur when one patient’s medical information is placed in another patient’s file.

A master patient index (MPI) that uses artificial intelligence (AI) and machine learning can help healthcare organizations improve the quality of their patient data while meeting state and federal measurement and reporting requirements. These modern data platforms allow healthcare organizations to:

  • Improve return on investment in their data
  • Aggregate data access
  • Free up staff time and resources with better tools and processes
  • Enhance health equity and SDoH initiatives
  • Be better prepared for the next public health crisis
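
For readers curious what the matching inside such a platform looks like, here is a hedged, highly simplified sketch of the probabilistic record-linkage scoring (in the spirit of the classic Fellegi-Sunter approach) that many MPIs build on. The field weights, thresholds and sample records are invented for illustration and are not 4medica’s or any other vendor’s actual parameters; production systems add machine-learned weights, blocking strategies and human review queues.

```python
# Hedged sketch of probabilistic record-linkage scoring (Fellegi-Sunter style),
# the kind of logic at the core of many master patient indexes. Weights,
# thresholds and records are invented for illustration only.

# (agreement_weight, disagreement_weight) per field: positive evidence when the
# field matches, negative evidence when it does not.
WEIGHTS = {
    "last_name": (4.1, -2.3),
    "dob":       (5.6, -4.0),
    "ssn_last4": (6.2, -3.1),
    "zip":       (2.0, -0.8),
}
AUTO_LINK, AUTO_REJECT = 9.0, 3.0  # decision thresholds on the summed score

def match_score(rec_a, rec_b):
    # Sum agreement weights for fields that match, disagreement weights otherwise.
    return sum(agree if rec_a.get(field) == rec_b.get(field) else disagree
               for field, (agree, disagree) in WEIGHTS.items())

def decision(score):
    if score >= AUTO_LINK:
        return "link records automatically"
    if score <= AUTO_REJECT:
        return "keep records separate"
    return "queue for data-steward review"

a = {"last_name": "nguyen", "dob": "1990-12-02", "ssn_last4": "7731", "zip": "90012"}
b = {"last_name": "nguyen", "dob": "1990-12-02", "ssn_last4": "7731", "zip": "90401"}

score = match_score(a, b)
print(f"score={score:.1f}: {decision(score)}")  # strong agreement -> link records
```

Scores that land between the two thresholds are queued for human review; the staff-time savings listed above come in part from AI and machine learning reducing how many record pairs fall into that gray zone.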

Public health is heavily dependent on collecting and sharing accurate patient data. Standards for data collection and interoperability can move the needle toward better health data, but it is up to healthcare organizations to take ownership of their data quality by following best practices and adopting technologies that can detect and eliminate bad patient data before it is disseminated.

Photo: metamorworks, Getty Images


Gregg Church is President of 4medica with 20-plus years of health IT experience. Since joining 4medica in 2010, Church has held positions of increasing responsibility, helping lead the company through its 20-year transformation of laboratory services, from its industry-first cloud-based integration platform for driving EHR adoption to its clinical data exchange and enterprise master patient index technologies that improve patient safety, operational sustainability and profitability for diverse healthcare organizations. He is a tireless advocate of the national need to reduce duplicate patient records across disparate systems to less than 1 percent.

This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers. Click here to find out how.