As consumers, we’re prone to give away our health information for free on the internet, like when we ask Dr. Google “how to treat a broken toe.” Yet the idea of our physician using artificial intelligence (AI) for diagnosis based on an analysis of our healthcare data makes many of us uncomfortable, a Pew Research Center survey found.
So how much more concerned might consumers be if they knew massive volumes of their medical data were being uploaded into AI-powered models for analysis in the name of innovation?
It’s a question healthcare leaders may wish to ask themselves, especially given the complexity and liability associated with uploading patient data into these models.
What’s at stake
The more the use of AI in healthcare and healthcare research becomes mainstream, the more the risks associated with AI-powered analysis evolve — and the greater the potential for breakdowns in consumer trust.
A recent survey by Fierce Health and Sermo, a physician social network, found 76% of physician respondents use general-purpose large language models (LLMs), like ChatGPT, for clinical decision-making. These publicly available tools offer access to information such as potential medication side effects, diagnosis support and treatment planning recommendations. They can also capture physician notes from patient encounters in real time via ambient listening, an increasingly popular approach to lifting administrative burden from physicians so they can focus on care. In both cases, mature practices for incorporating AI technologies are essential, such as treating an LLM’s output as a fact check or a starting point for exploration rather than relying on it to answer complex care questions.
But there are signs that the risks of leveraging LLMs for care and research need more attention.
For example, there are significant concerns around the quality and completeness of patient data being fed into AI models for analysis. Most healthcare data is unstructured, captured in open notes fields in the electronic health record (EHR), patient messages, images and even scanned, handwritten text. In fact, half of healthcare organizations say less than 30% of their unstructured data is available for analysis. There are also inconsistencies in which types of data fall into the “unstructured data” bucket. These factors limit the big-picture view of patient and population health, and they increase the chances that AI analyses will be biased, reflecting data that underrepresents specific segments of a population or is incomplete.
And while regulations surrounding the use of protected health information (PHI) have kept some researchers and analysts from using all the data available to them, the sheer cost of data storage and information sharing is a big reason why most healthcare data is underleveraged, especially in comparison to other industries. So is the complexity associated with applying advanced data analysis to healthcare data while maintaining compliance with healthcare regulations, including those related to PHI.
Now, healthcare leaders, clinicians and researchers find themselves at a unique inflection point. AI holds tremendous potential to drive innovation by leveraging clinical data for analysis in ways the industry could only imagine two years ago. At a time when one in six adults uses AI chatbots at least once a month for health information and advice, demonstrating the power of AI in healthcare beyond “Dr. Google” while protecting what matters most to patients, like the privacy and integrity of their health data, is vital to securing consumer trust in these efforts. The challenge is to maintain compliance with the regulations surrounding health data while getting creative with approaches to AI-powered data analysis and utilization.
Making the right moves for AI analysis
As the use of AI in healthcare ramps up, a modern data management strategy requires a sophisticated approach to data protection, one that puts the consumer at the center while meeting the core principles of effective data compliance in an evolving regulatory landscape.
Here are three top considerations for leaders and researchers in protecting patient privacy, maintaining compliance and, ultimately, preserving consumer trust as AI innovation accelerates.
1. Start with consumer trust in mind. Instead of simply reacting to regulations around data privacy and protection, consider the impact of your efforts on the patients your organization serves. When patients trust your organization to leverage their data safely and securely, they are more willing to share it for AI analysis, which is vital both to optimizing AI solutions and to building personalized care plans. Today, 45% of healthcare industry executives surveyed by Deloitte are prioritizing efforts to build consumer trust so consumers feel more comfortable sharing their data and making it available for AI analysis.
One important step to consider in protecting consumer trust: implement robust controls around who accesses and uses the data—and how. This core principle of effective data protection helps ensure compliance with all applicable regulations. It also strengthens the organization’s ability to generate the insight needed to achieve better health outcomes while securing consumer buy-in.
2. Establish a data governance committee for AI innovation. Appropriate use of AI in a business context depends on a number of factors, from the risks involved to the maturity of data practices to relationships with customers. That’s why a data governance committee should include experts from health IT as well as clinicians and professionals across disciplines, from nurses to population health specialists to revenue cycle team members. This ensures the right data innovation projects are undertaken at the right time and that the organization’s resources provide optimal support. It also brings all key stakeholders on board in weighing the risks and rewards of AI-powered analysis and in establishing the right data protections without unnecessarily stifling innovation. And rather than “grading your own work,” consider whether an outside expert could help determine whether the right protections are in place.
3. Mitigate the risks associated with re-identification of sensitive patient information. It’s a myth that simple anonymization techniques, like removing names and addresses, are sufficient to protect patient privacy. In reality, bad actors can often re-identify individuals by piecing together the quasi-identifiers, such as ZIP code, birth date and sex, that remain in supposedly anonymized data. This calls for more sophisticated approaches to protecting data at rest from re-identification, an area where a generalized approach to data governance is no longer adequate. A key strategic question for organizations becomes: “How will our organization address re-identification risks, and how can we continually assess them?” One way to probe that question is sketched below.
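To make the re-identification risk concrete, here is a minimal, illustrative sketch in Python of a k-anonymity check, one common way to quantify how exposed a dataset is: a record is considered risky when its combination of quasi-identifiers is shared by fewer than k rows. The records, column names and threshold below are hypothetical, not drawn from any real dataset or from the author’s work.

```python
# Minimal k-anonymity check (illustrative only; records and column
# names are hypothetical). Even with names removed, a unique combination
# of quasi-identifiers can single a patient out.
from collections import Counter

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

# "Anonymized" records: direct identifiers removed, quasi-identifiers kept.
records = [
    {"zip": "37203", "birth_year": 1984, "sex": "F"},
    {"zip": "37203", "birth_year": 1984, "sex": "F"},
    {"zip": "37209", "birth_year": 1971, "sex": "M"},  # unique combination
]

def k_anonymity(rows, quasi_ids=QUASI_IDENTIFIERS):
    """Smallest group size sharing the same quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return min(groups.values())

def risky_rows(rows, k=2, quasi_ids=QUASI_IDENTIFIERS):
    """Rows whose quasi-identifier combination appears fewer than k times."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return [r for r in rows if groups[tuple(r[q] for q in quasi_ids)] < k]

print(k_anonymity(records))  # 1 -> at least one patient stands out
print(risky_rows(records))   # the record a bad actor could try to link
```

In practice, organizations pair checks like this with mitigations such as generalization, suppression or differential privacy, and they reassess continually, since re-identification risk grows with every new dataset that can be joined to the original.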
While healthcare organizations face some of the biggest hurdles to effectively implementing AI, they are also poised to introduce some of the most life-changing applications of this technology. By addressing the risks associated with AI-powered data analysis, healthcare clinicians and researchers can more effectively leverage the data available to them — and secure consumer trust.
Timothy Nobles is the chief commercial officer for Integral. Prior to joining Integral, Nobles served as chief product officer at Trilliant Health and head of product at Embold Health, where he developed advanced analytics solutions for healthcare providers and payers. With over 20 years of experience in data and analytics, he has held leadership roles at innovative companies across multiple industries.