Artificial Intelligence

How AI almost led Dr. Farzad Mostashari down the wrong path on health equity

In a virtual panel discussion about public and population health at J.P. Morgan’s annual conference, the CEO of Aledade shared how the quest for efficiency among harried physician practices almost led the company to use a flawed machine learning algorithm to prioritize outreach to those practices’ at-risk patients.

Hospitals, payers and providers are increasingly pursuing the dual goals of greater efficiency and lower costs, and many are looking to machine learning algorithms to deliver both without compromising care quality or patient outcomes. But leveraging AI/ML comes with its own pitfalls, and human judgment can make the difference between reinforcing systemic racism and alleviating it.

That was the message of Dr. Farzad Mostashari, former National Coordinator for Health IT and CEO of Aledade, who participated in a panel discussion about public and population health moderated by Hemant Taneja, managing partner at venture capital firm General Catalyst, alongside co-panelist Caroline Savello, chief commercial officer of Color, the Burlingame, California-based genomics testing and population health company. [A third panelist, Dr. Robert Wachter, professor and chair of the Department of Medicine at UCSF, also joined the panel, which was part of the annual virtual J.P. Morgan Healthcare conference.]


The conversation inevitably turned to health equity, a topic that is top of mind for scores of healthcare stakeholders given how Covid-19 has laid bare the inequities in the U.S. health system.

Savello painted the health equity challenge as largely an issue of access. Drawing on Color’s experience in the pandemic, she argued that the infrastructure stood up in response to Covid — whether through government or private effort — can and should be leveraged to attack the problem of healthcare access in underserved communities.

Savello described how, even as Color provides vaccines and Covid testing in historically Black churches, it is also performing HIV screening and testing in those same locations.

“Color is running 8,000 healthcare delivery sites now across the country,” she declared. “We have started piloting HIV tests in African American churches next to our vaccine and testing sites. We see 40% opt-in rates for that HIV testing and screening because it’s right there, it’s convenient.”

What’s even more revealing is a different statistic: well over 60 percent of those who opted in had either never been screened for HIV in their lives or hadn’t been in the past year. In other words, going where the community is leads to higher engagement.

“When you see childhood immunizations happening in schools or you see HIV testing happening in historical African American churches, I think it really changes the nature of how even individuals think about healthcare and where they can access it for the basics,” Savello said.

When it was Mostashari’s turn to comment on health equity and Covid’s impact on communities of color, he framed the discussion as a national reckoning on race happening simultaneously with the tumult brought on by George Floyd’s murder in May 2020.

Later, he described how health equity is hard to achieve when the system doesn’t even require providers to measure disparities.

“We [Aledade] take global risk and so we have 1.7 million patients of practices that we have partnered with – over 1,000 practices. And we get health plan contracts or Medicare contracts that reward the practices and us if we can reduce hospitalizations, reduce the bad stuff while improving quality,” he explained. “And there’s no requirement there that you even look at disparities. And so I think that’s where we should start. All these value-based payment models, there should be a requirement that you stratify — and in fact CMS could even do this for us — stratify the quality reporting that we are already doing, the utilization reporting and the cost reporting, by race.”
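Stratified reporting of the kind he describes is mechanically simple once race is captured alongside the metrics. Here is a minimal sketch in Python, assuming a hypothetical per-patient extract; none of these column names come from Aledade, CMS or any actual reporting schema.

```python
import pandas as pd

# Hypothetical per-patient extract; all column names are illustrative.
patients = pd.DataFrame({
    "race": ["Black", "White", "White", "Black", "Hispanic", "White"],
    "quality_measure_met": [1, 1, 0, 0, 1, 1],  # e.g. BP controlled
    "hospitalized": [0, 0, 1, 1, 0, 0],
    "total_cost": [1200.0, 800.0, 9500.0, 11000.0, 600.0, 700.0],
})

# The same quality, utilization and cost metrics practices already
# report, broken out by race instead of pooled across the whole panel.
stratified = patients.groupby("race").agg(
    n=("race", "size"),
    quality_rate=("quality_measure_met", "mean"),
    hospitalization_rate=("hospitalized", "mean"),
    mean_cost=("total_cost", "mean"),
)
print(stratified)
```

The point of his “CMS could even do this for us” aside is that no new measurement is required: the aggregation above runs on reporting data that already exists, it just adds a group-by key.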

He said one can’t wait for perfect data to start measuring by race. But the most interesting comment from Mostashari came in the form of an anecdote he shared on the dangers of AI/ML in healthcare, reproduced verbatim below (a code sketch of the pattern he describes follows the quote).

I hadn’t planned on sharing this but it may be interesting in the context of technology, in particular thinking about ML/AI applications. It, to me, was a great example of where we need human judgment around the questions we are asking.

We wanted our practices to reach out to patients who need care during this time, who are suffering for lack of primary care, and it’s population health 101, right, to reach out to patients.

It turns out that the practices are pretty busy, they are understaffed and it’s hard to reach out to a bunch of patients. So the question that was asked was, ‘Gee, maybe we should prioritize this list not only on the basis of the risk that the person has, but also on the basis of their likelihood to be successfully engaged by the practice.’

Seems like a good question, right? Let’s do the most useful work. Let’s put higher up on the list the people [with whom] you are going to have a higher conversion rate. So we did that. We had a nice ML model that would dramatically improve the experience of the practices, to have more efficiency. And then, thank goodness, one of our staff said, ‘Well, what’s the impact this is going to have on racial equity?’

And we found that while 15 percent of our patients are Black, only 8 percent of those deemed most impactable by a phone call were Black.

This is the very definition of systemic racism. Why? Because historically, when those people got called, either the phone number wasn’t working, or they hadn’t updated it, or they weren’t picking up the phone for whatever reason, or when they picked up the phone, the way the person on the other end spoke to them resulted in lower engagement rates. So we can ask ML/AI to predict who is going to be less responsive and it will tell us, ‘Hey, yeah, don’t call those Black people.’ And it will just exacerbate the disparities we already have in the system.

Or we can ask a different question: ‘What predicts a higher engagement from the Black population?’

And I just think it would have been so easy for us to just [say], ‘Here’s a little random forest thing, put the thing in, get the answer, sort the list’ and never even [have] thought about how we were further reinforcing the disparities that exist in our system.
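The failure mode he describes is easy to reproduce end to end. Below is a minimal sketch, not Aledade’s actual model or data: every field name, effect size and number is invented for illustration. It builds a synthetic panel in which race never enters the model, trains the “little random forest thing,” sorts the call list, and then runs the audit Aledade’s staffer asked for.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical patient panel; roughly 15% of patients are Black,
# matching the share Mostashari cites.
panel = pd.DataFrame({
    "risk_score": rng.uniform(0, 1, n),
    "prior_visits": rng.poisson(3, n),
    "is_black": rng.random(n) < 0.15,
})

# Historical engagement is lower for Black patients for the structural
# reasons he lists (stale phone numbers, prior poor experiences). The
# same history leaks into an innocuous-looking feature -- how old the
# phone number on file is -- which then acts as a proxy for race.
panel["phone_on_file_years"] = rng.exponential(
    np.where(panel["is_black"], 4.0, 2.0))
p_true = np.clip(0.7 - 0.1 * panel["phone_on_file_years"], 0.05, 0.95)
panel["engaged_last_outreach"] = rng.random(n) < p_true

# The naive pipeline: predict engagement (race is never an input) and
# sort the outreach list by predicted probability.
features = ["risk_score", "prior_visits", "phone_on_file_years"]
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(panel[features], panel["engaged_last_outreach"])
panel["p_engage"] = model.predict_proba(panel[features])[:, 1]

# The audit that saved them: who actually lands at the top of the list?
top = panel.sort_values("p_engage", ascending=False).head(n // 10)
print(f"Black share of panel:   {panel['is_black'].mean():.0%}")
print(f"Black share of top 10%: {top['is_black'].mean():.0%}")
```

On this synthetic data the top of the call list under-represents Black patients relative to the panel, the same shape as the 15-percent-versus-8-percent gap he quotes, even though race is never a model input; the proxy feature carries it in.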
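His reframed question, what predicts higher engagement from the Black population, points toward ranking within groups rather than across them, and toward examining drivers of engagement inside the subgroup. Continuing from the hypothetical sketch above (same caveats apply):

```python
# Rank within each group and take the top slice of each, so one
# group's historically stale contact data can't push all of its
# members to the bottom of a shared list.
panel["grp_rank"] = panel.groupby("is_black")["p_engage"].rank(
    ascending=False, pct=True)
equitable = panel.nsmallest(n // 10, "grp_rank")
print(f"Black share, within-group top 10%: "
      f"{equitable['is_black'].mean():.0%}")

# And to answer his literal question -- what predicts engagement among
# Black patients -- fit on that subgroup and inspect the drivers.
black = panel[panel["is_black"]]
sub = RandomForestClassifier(n_estimators=100, random_state=0)
sub.fit(black[features], black["engaged_last_outreach"])
print(dict(zip(features, sub.feature_importances_.round(2))))
```

Within-group ranking is only one possible remedy; dropping proxy features or reweighting historical labels are others, and which is appropriate depends on the program. The durable lesson of his anecdote is the audit itself: check the demographic composition of a model’s output before acting on it.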

Photo credit: Andrii Shyp, Getty Images