
Why ChatGPT In Healthcare Could Be a Huge Liability, Per One AI Expert


ChatGPT is an AI model that analyzes and generates language, answering questions and producing in-depth written content that humans can easily understand. The technology, developed by OpenAI and launched last November, has been dominating headlines lately — from news of its freshly launched application programming interface, to announcements from various companies adopting the chatbot, to reports of the many fun and silly ways the technology can be used.

So how does ChatGPT fit into healthcare? 

One healthcare AI expert thinks that the tool could have dangerous implications for the field if it’s not used with the proper context.

“I think ChatGPT is an awesome advancement. There’s a bunch of really cool things that it can do. But it’s a tool. A hammer can be used to build a house or bludgeon somebody to death — it’s about how you use the tool that matters. ChatGPT is not necessarily scary, but its application in places where it will cause harm is the scary part,” Carta Healthcare CEO Matt Hollingsworth said in a recent interview.

Hollingsworth’s company develops AI software to reduce the amount of time clinicians spend on mundane administrative tasks. He has years of experience in the healthcare AI space, and he said one thing that worries him about ChatGPT as it pertains to healthcare is the technology’s high level of accessibility.

Most Americans have used chatbots before, often during customer service interactions. Because of this, people feel comfortable using ChatGPT — not only to produce things like a poem about their pet or a new dating app bio, but also to get a medical diagnosis, Hollingsworth pointed out.

This is worrisome because ChatGPT's objective is to produce convincing-sounding content, not accurate content.

“ChatGPT has no concept of truth or correctness, or even the probability of correctness, like some algorithms do. Its job is to be convincing. That means the algorithm can generate stuff that is entirely wrong. For instance, a couple days ago I asked it to write a bio about me out of curiosity. It wrote a nice three-paragraph essay which sounded great but was 100% wrong,” Hollingsworth said.

For applications where accuracy doesn’t matter — such as creative writing — ChatGPT can be an effective productivity tool. But for uses where accuracy does matter — such as people self-diagnosing a medical condition — the tool could be dangerous because it sounds authoritative and factual even while delivering completely incorrect information, Hollingsworth explained. People already turn to sources like Reddit and TikTok when they have a question about a medical concern they’re experiencing — Americans putting too much trust in ChatGPT to answer these questions could make this phenomenon a much bigger problem.

Hollingsworth doesn’t think health systems will start using ChatGPT on their websites to answer patients’ medical questions. That, he said, would end up being far too much of a liability.

Hollingsworth also mentioned some other applications of ChatGPT in the healthcare space that are more innocuous. For example, he said providers and insurers could use the technology to produce messages to each other during the prior authorization, claims and denial processes.

“I can imagine a world in which they’re just using ChatGPT to argue with one another. It’s kind of funny thinking about it, but I can almost guarantee they’re gonna be arguing with one another about this stuff that way. I would just be shocked if they didn’t since it’s just such an obvious application,” Hollingsworth said.
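To make that scenario concrete, here is a minimal sketch of what a provider-side draft might look like, written in Python against OpenAI's newly released application programming interface mentioned above. The model name, prompt wording and claim details are purely illustrative assumptions, not anything drawn from an actual payer or provider workflow.

```python
import os

import openai

# The API key is read from the environment; set OPENAI_API_KEY before running.
openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical claim details -- purely illustrative, not real patient data.
claim_summary = (
    "Prior authorization request for an MRI of the lumbar spine. "
    "Patient has had six weeks of conservative treatment (physical "
    "therapy, NSAIDs) without improvement."
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the chat model the API exposed at launch
    messages=[
        {
            "role": "system",
            "content": (
                "You draft professional correspondence between a healthcare "
                "provider and an insurer."
            ),
        },
        {
            "role": "user",
            "content": (
                "Draft a concise prior authorization appeal letter. "
                f"Case: {claim_summary}"
            ),
        },
    ],
    temperature=0.3,  # keep the wording relatively conservative
)

# Print the generated draft.
print(response["choices"][0]["message"]["content"])
```

Given Hollingsworth's point about convincing-but-wrong output, a draft like this would be a starting point for a human to review and edit, not a letter that goes out the door on its own.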

ChatGPT could also be used to help doctors write up summaries after patient visits. This would not only save clinicians time, but could also improve the quality of their communication, as not everyone is a great writer, Hollingsworth pointed out.

Photo: venimo, Getty Images
