Diagnostics, MedCity Influencers

Real-time Interpretation: The next frontier in radiology AI

To date, AI has demonstrated value in its ability to handle asynchronous tasks such as image triage and detection. What's even more interesting is the potential to enhance real-time image interpretation by giving the computer context that lets it work with radiologists instead of trying to replace them.

In the nine years since AlexNet spawned the age of deep learning, artificial intelligence (AI) has made significant technological progress in medical imaging, with more than 80 deep-learning algorithms approved by the U.S. FDA since 2012 for clinical applications in image detection and measurement. A 2020 survey found that more than 82% of imaging providers believe AI will improve diagnostic imaging over the next 10 years, and the market for AI in medical imaging is expected to grow 10-fold over the same period.

Despite this optimistic outlook, AI still falls short of widespread clinical adoption in radiology. A 2020 survey by the American College of Radiology (ACR) revealed that only about a third of radiologists use AI, mostly to enhance image detection and interpretation; of the two-thirds who did not use AI, the majority said they saw no benefit in it. In fact, most radiologists would say that AI has not transformed image reading or improved their practices.

Why is there such a huge gap between AI’s theoretical utility and its actual use in radiology? Why hasn’t AI delivered on its promise in radiology? Why aren’t we “there” yet?

The reason isn’t that companies haven’t tried to innovate. It’s that they tried to automate away the radiologist’s job and failed, burning plenty of investors and leaving them reluctant to fund other projects aimed at translating AI’s theoretical utility into real-world use cases.

AI companies seem to have misunderstood Charles Friedman’s fundamental theorem of biomedical informatics: it isn’t that a computer can accomplish more than a human; it’s that a human using a computer can accomplish more than a human alone. Creating this human-machine symbiosis in radiology will require AI companies to understand:

  • The radiologist’s clinical proficiency, so they can build algorithms that give the computer that context
  • The discrete tasks of the workflow, so they can build tools that automate the rote or tedious ones
  • The user’s experience, so they can build an intuitive interface

Together, these features, delivered as a unified cloud-based solution, would simplify and optimize the radiology workflow while augmenting the radiologist’s intelligence.

History Lessons

Modern deep learning dawned in 2012, when AlexNet won the ImageNet challenge, leading to the resurgence of AI as we think of it today. With the problem of image classification sufficiently solved, AI companies decided to apply their algorithms to images that have the greatest impact on human health: radiographs. These post-AlexNet companies can be viewed as falling into three generations.

The first generation approached the field with the assumption that AI know-how was sufficient for commercial success, and so focused on building early teams around algorithmic expertise. However, this group drastically underestimated the difficulty of acquiring and labeling medical imaging data sets large enough to train these models. Without sufficient data, these first-generation companies either failed or had to pivot away from radiology.

The second generation corrected for the failures of their predecessors by launching with data partnerships in hand – either with academic medical centers or large private healthcare groups. However, these startups encountered the twin problems of integrating their tools into the radiology workflow and building a business model around them. As a result, they ended up building functional features without gaining any commercial traction.

The third generation of AI companies in radiology realized that success required an understanding of the radiology workflow, in addition to the algorithms and data. These companies have largely converged on the same use case: triage. Their tools rank-order images based on their urgency for the patient, thereby sorting how work flows to the radiologist without interfering in the execution of that work.
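The triage pattern these companies converged on is simple to state in code. Below is a minimal sketch in Python, assuming a hypothetical `urgency_score` already produced by an upstream detection model; the field names and scores are illustrative, not any vendor's API:

```python
from dataclasses import dataclass

@dataclass
class Study:
    accession: str
    urgency_score: float  # e.g., a detection model's probability of a critical finding

def build_worklist(studies):
    """Rank-order studies so the most urgent reaches the radiologist first,
    without changing how any individual study is read."""
    return sorted(studies, key=lambda s: s.urgency_score, reverse=True)

worklist = build_worklist([
    Study("ACC-001", 0.12),  # routine follow-up
    Study("ACC-002", 0.91),  # suspected critical finding
    Study("ACC-003", 0.47),
])
print([s.accession for s in worklist])  # → ['ACC-002', 'ACC-003', 'ACC-001']
```

Note that the reordering touches only the queue, never the interpretation step itself – which is exactly why triage tools could be adopted without disrupting how radiologists read.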

The third generation’s solutions for the radiology workflow are a positive advance that demonstrates there is a path toward adoption, but there is still much more AI could do beyond triage and worklist reordering. So where should the next wave of AI go in radiology?

Going For The Flow

To date, AI has demonstrated value in its ability to handle asynchronous tasks such as image triage and detection. What’s even more interesting is the potential to enhance real-time image interpretation by giving the computer context that lets it work with the radiologist.

There are many aspects of the radiologist’s workflow that radiologists want improved and that AI-driven context could optimize and streamline. These include, but are certainly not limited to:

  • Setting the radiologist’s preferred image hanging protocols
  • Auto-selecting the proper reporting template for the case
  • Ensuring the radiologist’s dictation lands in the correct section of the report
  • Removing the need to repeat image measurements for the report
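To make one of these micro-optimizations concrete, consider auto-selecting a reporting template from study metadata. This is a minimal sketch under assumed inputs: the metadata fields (`modality`, `body_part`) and template names are hypothetical, not drawn from any particular RIS or PACS:

```python
# Map (modality, body part) study metadata to a reporting template.
# All keys and template names here are illustrative assumptions.
TEMPLATES = {
    ("CT", "HEAD"): "ct_head_noncontrast",
    ("CT", "CHEST"): "ct_chest",
    ("MR", "KNEE"): "mr_knee",
    ("CR", "CHEST"): "chest_radiograph",
}

DEFAULT_TEMPLATE = "general_diagnostic"

def select_template(modality: str, body_part: str) -> str:
    """Auto-select the reporting template for a study, falling back to a
    general template when the combination is unrecognized."""
    return TEMPLATES.get((modality.upper(), body_part.upper()), DEFAULT_TEMPLATE)

print(select_template("ct", "head"))     # → ct_head_noncontrast
print(select_template("US", "ABDOMEN"))  # → general_diagnostic
```

Each such shortcut is trivial on its own; the argument of this piece is that their value comes from shipping many of them together.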

Individually, a shortcut that optimizes any one of these workflow steps – a micro-optimization – would have a small impact on the overall workflow. But the collective impact of an entire compendium of these micro-optimizations on the radiologist’s workflow would be quite large.

In addition to its impact on the radiology workflow, the concept of a “micro-optimization compendium” makes a feasible and sustainable business possible, whereas it would be difficult, if not impossible, to build a business around a tool that optimizes just one of those steps.

Radiology Tools for Thought

In other areas of software development, we are witnessing a resurgence in “tools for thought” – technology that extends the human mind – and in these areas, creating a product that improves decision making and user experience is table stakes. Uptake of this idea is slower in healthcare, where computers and technology have failed to improve usability and workflow and remain poorly integrated.

The number and complexity of medical images continue to increase as novel applications of imaging for screening and diagnosis emerge, but the total number of radiologists is not growing at the same rate. The ongoing expansion of medical imaging therefore requires better tools for thought. Without them, we will eventually reach a breaking point when we cannot read all of the images generated, and patient care will suffer.

The next wave of AI must solve the workflow of real-time interpretation in radiology and we must embrace that technology when it comes. No single feature will address this problem. Only a compendium of micro-optimizations, delivered continually and at high velocity via the cloud, will solve it.

Photo: metamorworks, Getty Images



Francisco Gimenez

Francisco is a partner at 8VC and focuses on Bio-IT investments and Enterprise AI.

Francisco previously was the Resident Data Scientist at Formation8 Partners, where he worked with portfolio companies to strategize, prototype, and recruit for data products. He was the founder of Catenus Science, a data science consulting and recruiting firm that used an apprenticeship model to help early-stage companies build data science teams. Francisco received his Ph.D. from Stanford in Biomedical Informatics, where he was a Ruth L. Kirschstein Fellow. His research focused on clinical decision support for radiology, which won him the Martin Epstein Award for best paper at the American Medical Informatics Association in 2014. He was the commencement speaker for the Stanford School of Medicine in 2015.

Prior to that he got his B.S. in Electrical Engineering and Computer Sciences from UC Berkeley while doing research in Parkinson’s disease at UCSF.

This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers. Click here to find out how.
