Microsoft rolled out several AI tools at the HLTH conference last week to help providers harness the power of data to deliver care more efficiently. The overarching goal is to reduce costs, ease burnout and improve outcomes. The October 10 announcement marks the availability of Microsoft Fabric, which was rolled out broadly in May, specifically for the healthcare vertical: for clinicians and administrators but, perhaps most importantly, for patients.
One of the biggest challenges of healthcare data is that it is fragmented and exists in silos across one or many organizations. What Microsoft wishes to do is access this disparate data — be it clinical data sitting in EHRs, imaging data sitting in imaging systems or unstructured data in a clinical note — de-identify it and integrate it. Once the data becomes part of one huge data lake, it can be queried, and AI algorithms can be built on top of it for use in various applications, explained Umesh Rustogi, general manager, Microsoft Healthcare Industry Cloud, on Sunday, October 8, in a presentation in Las Vegas to journalists ahead of the formal announcement. Rustogi added that “a care manager can then use this data to draw insights about gaps in care, assess the cost of the gap in care, use this data to then create machine learning models for creating or making predictions about risk of readmission, making predictions about propensity to have certain diseases and so on.”
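Conceptually, the pipeline Rustogi describes (de-identify, integrate, then query) can be sketched in a few lines. This is purely an illustrative sketch, not Microsoft Fabric's implementation; the table names, fields and records below are invented, and real de-identification under HIPAA involves far more than hashing an identifier.

```python
import hashlib
import pandas as pd

# Invented sample records from two siloed systems (illustrative only)
ehr = pd.DataFrame({
    "patient_id": ["P001", "P002"],
    "name": ["Jane Doe", "John Roe"],  # direct identifier, to be dropped
    "diagnosis": ["emphysema", "osteopenia"],
})
imaging = pd.DataFrame({
    "patient_id": ["P001", "P002"],
    "study": ["chest x-ray", "dexa scan"],
})

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    """Drop direct identifiers and replace the patient ID with a one-way hash."""
    out = df.drop(columns=[c for c in ("name",) if c in df.columns]).copy()
    out["patient_key"] = out.pop("patient_id").map(
        lambda pid: hashlib.sha256(pid.encode()).hexdigest()[:12]
    )
    return out

# Integrate the de-identified silos into one queryable table: a data lake in miniature
lake = deidentify(ehr).merge(deidentify(imaging), on="patient_key")
print(lake[["patient_key", "diagnosis", "study"]])
```

Once the silos share a common de-identified key, the merged table is what a care manager or a machine learning model would query for gaps in care or readmission-risk features.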
A Microsoft customer, Doug King, Northwestern Medicine’s senior vice president and CIO, was on hand to sing the praises of this unified data capability. He explained that, especially after the large-scale consolidation among health systems in the marketplace, a lot of siloed data sits in the various systems of a now-merged entity. King said:
And so not having that in a single area, single curated type of place makes it very difficult to really gain insights out of that data and to be able to have actionable things for a provider or a physician or a hospital operator to make decisions on how we can improve and better patient care. Especially when you start getting into the world of AI and all the promise that it can bring, all the great things it will be able to do for healthcare, it’s very difficult, if not impossible, to get great AI type of models with disparate data. With Fabric, one of the things that we are very hopeful for is… a curated data set, to be able to train algorithms on a more robust data set that is secure and to be able to generate algorithms that are higher quality and that we can train faster.
Data is both the bane and the boon of healthcare. How well Microsoft and other companies integrate disparate data from a wide variety of sources, clean it, reduce its noise, and remove errors and duplication (assuming this is precisely what Microsoft is helping health systems do) will determine how good the data is, and in turn how powerful the resulting algorithms can be. Otherwise, be prepared for what else but a lot of legal liability.
Generative AI use case in the patient portal
While all the data curation, data integration, data de-identification — done right — can help healthcare organizations harness actionable insights, Microsoft’s afternoon presentation was most notable — to a non-technical, non-healthcare person like me — for what it can mean for patients. For this, Microsoft is leaning on generative AI and those capabilities were highlighted by Linishya Vaz, principal project manager, Health and Life Sciences at Microsoft.
Imagine you’ve had some bloodwork or radiological images done. The language of the radiologist’s report is replete with scientific terminology — it’s almost deliberately designed to keep patients in the dark or guessing about what the reports say. I’ve taken to Google search to divine the meaning of imaging and other lab reports, as I am sure countless people have over the years.
Microsoft is aiming to make this easier for patients, assuming of course that their imaging provider/health system is a Microsoft customer and has deployed these capabilities.
In the hypothetical example that Vaz shared, a patient with chest pain has undergone a chest X-Ray. The report reads like this:
“Left anterior chest wall dual-lead pacer stable from prior examination. Lungs hyperinflated but clear. No pneumothorax or pleural effusion. Pulmonary vasculature normal. Heart size normal. Osseous structures demineralized, however intact.
Impression: Hyperinflated lungs in keeping with emphysema. Osteopenia.”
Now Microsoft’s generative AI capability can translate the scientific jargon into plain English and, per Vaz’s presentation, create a report that the patient can view in the patient portal.
* A device with two leads (dual-lead pacer) is stable in the left front part of your chest, as seen in the previous examination.
* Your lungs are overinflated, which is consistent with a condition called emphysema
* There is no air (pneumothorax) or fluid (pleural effusion) around your lungs
* The blood vessels in your lungs appear normal.
* The size of your heart is normal
* Your bones show a decrease in density (demineralization), but they are intact. This is called osteopenia.

In conclusion:
Your lungs are overinflated, which is consistent with emphysema, and you have osteopenia (decreased bone density).
The patient would also see this all-important disclaimer at the bottom of the report: This simplified version was generated by an AI assistant.
The power of this technology in helping patients decipher a radiology or lab report (assuming, of course, that the generative AI is simplifying accurately) cannot be overstated. It could all but eliminate the need for patients to conduct time-consuming web searches to understand what is happening to them, and make them feel empowered.
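To make the simplification step concrete: the crudest possible stand-in is a glossary lookup that swaps jargon for plain-language glosses. Microsoft’s feature presumably relies on a large language model rather than anything like this; the sketch below, with its invented mini-glossary, only illustrates the input-to-output transformation patients would see.

```python
import re

# Invented mini-glossary mapping radiology jargon to plain language (illustrative only)
GLOSSARY = {
    "pneumothorax": "air around the lung (pneumothorax)",
    "pleural effusion": "fluid around the lung (pleural effusion)",
    "osteopenia": "decreased bone density (osteopenia)",
    "hyperinflated": "overinflated",
}

def simplify(report: str) -> str:
    """Replace each glossary term with its plain-language gloss, case-insensitively."""
    for term, gloss in GLOSSARY.items():
        report = re.sub(term, gloss, report, flags=re.IGNORECASE)
    return report

print(simplify("No pneumothorax or pleural effusion. Impression: osteopenia."))
```

A real system has to handle negation, context and terms outside any fixed list, which is exactly why generative models, with a human-readable disclaimer attached, are being used instead.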
Ambient AI in a clinical encounter
Microsoft has been touting its ambient AI capability ever since it acquired Nuance (whose earlier incarnation had bought Dragon Systems, makers of the popular Dragon speech recognition software). On October 8, I saw it in action. I volunteered to play the role of a patient (not hard to do, given that I recently broke my foot playing pickleball, which requires me to wear an orthopedic boot). The physician’s role was played by an actual user of the ambient AI assistant, branded DAX Copilot: Dr. Matthew Anderson, senior vice president and medical director of virtual health for Advocate Health and Atrium Health’s senior medical director for primary care. [Advocate Aurora Health and Atrium Health completed their merger in December 2022, becoming the 5th-largest nonprofit health system in the country.]
Dr. Anderson asked me questions about how I got hurt while DAX Copilot was running on a device nearby. Soon after our encounter concluded, DAX Copilot had summarized the information for all to see on a wall-mounted screen. It didn’t render the clinical note as a free-flowing conversation; it created sections — there was a section for family history, for example. From what I saw, it incorrectly reported two words I used. I said, “instead of falling flat, my foot had fallen perpendicular on the court.” The ambient AI heard “pendicular” and “cord.”
OK, so it’s not foolproof, but it gets most of the grunt work done, and the doctor can review and fix any errors before approving the note. In our journalistic reporting, we also use audio-to-text AI transcription services, and like DAX, they are largely correct. We still review them for accuracy and can play back the audio, but it’s a significant timesaver over listening to the audio and transcribing it ourselves. AI still requires humans in the loop, and the bar for accuracy is of course much higher in clinical encounters, where patient safety is paramount.
Microsoft’s ambient AI is great for physicians in that it automates the creation of the clinical note. But it’s also great for patients used to doctors turning away from them to type the encounter into the health system’s electronic health record. You have your doctor’s full attention, and a screen is not in the way.
How much will such capabilities cost health systems? That was an answer no human would give me that day or later.
Photo: metamorworks, Getty Images