Devices & Diagnostics, Health IT

Meet the ‘preeminent AI company on earth,’ but can it succeed in healthcare?

Nvidia dominates the market for the chips that power artificial intelligence projects, but can the Silicon Valley tech company with roots in gaming and graphics succeed in healthcare?

 

Last year, my brother, then an employee at the Silicon Valley-based tech company Nvidia, declared that all the AI and deep learning happening in healthcare was being powered by Nvidia’s graphics processing units (GPUs).

I was aware of an AI partnership Nvidia had with Partners HealthCare but brushed aside his comments as coming from someone who had clearly imbibed a bit too much of the corporate Kool-Aid.

But recently Nvidia began popping up more and more in our healthcare world. Startups that MedCity has written about — such as Arterys and Recursion Pharmaceuticals — actually base their products on Nvidia’s chips and software infrastructure. Jen-Hsun “Jensen” Huang, Nvidia’s co-founder and CEO, was a featured speaker at the Partners HealthCare-hosted World Medical Innovation Forum in Boston, which focused on AI. The company has also placed bets on healthcare startups – it is an investor in Zebra Medical – and its Inception deep learning AI incubator counts more than 300 healthcare startups, per a company representative.

All of which means we need to take a deeper look at the Santa Clara, California-based company’s efforts to transform healthcare through AI. The mainstream business world has already heralded Huang as a visionary, but healthcare is a thorny beast. We have seen high-profile casualties, most notably IBM Watson’s partnership with MD Anderson and the scrutiny over its collaboration with Memorial Sloan Kettering. Google’s DeepMind has also stumbled badly in the U.K. in its efforts to bring AI-driven tools to the National Health Service.

So will Nvidia fare any better? [It’s important to provide a giant disclaimer here — as a result of my brother’s untimely death late last year, I have inherited some Nvidia stock and know some senior employees at the company. Huang also spoke at my brother’s memorial at Stanford University.]

Well, the first question to answer is whether Nvidia is as ubiquitous in healthcare’s machine learning and deep learning world as my late brother claimed. It appears he wasn’t far off.

“While Nvidia is very low in the stack in terms of the AI world, they produce the GPUs that are underlying a lot of the machine learning that everyone is using in healthcare,” said Dan Housman, managing director and chief technology officer of ConvergeHEALTH, the software products group at Deloitte.

A startup executive was more emphatic about the company’s deep learning chops in healthcare.

“There are no viable competitors to Nvidia GPUs for training deep learning models right now, as well as deploying inference services at scale,” said Dan Golden, head of machine learning at Arterys. “Although there are some alternatives (such as Google TPUs), Nvidia GPUs are the most cost-effective and available solution today.”

Arterys, a cloud-based imaging startup, is one of those AI companies that the FDA has actually blessed. Powered by Nvidia’s technology, Arterys can “analyze seven clinically-vital dimensions of heart blood flow data at the same time: three in space, one in time, and three in velocity direction,” wrote Karim Karti, president and CEO of GE Healthcare’s imaging business, in a recent MedCity post. The company’s first-of-its-kind, FDA-approved deep learning application allows radiologists to take cardiac MRI analysis from about one hour down to minutes, or even seconds, Karti explained.

GE Healthcare embraces Nvidia’s GPU platform
GE Healthcare is no stranger to this world of deep learning in imaging. In fact, performance was one reason [the other being cost] that GE began eyeing Nvidia’s technology back in 2007 for its CT machines. The company began to observe exponential increases in data quantity as well as dramatic increases in image precision, which meant the FPGAs [field-programmable gate arrays] being used by the CT machines were simply archaic.

“Our teams were trying to work around the performance requirements, and if you think about the evolution of CT since then, the number of turns has increased radically,” explained Keith Bigelow, general manager of analytics at GE Healthcare. “So you can think of the volume just from repetition increasing, and then the image precision or quality has also gone up dramatically. We would see jumps in data quantity, sometimes at a geometric scale, and it was at that point that the team realized we couldn’t continue on the path we were on.”

Which brought Nvidia into the GE orbit.

“That was when the GE CT team made the leap to Nvidia and began embedding GPUs in our CT machines,” he said. “Our sister groups across the organization started to see the volume of scans created by the CT team, as well as the volume of data being processed through those chips, and it became more and more interesting to them.”

As an example, GE’s Revolution Frontier CT is two times faster in image processing than its predecessor, due to its use of Nvidia’s AI computing platform, Bigelow pointed out in a recent phone interview.

“It also is expected to deliver better clinical outcomes in liver lesion detection and kidney lesion characterization because of its speed – potentially reducing the need for unnecessary follow-ups, benefitting patients with compromised renal function, and reducing non-interpretable scans with Gemstone Spectral Imaging Metal Artefact Reduction,” Bigelow claimed.

Currently, machine learning algorithms built on the Nvidia platform are found in everything from GE’s EKG devices to its Vivid and Logiq ultrasound devices, CT scanners, MR machines, mammography devices and X-ray machines.

In cardiac ultrasound, “this advanced tech allows cardiologists to look at the heart from an entirely new perspective and see subtle differences in the heart tissue that were previously impossible to detect,” Bigelow said.

And similarly on the software side, GE’s PACS (picture archiving and communications systems) and VNA (vendor neutral archive) — the software that radiologists use to read and interpret images — are also souped up with deep learning and machine learning algorithms to improve workflow, efficiency and hopefully, patient outcomes as well. Devices of all sizes and software run on the same Nvidia platform no matter what specific chipset is being used, which is important for consistency, he explained.

Bigelow, who joined GE about two years ago and began looking under the hood, realized how core Nvidia’s technology is to GE’s imaging business, and how rarely it is talked about.

“We are GE Healthcare and we talk about GE,” he said other GE employees told him. “I said, ‘I know, but Nvidia is kind of the preeminent artificial intelligence firm on earth and we’ve been partners with them for over a decade and nobody has ever mentioned it.’”

Nvidia’s reinvention from gaming to AI
Nvidia’s foray into healthcare is part of the company’s overall reinvention from a graphics and gaming company into an AI company – the company is also pursuing driverless cars, robots, and drones. It is a transformation that Kimberly Powell, Nvidia’s vice president of healthcare, has personally witnessed since the company began looking at medical imaging about 10 years ago.

The interest in imaging makes sense for a company that has made its name in the gaming industry where the ability to render images and create high-quality graphics is paramount.

“The instruments – CT scanners, MR, ultrasound and X-ray – the sensor technology was getting so great and they were producing so much data that they needed new, powerful supercomputers to process that data,” Powell said in a phone interview. “The old technologies just weren’t able to keep up. There were new algorithms that were very, very mathematically complex that needed to be used in order to create these new, amazing medical images and bring new clinical breakthroughs.”

Some of the early imaging algorithms used Nvidia’s accelerated computing architecture for different applications — for instance, reducing radiation dosages drastically in CT and making it safer for chronic disease patients to be imaged repeatedly, Powell said. But now the focus is on deep learning, which she described as a new approach to performing artificial intelligence.

“When this new computing paradigm showed up, some of the earliest adopters were in the medical imaging field, in both pathology and in radiology,” she recalled. “And so you can see that over the last several years, at all of these medical imaging research conferences, over 50 percent of the research is being done using deep learning now.”

Collaboration with Partners HealthCare
That deep learning expertise is being leveraged by Nvidia and its partners in at least two efforts that could fundamentally alter not only medical imaging and radiology but the worlds of pathology and genomics too.

The first is a collaboration with Partners HealthCare. Two years ago, Nvidia became the technical founder of the MGH (Massachusetts General Hospital) and BWH (Brigham and Women’s Hospital) Center for Clinical Data Science, which brings Nvidia’s know-how to advancing AI in radiology, pathology, and genomics. GE Healthcare joined as a partner last year, followed by Nuance Communications this year.

In a phone interview, Mark Michalski, executive director of the center, explained how GPUs are central to the deep learning models they are trying to build and deploy to transform patient care.

What GPUs do is solve many, many simple problems all at once. They have thousands of cores, each one dedicated to solving a simple problem. The reason they were created was to render graphics in video games, where you needed to do something called ray casting. To create these beautiful graphics you have to do a calculation of what should show on each one of these pixels, one at a time.

But it turns out that architecture is useful for lots of other things, and it also turns out that architecture may be similar to the way our brain works – maybe that’s why they are so good for neural networks.
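Michalski’s description maps neatly onto code. Here is a minimal Python sketch (my illustration, not the center’s software) of the per-pixel parallelism he is talking about: a convolution computes every pixel position independently, and moving it onto an Nvidia GPU lets thousands of cores evaluate those positions at once.

```python
import torch

# Each output pixel of a convolution is an independent small calculation,
# much like the per-pixel ray casting Michalski describes. On a GPU,
# thousands of cores evaluate those pixels simultaneously.
scan = torch.rand(1, 1, 512, 512)              # a synthetic grayscale "scan"
conv = torch.nn.Conv2d(1, 16, kernel_size=3, padding=1)

if torch.cuda.is_available():                  # falls back to CPU without a GPU
    scan, conv = scan.cuda(), conv.cuda()

features = conv(scan)                          # all 512x512 positions at once
print(features.shape)                          # torch.Size([1, 16, 512, 512])
```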

So the first set of things that we are going to try to do is take problems in radiology, and diagnostics at large, where we can use machine learning to interpret those images more rapidly, more quantifiably, or both.

Let’s say we have an image of a tumor. We want to be able to characterize that tumor faster, more quantitatively, or both. One way is to measure the volume of the tumor, and we can do that automatically with machine learning, which also makes it more quantifiable.
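The volume measurement Michalski mentions is, computationally, the easy last step. Here is a hedged Python sketch (not Partners’ actual pipeline; the voxel spacing is an assumed example value): once a model has produced a binary segmentation mask, volume is just voxel count times voxel size.

```python
import numpy as np

def tumor_volume_ml(mask: np.ndarray, voxel_spacing_mm=(1.0, 1.0, 1.0)) -> float:
    """Volume in milliliters of a 3-D binary tumor segmentation mask."""
    voxel_mm3 = float(np.prod(voxel_spacing_mm))
    return mask.astype(bool).sum() * voxel_mm3 / 1000.0

# Toy example: a 10 x 10 x 10-voxel "tumor" at 1 mm isotropic spacing.
mask = np.zeros((64, 64, 64), dtype=np.uint8)
mask[20:30, 20:30, 20:30] = 1
print(tumor_volume_ml(mask))                   # 1.0 (ml)
```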

What, however, is more exciting is the ability to see things that humans cannot.

“We are looking for other biomarkers in the images. We are looking for features in those images that we may not have a very good way of understanding or seeing as humans, but we may be able to see them with the help of a machine,” Michalski said. “That would be another nice thing.”

It’s a ‘nice thing’ Nvidia’s Powell said a radiologist at Mayo Clinic seems to have already witnessed. She mentioned Dr. Bradley Erickson, who was looking for biomarkers in brain tumors. The current standard is to do a biopsy and then perform pathology that yields a result indicating a treatment path. Based on his years of practice, Dr. Erickson knew which tumors responded to which treatment, and when he went back and looked at the images, he found that the actual biomarker information was inherent in the images themselves, Powell said.

Erickson has coauthored a study showing that convolutional neural networks could help identify a biomarker from MRIs in low-grade gliomas, a common brain tumor, and could be an alternative to surgical biopsy and histopathological analysis.

“Deep learning technology (which relies on Nvidia’s GPUs) has enabled us to routinely measure and find information in medical images that was unthinkable just five years ago,” Erickson wrote in an email. “We can now find molecular properties of tumors with accuracy that can rival histopathology (microscopic or DNA study of tissue).”

Nvidia’s Powell concluded: “There’s texture on the images to say which biomarker that tumor has, and as a radiologist, he wasn’t trained to see that, but the computer allowed him to see it.”
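For readers wondering what “using convolutional neural networks” looks like in practice, here is a bare-bones Python sketch of a CNN classifier over MRI slices. It is a generic illustration of the technique, not the architecture from Erickson’s study, and the two-class output (a biomarker present or absent, say) is an assumption for the example.

```python
import torch
import torch.nn as nn

# Generic CNN classifier over single-channel 2-D MRI slices -- an
# illustration of the technique, NOT the study's actual model.
class BiomarkerCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes),
        )

    def forward(self, x):                      # x: (batch, 1, H, W) slices
        return self.classifier(self.features(x))

model = BiomarkerCNN()
slices = torch.rand(4, 1, 128, 128)            # four synthetic MRI slices
logits = model(slices)
print(logits.shape)                            # torch.Size([4, 2])
```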

AI taking over human jobs in radiology
While exciting, that of course raises the obvious question of machine learning or AI replacing radiologists. Neither Powell nor Michalski of Partners was ready to concede that it is a zero-sum game when you pit machine against human in the world of radiology.

“When we try to draw a line around a tumor, AI does that very nicely. But other tasks don’t work so nicely,” Michalski said. “There are parts of it that definitely can be replaced – that a machine learning algorithm even today can do probably as well or better than a human. And then there are components that are much, much harder to automate. It’s not binary.”

For Powell, the question of machines replacing humans in radiology is entirely misplaced and too narrowly focused on the U.S.

“I actually see it in a different way. I just got back from Japan, and Japan has the lowest ratio of radiologists per capita of any of the developed countries — 35 radiologists for 1 million people. We have one for every 10,000 [roughly 100 per million, or nearly three times Japan’s ratio]. While we could think of it that way, [as AI replacing human radiologists] I don’t.”

Smarter people than I have raised the specter of massive employment disruption as AI gets adopted across industries. However, I doubt many could mount a strong argument against adoption if AI can indeed help in the early diagnosis of cancer, for instance.

Applying these models to clinical practice
For that to happen, these deep learning algorithms need to be used broadly in clinical practice. At Partners, the Center for Clinical Data Science is getting ready to do just that.

In a matter of weeks, deep learning algorithms will be rolled out for stroke care under an Institutional Review Board (IRB) approval. Here’s how Michalski described the effort.

We are trying to improve speed to diagnosis and to quantify stroke. We recently learned through the DAWN trial that we can treat people with strokes for much longer after the initial stroke than we originally thought we could. So what that means for the healthcare system is that we have to find ways to make sure that diagnosis happens rapidly and that patients get treated.

It was all too frequent that patients with stroke would come to you and you felt there was nothing you could do because the time window had passed. So diagnosis and characterization of strokes is more important than ever, not just at tertiary care facilities but at community hospitals and community centers, and machine learning can help us with some of those things.

GE’s Bigelow echoed the importance of the DAWN trial results, adding that “our teams are creating an AI-powered Stroke Solution Suite that simplifies and organizes images for faster and more accurate stroke detection and treatment in situations where time = brain.”

If the IRB studies validate the algorithms, the team will seek FDA approval to roll them out more broadly, Michalski said.

Comparisons to IBM Watson Health
Tech company partnerships with providers — which is essentially what the Nvidia-Partners HealthCare collaboration is — naturally prompt comparisons to the other big tech foray into healthcare AI: IBM Watson Health and its partnerships with Memorial Sloan Kettering and MD Anderson. Unless you have been living under a rock, you know about the many stumbles IBM Watson has had in bringing its AI expertise to healthcare, the most recent negative headline involving a round of layoffs.

Neither Powell nor Michalski offered any meaningful comment on why their partnership is different from IBM’s beyond the “we are solving problems together” and “each of us is a domain expert” rationale.

But there are actually some pretty big differences. First, a big medical device giant — GE Healthcare — is part of the mix, and that is “pretty interesting,” said Housman of ConvergeHEALTH at Deloitte, which in fact counts IBM Watson as a partner. [He added, however, that such partnerships aren’t unique: Amazon Web Services has a partnership with Philips Healthcare, though he couldn’t say whether that collaboration focuses on machine learning.]

Second, once the algorithms being developed are validated through IRBs, they are going to undergo FDA review – that much is clear from any conversation with any member of the trio, whether GE, Partners or Nvidia. IBM Watson’s current products on the market — such as Watson for Oncology, Watson for Genomics or Watson Clinical Imaging Review — haven’t required any regulatory approvals or clearances. Products in the pipeline, especially imaging products, may require them down the road, according to an IBM Watson spokeswoman.

And third, when was the last time you saw a TV ad heralding a new world in which Nvidia’s AI would chip away at the world’s healthcare problems? You didn’t, because there aren’t any, but you sure remember those IBM Watson ones, right?

While Powell and Michalski were reticent to comment on IBM Watson Health’s AI efforts in healthcare, GE’s Bigelow was less so.

IBM didn’t invent the X-ray. GE did. We have over 100 years of clinical expertise, and most of the groundbreaking work across every modality over the past 100 years has been pioneered by GE. And if you ask why this is a magical triumvirate of Nvidia, Partners, and GE, it’s because it’s best-in-class partners who know the domain: the clinical domain coming from Partners, the device and software domain coming from GE, and the GPU platform coming from Nvidia.

The devil is in the data
Well, that’s all very well, of course, but as anyone who has scratched the surface of AI knows, it’s really about how good the underlying data is. In other words, garbage in, garbage out.

It’s one thing for AI to deal with structured data like images. It’s quite another to look at EHR, clinical notes, claims and billing code data — much of which Partners is very much looking into — that are rife with human error. These additional data sets are necessary to create an accurate overall picture of patients, one not available from imaging data alone.

But the data problem is not simply an issue of error-prone or “dirty” data that needs to be cleaned up. The 800-pound gorilla in the room is data governance for all the different data silos. Here’s how Housman of Deloitte explained it.

There’s a natural concern with pointing random analytics against the EHR. Within the data governance topic is data stewardship. You mention cleaning the data so that it can be ready for these tools. The reality is that in the course of doing day-to-day work — just entering information into your EHR, going through the regular workflows, whatever happens in the clinic, the administration function — there’s all sorts of variability as to what gets entered at what place, and how and where, that can really impact the utility of the data.

And governance involves someone figuring out not just access but what we are going to do about cleaning things up, and at what layer of the data collection process we are going to clean it up. If the data isn’t captured at that point, we can’t train a model on it.

Someone has to decide, ‘actually, we will clean it up here,’ and now you have an entire workforce that curates documents from pathology notes — they read through the notes and extract the data. That whole system of cleaning and prepping data tends to fall down if governance hasn’t been planned out first.

Who’s going to do it? How much are we going to invest, what’s important to prioritize, and how far back do we go? We see that as being a pretty high priority item. If you go shopping for AI, you end up with this pretty big pile of data governance kitty litter in your shopping cart which you didn’t know you wanted. You wanted a cat.
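To see why that curation and governance work matters, consider a toy Python example (my own illustration, not Deloitte’s or Partners’ tooling) of extracting one structured field from a free-text pathology note. The note text and the pattern are hypothetical; the fragility is the point.

```python
import re

# Illustrative only: the kind of brittle, hand-built extraction that a
# curation workforce replaces. Pulls a Gleason score out of a free-text
# pathology note. The note and pattern are hypothetical examples.
note = "Prostate, left lobe, biopsy: adenocarcinoma, Gleason score 3 + 4 = 7."

match = re.search(r"Gleason score\s*(\d)\s*\+\s*(\d)", note)
if match:
    primary, secondary = map(int, match.groups())
    print({"gleason_primary": primary,
           "gleason_secondary": secondary,
           "gleason_total": primary + secondary})
else:
    # Variability in how notes are written is exactly the governance problem:
    # if the pattern misses, the data point silently drops out of the model.
    print("no Gleason score found")
```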

Data governance issues have already created a public nightmare for Google’s DeepMind over its work with the National Health Service in the U.K. Such problems could easily occur Stateside, said Kristin Feeney, a senior data scientist and Housman’s colleague at Deloitte’s ConvergeHEALTH.

Project Clara – the medical imaging supercomputer
Meanwhile, many paragraphs ago I noted that Nvidia is involved in at least two efforts that could radically alter the world of radiology and, ultimately, patient care. That second initiative, announced in March, is what the company calls Project Clara.

Here’s how Nvidia’s Powell explained the vision:

The idea behind this is that medical imaging has become computational. Today, 50 percent of medical imaging research is being done with deep learning. So when you have this paradigm shift and this huge explosion of computational demand – it has increased 10 times over the last six years – you really have to rethink the computing architecture that serves this important application in medical imaging.

Project Clara is a new computing platform for innovation and applications in medical imaging. Project Clara can bring all the modern computing architecture we’ve all enjoyed — our phones connected to the cloud, constant over-the-air updates, virtualization, remote access — to medical imaging, so that all the medical imaging devices in the installed base can be upgraded, new medical imaging devices can be augmented with supercomputers, and you can even imagine business models evolving to offer AI-as-a-service for medical imaging.

GE Healthcare’s Bigelow can hardly contain his excitement about what Project Clara could set in motion. Imagine being able to use the latest algorithms almost instantly, without having to go and buy new GE imaging equipment. What you achieve, in effect, is a software-defined medical device.

“We see an evergreen fleet and this is actually one of the challenges especially for higher-end medical devices where if you are a provider — a Tier 1 or Tier 2 hospital — looking at the acquisition of a new device, you are wondering, ‘Is this going to be like a car and it will depreciate as soon as I drive it off the lot?'” Bigelow explained. “The beauty of creating software-defined medical devices is really that we can keep them evergreen and they get better, they detect more medical conditions, they capture images better than the day you got them because the pooled data that drives the quality of image capture or the image interpretation simply gets better with every additional scan.”

This is not unlike self-driving cars getting smarter the more miles they drive. But apparently, Project Clara’s potential is even greater. Here’s Bigelow again:

Imagine a medical institution with 10 years of CTs. And they ask: can GE look through all of my historical scans and try to find, say, Stage 1 lung cancer? Because if we can intervene on a patient at Stage 1 instead of Stage 3 or Stage 4, we are going to have a radically better outcome. So here, when you think about distributed compute – well, GE is going to have all of the plumbing in place, from our devices to our health cloud to the Nvidia platform, both on premise and in the cloud, because we already support their cloud instances as well, so that we can dynamically and elastically spool up to handle the demand of 10 years of data at a clinical institution and spin back down again.

What would be the alternative?

Well, we could order tons of machines, but that would be a waste of money since this is a one-time exercise. From that point forward, we would be able to do inferencing on the fly.

And when we develop new algorithms in partnership with clinical institutions, we might spin up all that Clara compute again to run them through these images to see if we could find something else.

It’s like a hose: you can open it up to full power for a short period of time and then turn it down to a trickle to manage the ongoing run rate of new data coming in. That dynamic load balancing of compute is one of Clara’s key benefits for us.
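Bigelow’s “hose” pattern is a familiar one in cloud computing: burst a large worker pool through the backlog once, then drop to a trickle for new data. Here is a deliberately simplified Python sketch of the idea; the function and data names are placeholders I have made up, not a GE or Nvidia API.

```python
from concurrent.futures import ThreadPoolExecutor

def run_inference(scan_id: str) -> dict:
    # Placeholder: in practice this would send one scan to a GPU-backed
    # model service and return its findings.
    return {"scan": scan_id, "finding": "none"}

def process(scan_ids, workers: int):
    # The only knob that changes between "full hose" and "trickle" is the
    # size of the worker pool.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_inference, scan_ids))

backlog = [f"ct-{i}" for i in range(10_000)]         # ten years of studies
historical = process(backlog, workers=64)            # one-time elastic burst
todays = process([f"ct-new-{i}" for i in range(20)], workers=2)  # steady trickle
```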

So as new algorithms are developed through Clara, providers can virtually upgrade their current GE software and equipment, so that a Kentucky provider, say, can take advantage of an algorithm developed at Partners’ Center for Clinical Data Science.

Money, money, money, but no one is disclosing business models
While all this sounds fascinating, even to an English major like me, no one wants to disclose business models.

Theoretically, if GE and Partners develop algorithms that a provider in Kentucky can take advantage of through a subscription service, without having to buy a new device, workstation or PACS, shouldn’t Partners get a cut, given that its IP was involved?

“What I would say is that it is an evolving landscape [of business models],” Michalski of Partners said. “Probably best to ask GE or Nvidia what their business models would be.”

Bigelow, for once, was not sharing.

“We cannot disclose the exact commercialization plan at this time, but we can say that both Partners and GE Healthcare are excited about making these algorithms available to hospitals and health systems globally, thus bringing best-in-class technology and clinical expertise to patients around the world,” Bigelow declared. “We are committed to making these algorithms easy to access for our customers, regardless of whether they are authored by GE, by GE and Partners, or by any of the startups working with GE Healthcare.”

Which brings us, finally, to Nvidia and how the company makes money, or expects to, in healthcare.

“Research hospitals are among the early and largest adopters of our DGX supercomputers because we’ve essentially made an appliance that allows you to do deep learning, so now the domain experts have a platform to develop these applications,” Powell said. “As you know, our platform lives inside all these instruments and also lives in all of the workstations where radiologists do their review. That’s the way we monetize today.”

Nvidia doesn’t break out its healthcare revenue, but it noted in its last quarterly earnings call, on May 6, that Siemens is a new customer using its high-end GPUs for CT and ultrasound devices. The global medical imaging market is projected to grow to $46.2 billion by 2024, up from $29.5 billion in 2016.

Deloitte’s Housman thinks Nvidia has been clever in the way it has approached the healthcare industry.

“As they partner with specific healthcare institutions or specific software companies or instrument manufacturers or healthcare providers, it’s almost like they are the power company doing development work to get more factories built nearby, because it’s going to come back to them as everyone uses their GPUs,” Housman added. “So their strategy is quite smart in that they are doing so well with embedding GPUs into the cloud, in places like Amazon Web Services and, to some degree, Google.”

Selling more and more GPUs and supercomputers is, of course, great for investors, and Nvidia’s stock has appreciated tremendously over the past 12 months. As for Nvidia-powered AI transforming patient care and delivering better outcomes while lowering costs, we have to fall back on that overused but apt phrase: only time will tell.

Meanwhile, Chinese chip companies are not sitting idle as they hope to unseat Nvidia from its strong position in the AI market.

Photo: LeoWolfert, Getty Images

Update: The post has been updated with comment from Dr. Erickson from Mayo Clinic.