
FDA digital health draft guidance scales back regulation of certain types of software

Bradley Thompson of the law firm Epstein Becker Green expressed disappointment with portions of the long-anticipated clinical decision support draft guidance from the FDA.


The U.S. Food and Drug Administration released draft guidance on digital health regulation designed to reduce ambiguity over which clinical decision support and patient decision support tools need to be reviewed by the FDA and which don’t. The regulator also issued final guidance on standardizing the way safety, effectiveness and performance are assessed for Software as a Medical Device. But Bradley Merrill Thompson, general counsel for the industry group Clinical Decision Support Coalition, speaking for himself, said he was disappointed by some of the proposals, which the coalition had been waiting on for several years.

CDSC has worked to develop a proposal for how to draw the line between regulated and unregulated clinical decision support software.


Reducing the types of software tools that need FDA clearance would free up the regulator to focus its attention on technologies it regards as a higher priority. The FDA offered a few examples in its draft guidance documents of what would and would not require its approval.

  • Software that manipulates or analyzes images and other data obtained from a radiological device, such as a CT scanner, to create 3D models of a region for use in planning orthopedic or dental surgical treatments with a device still requires FDA clearance.
  • Software that identifies drug-drug interaction and drug-allergy contraindication alerts, based on FDA-approved drug labeling and patient-specific information, to prevent adverse drug events doesn’t require FDA clearance.
  • The FDA would still evaluate software that uses the results of home blood tests, required with the use of anticoagulants, to recommend dosing adjustments without the patient consulting their healthcare provider.

The FDA also shared a statement by Commissioner Dr. Scott Gottlieb prepared for a Senate HELP Committee hearing this week.

“We believe our proposals for regulating [clinical decision support] and [patient decision support] not only fulfill the provisions of the [21st Century] Cures Act, but also strike the right balance between ensuring patient safety and promoting innovation.”

Despite the intentions behind the new draft guidance, Thompson, a member of the law firm Epstein Becker Green in Washington, D.C., expressed disappointment with parts of it. The clinical decision support guidance had been six years in the making, but in an emailed statement responding to FDA Commissioner Scott Gottlieb’s comments, he said he couldn’t praise it.

He noted:

The problem is that FDA seems to have walked away from making a risk-based determination.

Big picture, clinical decision support software is software that takes some type of medical knowledge and applies it to an individual patient to make an individual recommendation. Based on that, people familiar with healthcare can probably readily see a wide spectrum of risk associated with such software.

At the risky end, there is software that makes very important and direct recommendations for specific chemotherapy treatment based on a wide variety of data. And if that software makes the wrong recommendation, it is quite possible the patient would suffer, even die.

He also shared what he had hoped the FDA would do:

“What I think many of us in [the] industry were hoping for was an effort by FDA to distinguish high from low risk as a basis for regulation. We didn’t get that. Worse, it appears based on the guidance that FDA is not interested in drawing that line.”

He also expressed alarm over the FDA’s conclusions on clinical decision support aided by machine learning, noting the agency didn’t seem to have any plans to differentiate how such software is regulated based on risk.

Photo: nevarpp, Getty Images