MedCity Influencers, Opinion

Precision medicine will have a transformative effect on clinical trials

Experimental models such as virtual clinical trials are designed with patient-generated data in mind.


Evidentiary data has always had a tremendous impact on medical research. Much of that data has been generated via traditional clinical trials, which are often costly, lengthy and resource intensive but have played an important role in advancing medicine. However, researchers today have the opportunity to extract and use routine patient data to develop targeted treatments and medications to support precision and personalized medicine. As policy makers and the healthcare community recognize and embrace its potential, precision medicine will alter how research is performed and knowledge is created.

The National Institutes of Health and Food and Drug Administration launched a $1.45 billion initiative earlier this year to bring together data from multiple populations and sources — including electronic health records (EHRs) — to improve care and outcomes. In recent hearings on the 21st Century Cures Act, both agencies told the House Energy and Commerce Committee that information from EHRs is essential for precision medicine. NIH leaders intend to gather the health records and DNA of 1 million people.

Clinicians, researchers, and professional societies have no shortage of information. That’s both good news and bad news. Referential information is growing exponentially, thanks in part to the proliferation of publications, and researchers and clinicians can be overwhelmed trying to stay abreast of the latest developments in their fields.

At the same time, workflow is a focus: how do we provide the right information at the right time to the right professional, in the right environment and on the right platform? Should it be delivered on a handheld device or within an EMR?

It takes a great deal of time, energy and funding to conduct clinical studies and randomized clinical trials. The process stretches over several years: performing the trial, collecting the data, then analyzing and publishing the results. Once that occurs, expert advisory panels interpret the published results and provide recommendations on best practices.

So, the basis for referential and workflow content comes from “the literature.” Publications report on the clinical trials, and practitioners hopefully incorporate the new findings into treatment. However, there’s another readily available source of data, and it won’t take years to develop or analyze.

One definition of precision medicine is the ability to extract patient data from electronic repositories to answer clinical questions more precisely. By pulling patient data – some of it genomic, some not – from EMRs or imaging systems, we can immediately query that data with analytics. We can visualize, plot and structure the data to respond to key clinical problems.

To apply precision medicine in this manner, ecosystems will be created to pull and aggregate large patient databases from major medical centers such as Stanford University Hospital or Cleveland Clinic. Existing patient data is essentially treated as one huge clinical trial. By amassing and extracting from these data sets, medical professionals can see first-hand how a group of patients responded in a specific way over multiple years. By normalizing patient data to search retrospectively for trends and successes, practitioners can more quickly assess and improve patient care.

Some key early adopters have already explored this alternative thinking through experimental models such as virtual clinical trials, which are designed with patient-generated data in mind. Participants are recruited through digital avenues such as social media platforms, then sent remote monitoring, diagnostic and reporting tools to use in the convenience of their own homes. This addresses the major roadblocks of traditional clinical trials: limited access to the few designated sites across the country, the associated administrative costs and the high drop-out rate among participants.

Even considering the above, these approaches will not make traditional clinical trials and their published results obsolete in the near future, because standard trials remain necessary to ensure the safety and efficacy of treatments under stringent regulatory standards. The integration of digital and electronic patient engagement and assessment tools adds a layer of complexity to compliance with those standards. But as the FDA continues to reconfigure its approval process for digital health and medical technologies, these models will start to gain traction, bringing reduced costs and speedier results with them.

If these models do take hold, will they dissuade some researchers from undertaking lengthy studies? And how will this affect the cost of patient care?

Also, what about privacy and security? All academic medical centers are now huge repositories of data. Assuming that patient data will be relayed anonymously for analysis, will patients and their providers be satisfied that their privacy has been maintained? Trust will, and should, be an issue.

Precision medicine will become the predominant way that patient outcomes are demonstrated and used. By democratizing and changing the research infrastructure, researchers will be able to provide healthcare teams with the best and most timely data.

After all, who doesn’t want their medical care to be based on the most optimal information available?

Photo: FotografiaBasica, Getty Images

John Danaher, M.D.

Dr. John Danaher is President of Clinical Solutions at Elsevier, a global information analytics business that helps institutions and professionals advance healthcare, open science and improve performance for the benefit of humanity.

This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers. Click here to find out how.