Health IT, Health Tech, Legal

Does Epic’s CEO have a point on privacy?

With data-blocking rules expected to go into effect soon, patients should finally be able to access and freely share their health data. But should those rules also include privacy protections for those who opt to share their data with health apps?

Epic CEO and Founder Judy Faulkner attends 2019 Forbes Healthcare Summit. Faulkner has raised concerns over HHS’ proposed information blocking rule, saying it could compromise patient privacy.

A new policy that would accelerate the sharing of health data — and of which Epic CEO Judy Faulkner is no fan — will soon go into effect. The head of the Office of the National Coordinator for Health IT (ONC) said on Wednesday that the final version of the information blocking rule will come out “relatively soon.”

Though Dr. Don Rucker couldn’t share an exact date with lawmakers, he said the agency had been finding a balance between promoting transparency while protecting patients’ private health information, a concern that some groups have raised with the proposed rule.

In late January, shortly before the final rule was expected to be released, Epic Systems CEO Judy Faulkner reportedly sent an email to hospital CEOs asking them to sign a letter to the Department of Health and Human Services raising their concerns with the proposed policy. Dozens went along with it — 60 hospital systems signed the letter, according to CNBC.

“By requiring health systems to send patient data to any app requested by the patient, the ONC rule inadvertently creates new privacy risks,” Epic wrote in a public statement on its home page shortly after the emails were reported.

Many viewed Epic’s campaign as an attempt to stymie competition. Health and Human Services Secretary Alex Azar denounced the company’s concerns as “scare tactics,” and a separate group of healthcare experts has pushed the ONC to publish the final rule immediately. But when it comes to health apps, does Epic’s CEO actually have a point about patient privacy?

First, some context. Existing legislation, such as the 21st Century Cures Act, requires health systems and payers to share patient information. The proposed new rule would clarify and add teeth to those interoperability requirements. For instance, it would make it easier for patients to access their health data, while fining health IT vendors up to $1 million per violation when they block the sharing of information. It also encourages healthcare companies to adopt standardized APIs, which would make it easier to share health information across software systems — including smartphone apps.
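The standardized APIs referenced in the rule are based on HL7’s FHIR specification, which returns patient records as structured JSON that any app can parse the same way. As a rough sketch of why that matters (the sample values below are illustrative, not from a real record), here is what reading a FHIR Patient resource looks like:

```python
import json

# A minimal FHIR R4 Patient resource -- the kind of standardized JSON
# a certified API returns for a request like GET [base]/Patient/{id}.
# Field values here are illustrative examples, not real patient data.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"use": "official", "family": "Chalmers", "given": ["Peter"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(patient_json)

def display_name(resource: dict) -> str:
    """Build a human-readable name from a FHIR Patient resource."""
    name = resource["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

print(display_name(patient))  # -> Peter Chalmers
```

Because every conforming system exposes the same resource shape, the same few lines work against any vendor’s API — which is exactly what lets third-party apps plug in so easily.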

For the thousands of patients who have struggled to carry their records between health systems, the regulations offer a long-awaited solution. Epic’s largest competitor, Cerner, has notably endorsed the rule. But despite the overarching support, some providers have raised concerns over the details, out of fear that patients might be giving away more information than they know by opening their records to health apps.

Sharing data is ‘business as usual’
Most healthcare apps share data, whether users realize it or not. These privacy concerns have existed long before the ONC’s information blocking rule was first drafted. But the rule’s emphasis on open APIs highlights a need for more transparency on what patients are consenting to when they agree to share their health information. 

“Where people might be a little surprised… folks sort of assume your health information might be protected no matter who has it because it’s always sensitive and it’s always personal,” said Dianne Bourque, an attorney with Mintz Levin who is an expert in healthcare privacy and data security. “If you’re downloading an app off the app store, that information is likely not protected.”

Numerous studies of consumer-facing health apps have confirmed that the majority of them do, in fact, use that data for commercial purposes. Most apps that patients download from the app store aren’t subject to HIPAA, meaning there are fewer restrictions on how they can use that data.

A study published in September by Quinn Grundy, an assistant professor of nursing at the University of Toronto, showed 71% of the top medication apps had at least one promotional strategy involving user data. Grundy said she had picked 24 of the top-rated Android apps for medication adherence, all of which included some degree of sensitive health information.

“We had hoped these apps would show a better degree of privacy than apps for monitoring your diet or going for a run,” Grundy said. “We found sharing is pretty on par with other types of apps in general on the Android platform. …  It kind of represents business as usual.”

Grundy and her team created fake user profiles and tracked how the apps shared 28 different types of user data, ranging from email addresses and device operating systems to physician names and drug lists. The most common type of data sharing was pretty innocuous, such as sharing device information to detect bugs and ensure the app was functioning properly. But a third of the tested apps shared data with analytics or advertising companies.

“We could tell the app sent your time zone to a digital advertising company. Or they sent your information to data analytics companies,” Grundy said. “Potentially, when you put together all of the little pieces, you have a clearer picture of the person even if you don’t know their name.”

Other researchers have pointed to similar results. Another study, published in JAMA Network Open, showed the majority of the top 36 apps for depression and smoking cessation transmitted data to services provided by Facebook or Google. Of the 36 apps, 29 shared data with these services, but only 12 accurately disclosed this in a privacy policy.

Testing conducted by Consumer Reports showed that all five of the top period-tracking apps shared data with advertisers and marketers. Some of that data — such as whether a user intends to become pregnant — is valuable to advertisers.

But these concerns go beyond seeing unwanted diaper ads. Experts worry that shared data could affect users’ access to housing or insurance. An editor at ProPublica faced a catch-22 when his insurance company denied coverage for a new mask for his CPAP machine because he wasn’t using it enough; he hadn’t been wearing the mask because he needed a new one. He later learned the machine had been transmitting his usage data.

Similarly, with health apps, Grundy said users should assume that their data is being shared.

“Health apps are such a booming market. … Companies are valued by how much patient data they have access to more than how much money they make,” she said. “Healthcare is a particularly sensitive and special place. That would be a great place to strengthen some of these protections.”

When HIPAA does apply
It can be difficult to determine what privacy requirements a health app is subject to, and what companies can do with users’ health information. The details are complex, even for those steeped in it. 

“As an attorney, it’s not easy to decipher,” Mintz Levin’s Bourque said. “It’s not always clear where information is coming and going. … It takes a little bit of digging to understand.” 

When users download an app from the app store on their own, their health information generally isn’t protected under HIPAA. A Fitbit or another fitness tracker purchased by the user, for example, typically wouldn’t be subject to these requirements.

On the other hand, if a patient is prescribed an app by a physician, or is given access to one through an insurance plan, that health app would likely fall under a HIPAA business associate agreement. Its developer would be restricted from using health information for marketing or commercial purposes unless that data is de-identified.

“If you were using a product in cooperation with a provider or health insurer, under the law there are ways they’re allowed to aggregate and share that data,” Bourque said. “They can use it for quality purposes, teaching and training. They can take data without your name, address and numbers associated with you. The remaining data is not subject to privacy law anymore.” 
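The de-identification Bourque describes can be pictured as stripping direct identifiers from a record while keeping the fields useful for aggregate analysis. A toy sketch (HIPAA’s Safe Harbor method actually covers 18 categories of identifiers; the field names below are illustrative):

```python
# Illustrative subset of direct identifiers; the real Safe Harbor
# standard enumerates 18 categories (names, geography, dates, etc.).
DIRECT_IDENTIFIERS = {
    "name", "address", "phone", "email", "ssn", "medical_record_number",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Doe",
    "phone": "555-0100",
    "daily_steps": 9200,
    "hospitalized_last_year": False,
}

print(deidentify(record))
# -> {'daily_steps': 9200, 'hospitalized_last_year': False}
```

Once the identifying fields are gone, the remaining aggregate data — steps taken, hospitalization rates — can legally be studied, which is the non-nefarious use Bourque points to.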

“It’s not all nefarious,” she added. “If they see people who take a lot of steps see a massive decrease in hospitalization rates… that’s really important data.” 

Contractors that provide services to the developer, such as cloud storage, would also be subject to the privacy and security terms of the business associate agreement.  

Of course, HIPAA isn’t the only privacy law governing patient data. Federal law also protects the privacy of patient records for substance use disorders and HIV testing. Companies must also consider state law, such as the newly enacted California Consumer Privacy Act. To comply with the CCPA, companies must give users the ability to see what information has been collected about them and allow them to opt out of having their data shared with third parties.

All of these requirements can come as a shock to founders who are new to the healthcare space.

“You see a little bit of a culture clash if you’re representing one of the startup companies working with health plans,” said Anne Redman, a partner with Seattle-based law firm Perkins Coie. “Their level of risk tolerance for being casual about compliance is not at all the same as a startup company.” 

Privacy proposals
Stakeholders have come up with different proposals for how to balance patients’ privacy needs without being overbearing. 

One popular idea circulated by provider groups is making a checklist available to the public with information about whether an app follows best practices for data use and has a model privacy notice. Then, patients can decide how they want to proceed. 

“For example, a patient’s app should have to reveal to the patient that his or her health information will be sold to other companies. And a health insurance company should be prohibited from using a patient’s medical record to increase prior authorization requirements,” American Medical Association President Dr. Barbara McAneny wrote in a letter addressed to the ONC on the proposed rule. 

Another group, which is developing the Android equivalent of Apple Health, is also grappling with this question. JP Pollak, founder and chief product officer of the Commons Project, said the nonprofit was developing a system where patients can securely store their health data and share it with trusted partners. But by storing health information, that app would face decisions similar to the ones Epic and Cerner face about which other services to connect with.

“From all of the headlines and the comments from Epic… the argument is being pitted as free-flowing data from the EMRs and health systems is either the greatest thing ever or the most dangerous thing that could ever happen to people’s data. I think the reality is somewhere in the middle,” Pollak said. “The concern about downloading data and freely sharing it throughout the ecosystem is certainly real. But there are checks in place. The Epics and others aren’t letting just any developer hook up to their system and share data.”

Pollak’s concern is more focused on the resharing of data further downstream. It’s not quite clear how far information blocking regulations would extend.

Pollak said the Commons Project was discussing ways to balance patient autonomy with some safeguards in place. One idea is to replace the standard click-through privacy policy with a walk-through survey similar to informed consent for a clinical trial, making clear to patients what they’re consenting to when they share their data.

Another option would be to make it the standard to share health data only with verified third-party apps that meet best practices, but give patients the ability to circumvent that after going through a consent process.
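That tiered approach reduces to a simple rule: verified apps get data by default, and unverified apps only after an explicit consent walk-through. A minimal sketch, with hypothetical app names and fields:

```python
from dataclasses import dataclass

@dataclass
class AppRequest:
    app_name: str
    verified: bool                   # app passed a best-practices review
    explicit_consent: bool = False   # patient completed the consent walk-through

def may_share(request: AppRequest) -> bool:
    """Share with verified apps by default, or when the patient
    has explicitly consented to an unverified app."""
    return request.verified or request.explicit_consent

print(may_share(AppRequest("TrustedMedApp", verified=True)))   # -> True
print(may_share(AppRequest("UnknownAdApp", verified=False)))   # -> False
print(may_share(AppRequest("UnknownAdApp", verified=False,
                           explicit_consent=True)))            # -> True
```

The design choice here is that verification sets the default, while patient consent remains the override — preserving autonomy without making risky sharing the path of least resistance.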

“We are trying to balance patient autonomy with some protections and safeguards,” he said. 

Ultimately, even the best-informed patients will still face difficult decisions on what to do with their health data. Cancer survivors and parents of children with rare diseases find critical support in talking to others in online groups through Facebook and other platforms. Those same patients have also discovered security vulnerabilities in these groups that could potentially expose health information. Still, they have few other options. 

“We need to be sure we’re supporting people making the decisions that are right for them,” Pollak said. “Ultimately, what it comes down to is we need transparency across the ecosystem.” 

Despite Faulkner sounding the alarm about the potential risks that openness brings, a freer flow of healthcare information seems inevitable. But hopefully, with the proper guardrails in place, people — not companies — will be able to make informed decisions about how their data is shared.

Photo credit: Steven Ferdman, Getty Images