MedCity Influencers

Mental Health Data Sells, But Who’s Buying?

Implementing a thorough federal privacy law and broadening HIPAA protections to include new mHealth technologies are crucial steps in enhancing personal data security. Equally important, companies and developers should adhere to strict ethical standards and robust security protocols during app development to protect users' sensitive information.

In recent years, more and more people have turned to digital mental health services for help and support. This shift appears to be driven by two key factors: the rise in mental health disorders, which affected 23% of the U.S. population last year, and the impact of the Covid-19 pandemic and its quarantine measures. To grasp the scale of this shift, consider that in 2023 alone, more than 375 million health apps were downloaded, in what appears to be a partial replacement of traditional face-to-face therapeutic regimens.

At first glance, this appears to be, and should be considered, a positive phenomenon. It shows that people are not only concerned about their well-being, but also have access to services that support them on their path to better health. However, within this otherwise highly beneficial technological landscape, researchers have identified an ongoing problem of personal data misuse, including the sale of sensitive health information to data brokers. According to a Duke University study, certain data brokers have been selling highly sensitive data related to people's mental health disorders on an open market, with little to no restriction on how this information will be used. The researchers found that 26 of the 37 data brokers they contacted readily responded to inquiries about highly sensitive mental health data, and 11 of those organizations were ultimately willing to sell the requested data.

As for the specific types of mental health data involved, these brokers primarily advertised sensitive information about individuals dealing with depression, insomnia, anxiety, ADHD, and bipolar disorder, alongside details such as date of birth, marital status, net worth, gender, age, religion, and whether children were living at home. If you are wondering about the price, it ranged from $275 for 5,000 aggregated counts of Americans' mental health records to $100,000 per year for subscription access to data on individuals' mental health issues.


The same research examines the current laws and regulations governing these digital tools, and the results are alarming, to say the least. Most mHealth applications fall outside the Health Insurance Portability and Accountability Act (HIPAA), which applies only to a limited set of covered entities, such as health plans and healthcare providers. In practice, this means that many private organizations operating mHealth apps are under no legal obligation from HIPAA to keep users' data private. Yes, you've read that right. Similarly, wearable technologies that collect health data often fall outside HIPAA's scope.

But what does this actually mean? In essence, it means that mHealth apps and wearables can lawfully share and sell users' health data to third parties without their knowledge or approval. Taking a moment to consider this, one can't help but question how personal data of such a sensitive nature is left unregulated, leaving users vulnerable and their mental health at risk. This kind of misuse of mental health data can have serious real-life consequences for people who are facing psychological challenges. A loss of confidence is likely one of the most harmful effects: when personal data is exposed, people may lose faith in healthcare services altogether. This breakdown of trust can lead to additional mental health problems, as people may delay seeking necessary help or even become afraid to seek assistance again.

Not only that, but such breaches can lead to stigma and discrimination. Data misuse could limit opportunities in the workplace or even the chance to own a home. Consequences of the same nature could also extend to legal settings. A lawyer, for example, could use details about a person's medical history – information that, as noted above, could be obtained lawfully through data-selling practices – to question their credibility, even when it is irrelevant to the case.

Finally, perhaps the most profound impact of data misuse is the emotional and psychological toll it takes. A recent research article studying data breaches at several digital mental health providers suggests that such incidents can deeply harm the victims. Learning that personal mental health information has been leaked can trigger intense feelings of shame, anxiety, and paranoia, exacerbating conditions like depression or PTSD. Rather than reaching out for assistance, people may then isolate themselves even further, worsening their overall health. Ironically, the very tools meant to support mental health could, in the wrong hands and due to insufficient data security and regulation, create a vicious cycle of fear, distress, and harm.


To be clear, this discussion is not about spreading fear, but about raising awareness to foster positive change in an already promising technological landscape. The intention in pointing out these issues is to encourage meaningful improvements. Implementing a thorough federal privacy law and broadening HIPAA protections to cover new mHealth technologies are crucial steps toward stronger personal data security. Equally important, companies and developers should adhere to strict ethical standards and robust security protocols during app development to protect users' sensitive information. Raising awareness and implementing these recommended security measures can help establish a trustworthy and secure digital environment for individuals.

Photo: Anastasia Usenko, Getty Images

Alex Malioukis is a Licensed Psychologist at The Self Research Institute, with a Master's degree in cognitive neuroscience and clinical neuropsychology, and a passion for the underlying mechanisms of what makes us human (work still in progress!). Alex works on the research faculty of the Self Research Institute, using self-tracking practices and technology to help individuals and organizations overcome obstacles, realize their full potential, and achieve their objectives.

This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers. Click here to find out how.