Health Tech, Telemedicine

Report: Covid-19 apps fall short in privacy, security

An analysis of more than 100 Covid-19 apps by the International Digital Accountability Council found that many fell short of best practices for privacy and security. Concerns included the use of third-party SDKs that were not core to an app's functionality and the sending of unencrypted transmissions.

One of the many responses to the Covid-19 pandemic has been the swift development of digital tools to help prevent the spread of the disease. Governments and private entities have proposed all sorts of apps, from symptom checkers to telehealth apps. And that’s not including the contact tracing framework being developed by Apple and Google.

Given the sensitivity of users’ health and location data, it’s reasonable to expect that it would be protected. But that’s not always the case.


An analysis of 108 Android apps related to Covid-19 found that many of them do not meet best practices for privacy and security. The report, conducted by the International Digital Accountability Council (IDAC), found that apps were not always transparent about what information they collected and did not always encrypt the information they transmitted.

“Our investigation did not reveal intentional or malicious misconduct. In many cases, we found that governments, developers, and their partners took great care to protect the privacy of users and adopted best practices in the design of the apps,” IDAC wrote. “However, our investigation did uncover several instances in which apps fell short of best practices related to privacy and security, and potentially exposed the public to avoidable risks and potential harms.”

The group evaluated apps from 41 countries, including contact tracing apps, symptom checkers, telehealth apps, and quarantine administration apps used by governments where quarantine rules are strictly enforced. They highlighted four areas of concern:

  1. Apps that were not transparent about data collection and third-party sharing
  2. Apps that included third-party advertising and analytics software development kits (SDKs) not related to their core functionality
  3. Apps that sent unencrypted transmissions
  4. Apps that requested invasive permissions, such as a user’s location or camera
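The fourth concern is straightforward to check statically, since Android apps must declare every permission in their manifest. Below is a minimal sketch of how an auditor might scan a manifest for permissions that are invasive relative to an app's purpose; the sample manifest, the app package name, and the list of "invasive" permissions are illustrative assumptions, not drawn from the report.

```python
import xml.etree.ElementTree as ET

# Hypothetical AndroidManifest.xml for illustration only.
MANIFEST = """<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.covidapp">
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.INTERNET"/>
</manifest>"""

# Permissions an auditor might consider invasive for a symptom checker
# (assumed list; the report does not publish its exact criteria).
INVASIVE = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.ACCESS_COARSE_LOCATION",
    "android.permission.CAMERA",
    "android.permission.READ_CONTACTS",
}

# ElementTree expands the android: prefix to the full namespace URI.
ANDROID_NAME = "{http://schemas.android.com/apk/res/android}name"

def invasive_permissions(manifest_xml: str) -> list[str]:
    """Return the declared permissions that appear on the invasive list."""
    root = ET.fromstring(manifest_xml)
    requested = [el.attrib[ANDROID_NAME] for el in root.iter("uses-permission")]
    return sorted(p for p in requested if p in INVASIVE)

print(invasive_permissions(MANIFEST))
# → ['android.permission.ACCESS_FINE_LOCATION', 'android.permission.CAMERA']
```

Declared permissions only show what an app *can* access; confirming what it actually transmits requires the kind of traffic analysis IDAC performed.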

For example, Kencor’s Covid-19 app, which allows users to check their symptoms, uses third-party SDKs for Google Ads and Crashlytics. That does not necessarily mean user data is actively being transmitted, but it does create the potential for personal information to be collected. Kinsa’s app for a connected digital thermometer also uses the Crashlytics SDK.

“In our view, analytics and advertising SDKs should not be present in COVID-19 apps because of the potential for these SDKs to collect personal information,” the report stated. “The presence of these SDKs does not necessarily imply that the SDK is actively transmitting user data to third parties. Nevertheless, the use of these mixed-purpose SDKs presents a challenge and additional burden on the developer to ensure that the ad and analytics components are not being used or are disabled to prevent the inadvertent transmission of personal data to third parties.”

Care19, a contact tracing app used by North Dakota, also shared user data with third parties, including Bugfender and Crashlytics. A separate report by Jumbo Privacy found that the app shared users’ data with Foursquare via the Identifier for Advertisers, which can be used to track, for example, whether a user has clicked on an ad. Care19 changed this in a recent update, according to Infosecurity Magazine.

Government-developed apps were not exempt from criticism in the report. A symptom checker app developed by the Centers for Disease Control and Prevention transmitted information without encryption. Researchers could not view a user’s activity, but they could see metadata such as the user’s carrier, operating system, and device resolution.
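This kind of finding typically comes from intercepting an app's network traffic and flagging any endpoint contacted over plain HTTP, where metadata travels in cleartext. A minimal sketch of that triage step follows; the endpoint URLs are hypothetical and not taken from the report.

```python
from urllib.parse import urlparse

def flag_unencrypted(urls: list[str]) -> list[str]:
    """Return endpoints contacted over plain HTTP (no TLS)."""
    return [u for u in urls if urlparse(u).scheme == "http"]

# Hypothetical endpoints observed during a traffic capture.
observed = [
    "https://api.example-health.gov/v1/symptoms",
    "http://metrics.example-cdn.com/beacon",  # cleartext: carrier/OS/device metadata readable in transit
]

print(flag_unencrypted(observed))
# → ['http://metrics.example-cdn.com/beacon']
```

A real audit would go further, inspecting payload contents and certificate validation, but the scheme check alone catches the class of issue described above.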

IDAC shared its report as constructive feedback meant to improve the digital response to Covid-19.

“If responsible steps to rein in the COVID-19 pandemic and reopen our devastated economy require changes in how much information people share about their health and movements, the public should be able to trust that their data will be used responsibly,” the report concluded.


Photo credit: HYWARDS, Getty Images