Privacy remains an issue with several women’s health apps

The billion-dollar industry has faced scrutiny for its lax approach to private information

[Image: Close-up of a woman holding a smartphone. Health apps collect many kinds of data from users, and only a small part of it is directly provided by the user, a new study has found. Charday Penn/E+/Getty Images Plus]

With millions of users globally, the women’s health app market is projected to cross $18 billion by 2031. Yet these apps are among the least trusted. They collect data about users’ menstrual cycles, sex lives and pregnancy status, as well as information such as phone numbers and email addresses. In recent years, some of these apps have come under scrutiny for privacy violations.

A number of problematic practices persist, researchers reported in May at the Conference on Human Factors in Computing Systems in Honolulu.

The team evaluated the privacy policies and data management features of 20 of the most popular female health apps on the U.S. and U.K. Google Play Store. They found instances of covert gathering of sensitive user data, inconsistencies between privacy policies and privacy-related app features, flawed data deletion mechanisms and more.

The researchers also found that apps often linked user data to their web searches or browsing, putting the user’s anonymity at risk. Some apps required a user to indicate whether they had had a miscarriage or abortion in order to use a data-deletion feature. This is an example of a dark pattern, a design that manipulates users into giving out private information, the study authors point out.

Study coauthor Lisa Mekioussa Malki, a computer science researcher at University College London, spoke to Science News about the privacy and safety implications of the findings. This interview has been edited for length and clarity.

SN: Women’s health and fertility apps have drawn concerns about privacy. But in your study, you point out that the data collected by these apps could also have physical safety implications.

Malki: It’s one thing to think about privacy as safeguarding data as an asset from an organizational perspective, but I think it needs to go a step further. We need to consider the people using these apps, and what the implications of leaking that data are. Obviously, there’s the key issue of criminalization [of abortion in the post-Roe United States], but there are also a lot of [other] issues that could result from reproductive health data being leaked.

For example, if someone’s pregnancy status is leaked without their consent, that could lead to discrimination in the workplace. There has been previous work that’s explored stalking and intimate partner violence. And in communities where abortion and other issues around women’s and reproductive health are stigmatized, sharing this information could lead to real, concrete harms for people.

SN: Apps often say, “We don’t sell your data.” But the information we enter is still available to advertisers and others. This seems to make it very difficult for users to understand what they’re consenting to when they use the apps.

Malki: These apps collect a lot of different data points from users, and only a small part of it is directly provided. Obviously, there’s information that a user inputs when they register, including their health data. There are some limitations [by law, based on your location] on sharing and commercializing that data. However, in a few apps, the privacy policy explicitly states that things like the user’s pregnancy trimester could be shared with third-party advertisers.

But there’s also a lot of data that apps will collect from the user’s device: IP address and information about how they use the app, like what articles they click on, what pages they access, etc. From that, you can actually discover quite sensitive insights about a person’s life. According to the privacy policies, that data is shared with analytics companies in particular.

It’s quite concerning because there’s not much transparency around exactly what types of behavioral data are being shared. It could just be, “Oh, the user logged in.” Or it could also be, “They opened an article about contraception or pregnancy.” And that could be used to create inferences and predictions about users that are actually quite sensitive. It’s absolutely not reasonable to expect that the user would have a perfectly airtight understanding just based on reading a privacy policy.

SN: What advice do you have for women and others who use these mobile health apps?

Malki: A key issue we identified was that a lot of people, when they saw a scary news article [about data breaches], immediately deleted the apps. That won’t necessarily protect user data, because the developers often keep backups on their own servers.

So one piece of advice is to look for a data or account deletion feature in the app, or even to contact the developers directly. If you live in Europe in particular, you can contact developers and cite your right to be forgotten [under the General Data Protection Regulation].

SN: And what can developers do to design more ethical apps?

Malki: A lot of the time, particularly when the app development team is quite small and perhaps limited in resources, data privacy is treated as a compliance issue rather than a humanistic and user experience issue. So I think what’s needed is a shift toward understanding who the users are, what potential risks they could be facing and what needs they have, and building that into the design process from the beginning.

We’ve developed this groundwork for understanding and identifying the characteristics of privacy policies. So what researchers and developers, even auditors and compliance people, can do in the future is use it to automate the analysis of privacy policies at a much larger scale. Our codebook provides a framework for doing that.
