This June marks the one-year anniversary of the landmark Dobbs v. Jackson Women’s Health Organization decision that overturned Roe v. Wade, effectively ending the constitutional right to abortion in the United States. In its aftermath, a resounding alarm echoed throughout the tech world and among health app users over data privacy and protection. The growing concern is that prosecutors in states with abortion bans (now 14 states) could subpoena data, such as location, search history, and personal health information, to criminalize individuals in abortion-related cases.
The allure of logging health details into an app is simple: ease, function, a sense of control over your own health, and informative insights at your fingertips. But unlike traditional medical records, this data is not protected by the Health Insurance Portability and Accountability Act (HIPAA), because the law covers health care providers and insurers, not apps intended for personal use. With no federal data privacy legislation in place, it’s up to tech companies themselves (or individual states) to ensure data privacy and protection for consumers.
Ever since the Supreme Court draft decision was leaked in May of 2022, women’s health apps, namely period-tracking apps, have been catapulted to the forefront of debate over ethics, data privacy, and protection. Thousands on Twitter called for the deletion of period-tracking apps altogether. The explosive divide and demand for change highlighted users’ growing mistrust of tech companies.
To grasp the gravity of the implications, consider how many women nationwide rely on these apps. Nearly a third of women in the United States have used a period-tracking app, according to a 2019 survey from the Kaiser Family Foundation. One of the most popular apps alone, Flo, has over 240 million downloads and 50 million active users per month.
Users have increasingly relied on health apps and consented to inputting personal data, but it wasn't until the fallout from the overturn of Roe v. Wade that people truly understood the downstream impact of tech built without data privacy at the forefront. If they weren't aware before, they certainly are now.
“People are paying attention to the broken systems around our data and how it’s protected,” says Tazin Khan, longtime cybersecurity specialist and founder of Cyber Collective, a community-driven research organization educating individuals on technology, security, and privacy online. “It has ignited the advocates, the ethicists, and the people that care to be fast and move hard to make sure that protection is in place.”
Khan describes data privacy and data protection as two-fold: “Privacy regulation is essentially around the compliance of businesses and how they are maintaining data hygiene and the way that they’re collecting, storing, and redistributing data,” she says. “It is not about consumer data protection. Consumer data protection is very different, right? Do I have the right to delete? Do I have the right to access my data? Do I have the right to opt out of being opted into something?”
While the overturn of Roe v. Wade has certainly highlighted significant needs for improvement in both categories, it has also brought attention to what some companies are doing right.
Female-led women's health apps putting data privacy first
For period-and-ovulation tracking app Clue, data privacy was always part of the company’s ethos. Founded and led by women, the Berlin-based app is protected by the European Union’s General Data Protection Regulation (GDPR), one of the strictest data privacy and protection laws in the world. While the GDPR covers many areas of data protection, including websites, it also includes provisions that protect personal data on apps, from consent requirements to transparency, user rights, data security, and more.
In light of growing concerns from American users, the app’s co-CEOs, Carrie Walter and Audrey Tsang, released a statement to its community of 11 million active users stating that private health data will never be shared, including with authorities. “Your personally identifiable health data regarding pregnancies, pregnancy loss or abortion, is kept private and safe. We don’t sell it, we don’t share it for anyone else’s use, we won’t disclose it,” says the release. The GDPR establishes protections over personal data and holds organizations accountable with severe penalties for breaching them, including fines of up to 20 million euros or 4 percent of a company’s global annual revenue, whichever is higher.
Since the Dobbs decision, privacy advocates and legislators have been working to impose similar federal protections in the U.S. At the state level, select states have introduced comprehensive data privacy laws, such as the California Consumer Privacy Act (CCPA), which grants users more control over the personal data that businesses can collect. Several U.S. tech companies have amended their data privacy policies and protections, largely in response to consumer demand, and users have been receptive to these changes.
Also governed by the GDPR is Natural Cycles, the first FDA-cleared birth control app in the U.S., which measures fertility through body temperature. The company is headquartered in Sweden with operations in the United States, Germany, Switzerland, and the UK. Unlike other apps on the market, the company has used a subscription-based model from the start, so selling data to third parties was never part of its revenue stream. (For many free apps, selling user data to third parties, whether for advertising purposes or for insights into consumer behavior, is a common source of revenue.)
“We always cared about data privacy and data protection,” says CEO and co-founder Elina Berglund. “But after the Dobbs decision, we felt like we had to take it one step further.”
Natural Cycles recently developed its NC° Secure program, an advanced data protection program that includes encryption and pseudonymization (a data management technique in which identifiable information fields are replaced with a pseudonym). Additionally, the company is rolling out a ‘Go Anonymous’ mode. “We’re separating the personally identifiable information from the sensitive data related to your health or fertility, such that not even we at Natural Cycles can know which user has sensitive data,” says Berglund. “If one day, we get subpoenaed, we ourselves cannot hand out any information on a user because we don’t know who they are.”
The only way to link personally identifiable data (such as a name) to sensitive data (such as period data) is through the user’s own key. So while an anonymous user will still get the same personalized insights, including fertility status, within the NC° app, there will be some limitations on personal reminders and help outside the app that require both sensitive and personal data (such as email communication, customer support, and account recovery). Before a user enters Go Anonymous, the app walks them through these limitations and lets them decide whether to choose that mode.
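To make the idea concrete, the key-based pseudonymization Berglund describes can be sketched roughly as follows. This is an illustrative assumption of how such a scheme might work, not Natural Cycles’ actual code: all class, function, and field names here are hypothetical.

```python
import hmac
import hashlib
import secrets

class PseudonymousStore:
    """Server-side store that only ever sees pseudonyms, never identities."""

    def __init__(self):
        self._records = {}  # pseudonym -> list of health entries

    def log_entry(self, pseudonym: str, entry: dict) -> None:
        self._records.setdefault(pseudonym, []).append(entry)

    def entries_for(self, pseudonym: str) -> list:
        return self._records.get(pseudonym, [])


def make_user_key() -> bytes:
    # Generated on, and kept only on, the user's device.
    return secrets.token_bytes(32)


def pseudonym_for(user_key: bytes, account_id: str) -> str:
    # An HMAC ties the pseudonym to the account, but only someone
    # holding the user's key can recompute (and therefore link) it.
    return hmac.new(user_key, account_id.encode(), hashlib.sha256).hexdigest()


store = PseudonymousStore()
key = make_user_key()                        # never leaves the device
alias = pseudonym_for(key, "alice@example.com")

store.log_entry(alias, {"day": "2023-06-01", "temp_c": 36.6})

# The server can answer queries for a pseudonym it is handed,
# but it cannot recompute the pseudonym from the account id alone.
assert len(store.entries_for(alias)) == 1
```

The point of the design is that a subpoena served on the server yields only pseudonymous health entries; without the key on the user’s device, there is nothing on the server linking those entries to a name or email address.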
Taking a broader approach to period tracking is Stardust, a free, astrology-focused app that provides insights on users’ cycle, horoscope, and mood. Owned and operated by women, the company leads with a privacy-first model (as stated in its Instagram bio) and has been vocal about user protection and transparency in a post-Roe world.
“Given the current political climate, we have taken rigorous measures to protect users, especially those in states where abortion is being criminalized,” reads Stardust’s privacy policy. “We believe all period trackers should stringently protect the privacy of users—and be transparent about exactly how they do so.”
Stardust’s policy page maps out exactly what data is collected and how it is used, and addresses burning questions, such as what happens if law enforcement subpoenas information (the app will not share period data because it is not connected to users’ login information) and how to delete your data in the app.
For other apps, such as Drip, privacy is built into the fertility app’s design itself. When the app was created in Berlin in 2017, developer Marie Koschiek wanted a safe, trustworthy product that was non-commercial, free, and gender-neutral; grounded in scientific methods for fertility awareness; and secure and open source, meaning the app is maintained and developed through open collaboration. No data is collected, information is stored locally on the user’s device rather than in the cloud, and the app does not allow any third-party tracking.
“On the day that Roe vs. Wade was overturned, we saw a significant increase in downloads and users from the U.S.,” says Koschiek of the app, which is run by a collective. “We also had people from the U.S. contact us directly to offer help and support for developing Drip.”
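Drip’s local-only model is simpler than server-side pseudonymization: there is no server to subpoena in the first place. A minimal sketch of the idea, with hypothetical file and field names (not Drip’s actual implementation, which is a mobile app), might look like this:

```python
import json
import tempfile
from pathlib import Path

def load_entries(data_file: Path) -> list:
    """Read all cycle entries from a file on the user's own device."""
    if data_file.exists():
        return json.loads(data_file.read_text())
    return []

def save_entry(data_file: Path, entry: dict) -> None:
    """Append an entry locally; nothing is ever sent over the network."""
    entries = load_entries(data_file)
    entries.append(entry)
    data_file.write_text(json.dumps(entries))

# Usage: everything lives in one on-device file the user controls.
path = Path(tempfile.mkdtemp()) / "cycle_data.json"
save_entry(path, {"date": "2023-06-24", "bleeding": True})
assert load_entries(path) == [{"date": "2023-06-24", "bleeding": True}]
```

Deleting your data is then as simple as deleting the file (or the app), with no request to a company and no copy left behind in someone else’s cloud.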
It’s no surprise that privacy-conscious users would download an app like Drip. The reality, however, is that a zero-data-collection, locally stored design is a rarity. In a world where technology plays such a pivotal role in our daily lives, how can we better educate ourselves as users before putting personal health information into an app?
Red flags to look out for, from a cyber security expert
There’s no question that consumers share concerns over the confidentiality and security of personal health information. More than 92 percent of people believe privacy is a right and that their health data should not be available for purchase by corporations or other individuals, according to a survey of 1,000 patients across the U.S. conducted by the American Medical Association.
When it comes to downloading an app, for health purposes or otherwise, education is the best tool in navigating the tech landscape and determining what apps are more secure. Here, Khan of Cyber Collective breaks down three red flags to look out for before inputting personal information.
1. You don’t get access to the tool unless you share private information
If you can’t sign up for a service without providing your name, email, and address, it’s likely a red flag. Ask yourself, what are they doing with this information and why is it being collected?
2. Terms and conditions are in ‘legalese’
Is the language overly complex and difficult to understand? Does the app ask you to hit accept without prompting you to read through the terms and conditions first? The best privacy policies are written in simple, concise language that answers your questions, as opposed to prompting more.
3. The app starts asking for access to things that it doesn’t need in order to function
It’s important to think critically about the function of the app and why it is being downloaded. For example, if you download a flashlight app and it starts asking for access to your photos or mic, it’s important to question why. If the answer doesn’t seem right, it’s a sign to delete the app.
For Khan, education goes both ways. “If you have the propensity and the time, let whatever entity know that you wanted to download the app, but you don’t feel comfortable using it because you saw these red flags,” she says. “Share how you are feeling because tech companies don’t hear enough from us.”
While these women's health apps are taking significant measures to secure and update their data privacy policies and protections, it is important to educate yourself as a consumer in terms of what information you’re sharing and with whom. As technology continues to evolve and play an integral role in our daily lives, it is crucial to have awareness of the function of the apps you’re using, why data is being collected, where it’s being stored, and your rights as a user in the process.
“If we want real change, we have to lean into curiosity,” says Khan. “We have to ask questions and we have to be informed.” For more information on data privacy and tracking legislation in the U.S., Khan recommends visiting the International Association of Privacy Professionals.
—Written by Danielle Torres