Period-tracking apps are just the tip of the post-Roe privacy iceberg

This article is part of an ongoing series examining the implications of the overturning of Roe v. Wade for the pharmaceutical industry.

Long before the U.S. Supreme Court overturned the constitutional right to abortion, digital privacy hawks had been flagging problems with period-tracking apps. Without stronger data protections, these experts warned, app data could be used to target users with ads or even to determine insurance coverage or loan rates.

SCOTUS’ overturning of Roe v. Wade last month raised those concerns to a new level: that app users’ fertility data, missed periods and more could be used against them in criminal and civil proceedings as evidence that they had an abortion. As recent studies show, these fears are not unfounded.

A research team from the Organization for the Review of Care and Health Apps (ORCHA), a company that tests health apps for the UK’s National Health Service, reviewed the privacy policies of 25 of the most popular period-tracking apps and found that 84% share data with third parties. Nearly two-thirds say they will turn data over to authorities to meet legal obligations, according to findings posted last week.

That data includes details about sexual activity, birth control use and period timing. Contact details of users who tracked their cycles were also sold as marketing lists; nearly 70% of the apps that share data said they do so for marketing purposes. Only two – Fitbit and Natural Cycles – were recommended as safe and secure. ORCHA did not name the apps that performed poorly.

Consumer Reports, which conducted its own evaluation this year, did name names. The organization tested the most popular period-tracking apps in the US and found that Flo, Clue, Stardust, Period Calendar and Period Tracker all fell short on privacy. CR recommended only three apps – Euki, Drip and Periodical – which store all data locally and don’t allow third-party tracking.
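
CR’s recommendations come down to one design choice: keep the data on the device. As a rough illustration of that local-only pattern (not these apps’ actual code; the schema and table name here are hypothetical), a cycle log can be nothing more than a SQLite file with no network layer anywhere in the data path:

```python
# Minimal sketch of local-only storage for cycle data, using only the
# Python standard library. Nothing leaves the device: there is no
# network call and no third-party SDK anywhere in the data path.
import sqlite3
from datetime import date

def open_local_db(path: str = "cycles.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)  # a file on the user's own device
    conn.execute(
        """CREATE TABLE IF NOT EXISTS cycle_log (
               entry_date TEXT PRIMARY KEY,
               flow       TEXT,
               notes      TEXT
           )"""
    )
    return conn

def log_entry(conn: sqlite3.Connection, flow: str, notes: str = "") -> None:
    conn.execute(
        "INSERT OR REPLACE INTO cycle_log VALUES (?, ?, ?)",
        (date.today().isoformat(), flow, notes),
    )
    conn.commit()

if __name__ == "__main__":
    db = open_local_db()
    log_entry(db, flow="light")
    print(db.execute("SELECT * FROM cycle_log").fetchall())
```

The tradeoff is stark: a vendor that holds no copy of the data has nothing to sell to advertisers – and nothing to hand over in response to a subpoena.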

The lack of privacy in period-tracking apps should come as no surprise, says Eric Perakslis, chief science and digital officer of the Duke Clinical Research Institute and a professor at the Duke School of Medicine.

Perakslis co-led a study last year with the Light Collective, a privacy-focused group, examining data shared by genetic-testing and digital-medicine companies. They found that several of the companies sent customer information to Facebook for ad targeting, often in violation of their own written policies. His team also found problems with where and when users of health apps and websites are asked for permission to share that data.

“Ad technology is probably the most common software on the Internet,” Perakslis said. “How do Facebook or Google make money? They pretty much do it through advertising.”

The lack of proper standards governing adtech is an industry-wide problem. Marketing, he points out, was also the reason a third of the top 100 hospitals were recently found to be sharing data with Facebook.

Most consumers are left to figure out on their own how these processes work. The Federal Trade Commission and the Federal Communications Commission haven’t exactly cracked down on Big Tech, either.

“There aren’t hundreds of ongoing cases where the FTC’s breach notification rule has been enforced,” Perakslis said. “HHS enforces the provisions of HIPAA, but there aren’t many examples of penalties actually being meted out.”

But what’s happening with the erosion of women’s reproductive rights in America has put period-tracking apps in the spotlight, and for good reason. These apps are used by about a third of women, according to a Kaiser Family Foundation survey conducted in 2019.

Flo, for example, has 200 million users worldwide. According to a 2021 settlement with the FTC, Flo passed users’ health information to numerous third-party marketing and analytics companies, including the aforementioned social media giants.

Targeted ads are one type of harm. “Being presented with targeted content can be intrusive, and a problem if someone doesn’t have complete personal and ambient privacy,” Perakslis explained. That is often the case for the most vulnerable in society, whose devices may not be entirely their own and may be shared with others.

What’s even more concerning now, with more than two dozen states poised to ban abortion, is that health and pregnancy information stored in one app and then passed on to others could prompt someone to track a person down, or to bring legal action against them or their doctor.

“Activists have been concerned for many years that this behavioral advertising network would eventually become an incredibly useful tool for surveillance and for authoritarian law enforcement,” said Cooper Quintin, senior technologist at the Electronic Frontier Foundation.

There is precedent for using big data this way. In 2016, for example, Rewire reported that an anti-abortion group was targeting ads at women in and around Planned Parenthood clinics based on location data. The fact that smartphones can be turned into sophisticated surveillance devices – complete with GPS, camera and microphone – makes fertility tracking particularly fraught.

“The last time abortion was illegal in the United States, you didn’t have a smartphone constantly recording everyone’s location,” Quintin said. “You didn’t have a giant behavioral advertising network. You didn’t have a giant surveillance network. It’s a very different scenario this time around.”

The Flo case highlights one reason unauthorized data sharing can continue to happen: the potentially specious ways health apps handle consumer information may fall outside the FTC’s definition of a health data breach. The agency said it is reviewing the rule and considering opening it up for public comment.

Women’s health apps are just the “tip of the iceberg” when it comes to health data security, observed Fatima Ahmed, ORCHA’s clinical lead for maternity and women’s health, in a statement on the study results.

Sharing data with third parties isn’t the only privacy issue the researchers found, either. “Even app developers who promise not to share names and addresses, for example, need to know that people can be identified by an IP address,” Ahmed added.

“De-identification is almost like a board game at this point, it’s so easy to re-identify people,” Perakslis added.
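
Perakslis’ point is easy to demonstrate. In the classic linkage attack, a “de-identified” record with the name stripped out is matched against an identified public dataset on shared quasi-identifiers – Latanya Sweeney famously showed that ZIP code, birth date and sex alone uniquely identify most Americans. The sketch below uses entirely fabricated toy data:

```python
# Toy linkage attack: re-identify "anonymous" health records by joining
# them to an identified public dataset (e.g., a voter roll or marketing
# list) on shared quasi-identifiers. All data below is fabricated.

deidentified_health = [
    {"zip": "02138", "dob": "1985-07-31", "sex": "F", "note": "missed period logged"},
    {"zip": "60601", "dob": "1990-01-15", "sex": "F", "note": "contraception tracked"},
]

public_records = [
    {"name": "Jane Doe",  "zip": "02138", "dob": "1985-07-31", "sex": "F"},
    {"name": "Ann Smith", "zip": "60601", "dob": "1990-01-15", "sex": "F"},
]

def reidentify(health_rows, identified_rows):
    """Match the two datasets on the quasi-identifiers they share."""
    keys = ("zip", "dob", "sex")
    index = {tuple(r[k] for k in keys): r["name"] for r in identified_rows}
    for row in health_rows:
        name = index.get(tuple(row[k] for k in keys))
        if name:
            yield name, row["note"]

for name, note in reidentify(deidentified_health, public_records):
    print(f"{name}: {note}")  # e.g., "Jane Doe: missed period logged"
```

Stripping the name column, in other words, accomplishes little when the remaining columns are distinctive enough to serve as a fingerprint.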

When contacted, he said, the companies his team studied were generally unaware of the disconnect between their consumer-facing privacy statements and what their ad technology actually did. He recommends marketers educate themselves.

“How many buy the software and then hire a forensic scientist to go and see what it actually does? Zero,” he said. “Because they trust that the vendors only do certain things. Ultimately, you will be held accountable for what is in your [tech] stack, just as a biopharmaceutical company is held accountable for what it puts on the shelf or into the supply chain.”
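
A full forensic review means watching the live network traffic an app or site generates, but even a crude first pass can surface obvious problems. The sketch below (the URL and the domain list are illustrative, not exhaustive) simply fetches a page and scans its source for well-known tracker domains:

```python
# Toy first pass at the kind of audit Perakslis describes: fetch a page
# your own product serves and scan the HTML for well-known tracker
# domains. This only catches trackers referenced in page source; a real
# forensic review would also inspect live traffic, SDKs and cookies.
import re
import urllib.request

TRACKER_DOMAINS = [
    "connect.facebook.net",
    "google-analytics.com",
    "googletagmanager.com",
    "doubleclick.net",
]

def find_trackers(url: str) -> list[str]:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return [d for d in TRACKER_DOMAINS if re.search(re.escape(d), html)]

if __name__ == "__main__":
    hits = find_trackers("https://example.com")  # point at your own site
    print("Trackers referenced:", hits or "none found in page source")
```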

Perakslis, who has worked in the pharmaceutical industry for nearly two decades, also suggested marketers consider creating the equivalent of an institutional review board to ensure their data practices are aboveboard.

“Have an ethics committee; work with them,” he urged. “There are people who can help you avoid making bad mistakes.”

Companies should also practice good data science, he said, verifying the provenance and lineage of health information. “You wouldn’t for a second want illicit substances entering your supply line – stolen or impure products, reused syringes,” he said. “They have to think about not allowing that into their data. They need to think about the fact that this data may have been exploited, stolen, taken without consent, all these different things.”

Quintin was even more pointed:

“I hope this is a moment of reckoning for the ad industry,” he said. “I hope a lot of people in the industry will seriously consider whether or not they want to be accomplices to oppression.”

Lance B. Holton