Does Femtech Give Users Control of Their Health or Take It Away?

Molly McHugh

The Ringer

March 18, 2019

At the 2015 Apple Worldwide Developers Conference, many people celebrated as the iPhone Health app finally recognized periods. Menstruation-tracking had been a much-requested feature, one that the tech giant was rightly criticized for not including when it introduced the app the year before.

Apple was something of a holdout in the market, as was Fitbit, which didn’t add “female health tracking” until May 2018. But by 2013, period trackers like Clue, Glow, and Period Diary were already viable options. And since then, fertility-tracking devices like Bellabeat, OvuSense, Daysy, Tempdrop, and Ava have further saturated the “femtech” market, giving the modern person a chic, tech-savvy solution for monitoring their body.

Trackers without a hardware component (as opposed to Fitbit and its ilk) work by asking users to input data like when their periods start and end, how heavy their cycles are, and related factors like mood, sexual activity, physical pain, body temperature, and pulse. Some fertility-oriented apps even ask users to log their sexual positions. As with all health apps, period trackers’ efficacy depends on how much information a user is willing to feed them: “The more data you enter, the more accurate your predictions,” as Glow puts it. “Finally understand your body,” promises Ava.

The motivation behind femtech apps is understandable. Access to both abortion and prescription birth control remains out of reach for many women and others in need; some politicians still see every female body as a “host.” In this sense, technology can help people gain more agency over their own health—and not only as it applies to reproduction. For example, the Apple Watch’s EKG feature can flag potential heart problems, which is crucial, notes Sloan Gaon, CEO of the health data and technology company PulsePoint, given that heart disease kills one in four women.

But the trade-off for users is familiar: information for personal data. PulsePoint found that concerns about health data are predicated on control. Many people, especially younger people, Gaon said, are willing to sacrifice their personal data so long as they understand what, exactly, they’re offering up, how it’s being used, and whether they can easily change data settings as they see fit. (Gaon said that none of the data PulsePoint collects is personally identifiable.)

While femtech could lead to better health outcomes for those who track menstruation and/or fertility cycles, which could result in better representation and advocacy in health care, there’s also a lot to be lost. Fertility and period trackers are marketed as tools for gaining control, but what if they’re instead new methods of giving it up?

Menstruation and fertility apps are something of a double-edged sword for users, feeding not only a market that has long dismissed them but also an advertising economy that has long mistreated them. Furthermore, while women and LGBTQ health representation in technology is essential, the reality is that many health apps try to shove users and their bodies into strict categories. Some don’t track abortions, or even irregular periods; others have algorithms that don’t factor in non-male partners. In many ways, they are not, as Vox’s Kaitlyn Tiffany aptly put it, made for women.

Many of the same systems in health apps that help people chart their cycles and related markers are collecting that data and offering it to third parties for marketing and advertising purposes. Period tracking, fertility, and pregnancy apps have proved to be a particularly fruitful source to this end. Until recently, Flo was reportedly one of a handful of health apps sharing information with Facebook for advertising purposes; researchers who dove into similar apps’ terms of service found they rely on user data to fund their products.

A Mozilla team recently explored how it could demystify targeted advertising to users, and how those ads differed among genders. “When we were doing our initial research, we were interviewing women to get a better sense of what advertising they were seeing,” says Becca Ricks, a digital researcher and artist who recently completed a fellowship with that team. “We consistently found that [women] were getting tons of advertisements about fitness and dieting. A lot of them were getting pregnancy ads or fertility ads.” Many internet users are familiar with how browser activity tracking results in targeted ads. But when those ads are designed to target individual users’ body types, the practice becomes even more concerning. An Asian woman whom Ricks’s team interviewed said she “was really creeped out” when she saw an ad that specifically advised her to sell her Asian eggs.

For women of a certain age, it’s not unusual to open Facebook or Instagram and be greeted by a well-designed, millennial-targeting ad for egg freezing or donation and fertility services or apps. The week after my wedding, in fact, Facebook ads began asking me whether I needed a “Fitbit for fertility,” among other pregnancy-related services. (I did not.) My social media and browser histories, as well as cookies that follow me around the web, likely fed algorithms information about roughly when I was getting married. For me, the targeted ads were, at worst, obtrusive and obnoxious. But for those struggling with conception or miscarriage, they can be brutal. In December, the Washington Post’s Gillian Brockell wrote an op-ed detailing the intense pain of losing a child in a stillbirth and then being subjected to targeted ads across various social media platforms as if she had returned home with a healthy baby. Brockell acknowledged that the trade-off to participating in social media is providing user data, but she asked why, then, if platforms can track us and reap the benefits of advertising dollars, could they not also use the available data to make more sensitive choices? Ricks says her team talked to women who had also experienced miscarriages or stillbirths, only to be fed pregnancy-related Facebook ads afterward.

Ricks and her Mozilla team created a browser extension called Fuzzify.me that unveils some of the ad-targeting process. Ricks says the problem is in the asymmetry: People believe they have a certain amount of power over their personal information after they’ve offered it to a platform—whether it’s a social network or a health tracker—but in reality it’s nearly impossible to control. Often, people feel misrepresented, or even stereotyped, when they discover what the ads they’re fed imply about them. “There were a lot of [Fuzzify.me users] who were like, ‘I am completely miscategorized by this system,’” says Ricks. “I think there’s sort of a duality of ‘this feels hyper-personalized and invasive’ versus ‘I’m being miscategorized and I don’t know what to do about it.’”

That intrusion is particularly trained on those in marginalized demographics, including people who are nonwhite, disabled, LGBTQ, and/or women. “People who historically have been discriminated against are often subjected to these sort of regimes of surveillance and control,” says Ricks. “[It’s important that we] take history into account when thinking about data collection and targeted advertising for women.”

[Women are] being told, essentially, “I’m supposed to start having babies because I’m at this stage of life.”

Sensitive personal information, Ricks adds, can be used to impose societal norms. Her team spoke with 19- and 20-year-old women who were served ads for egg-freezing or fertility services, which those women interpreted as pressure to consider those options. They’re being told, essentially, “I’m supposed to start having babies because I’m at this stage of life.” Maybe the majority of female Facebook users of a certain age are having children or considering parenthood—but shouldn’t more choices for women and other users ensure that certain experiences aren’t presented as the norm? Instead, it seems that internet platforms are largely reinforcing dated ideas about women and their bodies through targeted advertising—and that they’re being aided by femtech.

Social media content amplifies the message. The aforementioned Ava fertility-tracking bracelet is a mainstay of Instagram influencers, who artfully pose (often in beautiful robes atop plush couches or beds) donning Avas on their wrists. The posts (which come with personalized discount codes and often lack the #ad or #sponsored hashtags now required of them) aren’t always made by people actively trying to get pregnant; instead they recount the benefits of using Ava to better understand their cycles and bodies.

Many mommy bloggers praise Ava as well. The result is that much of the content about Ava centers on conventionally beautiful, non-disabled, young cisgender women who are either having babies, want to have babies, or are trying to understand their bodies better so they may someday have babies. Even fertility is fodder for Instagram sponcon.

So just how effective are femtech solutions? Gaon, the PulsePoint CEO, says that when he and his wife—who now have a 4-year-old son—were trying to conceive, they downloaded a handful of ovulation and menstrual-cycle apps. They learned that the apps are only as good as the information you put into them. “Think about diet apps, and you’re on a diet,” says Gaon. “They tell you to input everything that you eat, and you eat a candy bar and you don’t put it into the diet app. How useful is it then?” Using femtech apps without inputting detailed information not only decreases their efficacy; those flawed outcomes can also lead to incorrect predictions that become data points informing the larger health market. Whether the apps’ conclusions are accurate matters well beyond the individual user.

Gaon says he believes the answer lies in regulation. FDA-approved apps and devices are bound by far stricter regulations regarding data-sharing and consumer privacy than their non-FDA-approved competitors.

Naturally, the pace of the technology industry is far faster than the slow churn of government. Former FDA chief Scott Gottlieb seemed interested in advancing this pace; his resignation last week could stall the momentum. But Gaon is certain that stricter regulations for med tech will come either way. “If you look at what Europe did with GDPR [General Data Protection Regulation], it’s severely restricted the use of data in many different ways and giving the control back to the user,” says Gaon. “What we’ve seen in the U.S. is a patchwork of legislation. You have California, you have Wisconsin, you have Virginia—they’re all trying to pass legislation around the use of data.” While this piecemeal approach may create loopholes, baseline federal legislation that offers consumers total ownership of their own data would create a framework that would be more difficult for the bad actors to manipulate, according to both Gaon and Maria Simeone, PulsePoint’s head of marketing. “We need comprehensive federal regulations and frameworks, not state regulation,” Simeone says via email. “Without it, we risk the enormous potential data and technology has to transform the health care industry.” She also suggests that only federal regulation can rebuild consumer trust in the technology industry.

The federal government and its health regulations haven’t always served marginalized groups well, though. “Inherently, female health applications are being controlled by laws and regulations and restrictions created by men,” Gaon says. “If you think about it, up until 1993, women [with child-bearing potential] weren’t even able to participate in clinical trials.”

“Think about diet apps. They tell you to input everything that you eat, and you eat a candy bar and you don’t put it into the diet app. How useful is it then?” —Sloan Gaon, CEO, PulsePoint

At the moment, the majority of health-tracking apps and devices (no matter whom they target) aren’t FDA-approved. In August 2018, period-tracking app Natural Cycles became the first of its class to receive FDA approval, but it wasn’t without controversy. OB-GYNs cautioned that women who are trying to avoid getting pregnant should forgo apps in favor of traditional birth control methods like the pill, IUDs, condoms, and so on. There is also a wearable patch for breast cancer screenings that has FDA approval. Otherwise, the femtech landscape remains the Wild West. For now, the onus is on users to scrutinize each app’s terms of service and privacy policies, and, if they choose to use one, to be critical about the conclusions they’re getting. “[These technologies] reinforce the idea that our bodies have to be quantified and analyzed,” says Ricks, who points out that the data collected is averaged out, creating a version of normal—and what happens when someone doesn’t see themselves there? “It encourages women to compare their own health against some sort of normalized standard,” she says.

Provided regulatory progress is made on a federal level, the future of health tracking holds promise. Women’s health is a global crisis, and the seeming lack of concern from the medical industry at large surrounding the issue can be terrifying. Simeone says the potential of femtech is undeniable, though whether it stands to benefit users as much as it does investors and developers remains a question. The market could be worth as much as $50 billion by 2025, according to one estimate. “But will it provide real value to fix what’s broken in an antiquated health care system or be another ‘shrink it and pink it’ marketing approach?” she asks. If femtech can aid in the fight to change this, perhaps the trade-offs will be worth it. But for now, while the products that define femtech are shiny and new and, in many ways, impressive, they can perpetuate old ideas. “Having complete control over ourselves and our bodies,” says Ricks, “is a false narrative we’ve always been told.”
