Caroline has spent over a decade calling for more data to be collected on women – but what happens when that data gets used against us?
“A monumental day in America with the Supreme Court overturning Roe versus Wade.”
Fox News
“Never before has the court granted and then taken away a widely recognised constitutional right.”
NBC News
“50 years’ worth of women’s rights in America overturned in an instant.”
Channel 4 News
Caroline Criado Perez, narrating: Watching from the UK, the overturning of Roe v. Wade was, well, frightening. There’s no other word for it.
“Tens of millions of women across the US do not have their right to an abortion guaranteed by the constitution.”
“Abortion rights activists are raising concerns that your phone could soon help incriminate you.”
ABC News
Caroline, narrating: I think for some people, men especially, it’s hard to understand why women might find a legal decision made over 4,000 miles away frightening. So let me explain.
For me, this legal decision showed how fragile our rights are. Because of the Supreme Court judgement, women in the United States no longer have a constitutionally protected right to bodily autonomy. It’s frightening to see how easily women’s rights, even those that have existed for 50 years, can be rolled back. It was a reminder that we can never be complacent, that we can never relax. We can never take our rights for granted. We can never assume progress only goes in one direction. And that is what’s frightening.
For women in America, the implications of the Supreme Court decision are huge. It will result in an increase in unsafe abortions. Desperate women, and let’s not forget children, are going to die. There will also be an increase in deaths from pregnancies gone wrong, an ectopic pregnancy, for example, or a septic miscarriage. In countries where abortion is illegal, doctors are reluctant to wade into this grey area, and so women are also going to be denied life-saving treatment. Many of us have known about these inevitable outcomes for a long time, but there’s one deeply insidious consequence of overturning Roe v. Wade that you might not have thought of. I know I hadn’t.
It centres on women’s privacy.
A few days after the decision was handed down by the Supreme Court, I got on a call with Patricia and our executive producer, Basia, to talk it through.
Basia Cummings: I found it fascinating that one of the first things that took off on the internet post the Roe v. Wade thing was loads of women telling each other, delete your period tracking apps, which seemed immediately to me like just fear that wasn’t based in facts. How could they use period data against you?
Patricia Clarke: They totally can. And I read something, I’ve just been doing all this research. Here we go. In recent years, several federal law enforcement agencies, including US Customs and Border Protection and Immigration and Customs Enforcement, have purchased smartphone app geolocation data from data brokers like Venntel and Babel Street without a warrant or court order.
Basia: So one of the things that you could do, Patricia, is go to all of those companies mentioned and ask them whether they have a policy on compliance with any kind of reproductive lawsuits – abortions, anything like that.
Caroline, narrating: I’m Caroline Criado Perez, and this is Visible Women, my weekly podcast from Tortoise, investigating how we finally fix a world designed for men.
For years now, I’ve been calling for more data on women. And I stand by that call. We need data on women’s bodies and women’s lives, medical data, economic data, car crash safety data. The gender data gap is a clear injustice and I remain committed to fixing it. But for a while now I’ve been worrying that there might be a darker side to my beloved gender data. And the question that’s been bothering me is this: what happens when our data is used against us?
This episode was a really disturbing one to research. Our findings were pretty shocking and kind of dystopian. And it’s made me look at my phone in a whole new way.
Now, for almost as long as I’ve been banging on about the gender data gap, I’ve been receiving pushback from some feminists who want me to just stop talking about sex differences, because they fear that men will use these differences against us. And it’s not an unreasonable objection. Historically, the female reproductive system has been used as an excuse to keep women out of the workplace, to keep us out of sports, to keep us trapped in our homes. My position on this has always been that, yes, I acknowledge that danger, but pretending women are exactly the same as men is dangerous. It’s literally killing women. So since, in a misogynistic society, it feels like maybe we’re kind of screwed either way, I’d rather know about my body and live in a world that recognises that it exists.
But there’s another issue to consider when it comes to gendered data. It’s one that was brought into sharp focus for me by a Wired magazine article I read back in February. The article was about an algorithm developed in Argentina that claimed to be able to correctly predict teen pregnancy five years before it happened. The algorithm was allegedly trained on data collected by government agents who visited the homes of teenage girls. The majority of these girls were living in poverty and many of them were migrants and from indigenous families. It’s not clear how they consented to the collection of their intimate data, which included photos and GPS locations, or if they could even refuse. Now, of course, there are many good and entirely non-creepy reasons to collect data on teen pregnancy. This same article pointed to the work of a feminist organisation, which also used data on teen pregnancy. They used this data to highlight the high number of teen pregnancies that result from sexual violence and to campaign for women’s rights, including the right to safe abortion. But the teen pregnancy algorithm was more sinister. It later transpired that it was funded by an Argentine nonprofit, which is run by a vocal opponent of abortion rights.
There’s one person who’s really helped shape my thinking recently when it comes to privacy, Danielle Citron.
Danielle Citron: Data is power, right? Knowing about us, knowing about our intimate lives, about what interests us, what turns us on, what we search, who we love, who we want to date, all of that is power to somebody else. And so data is fundamentally fraught with so much possibility and so much flourishing. And at the same time, in the wrong hands, it can be a weapon.
Caroline, narrating: Danielle is a Professor of Law at the University of Virginia and the author of a soon-to-be-published book called The Fight for Privacy. I have to admit, before I read it, I hadn’t really understood privacy. At least I hadn’t understood how important it was for women.
When I think about privacy, I always think of Catharine MacKinnon, another law professor. One of the things that really stuck with me from her work is her description of privacy as ‘a right of men to be let alone to oppress women one at a time.’ And historically speaking, it’s hard to disagree with her.
Over the 19th and 20th centuries, the US legal system repeatedly refused to prosecute men for beating their wives, on the basis that these men had a right to privacy in their homes. Well into the 1980s, some US states were allowing rape in marriage on the basis of family privacy. Rape in marriage wasn’t fully criminalised in the US until 1993. England and Wales got there in 1991. And in the debate to outlaw this form of rape, once again, lawmakers raised concerns about preserving the privacy of marriage. It’s fair to say privacy doesn’t have a great rep when it comes to women.
Danielle: Men’s privacy counted and always counted more. And I think it’s because we didn’t ever see women’s privacy as an interest worth protecting. And I’ve long wanted us to protect it.
Caroline, narrating: Danielle writes in her book about all the ways women’s intimate privacy is so specifically at risk. One of the most obvious present-day examples she gives is so-called revenge porn, the non-consensual digital sharing of intimate images. But perhaps the biggest issue is health data. And for me, health data is perhaps the greatest point of tension between Danielle’s work and mine. Because while the data gap on women’s bodies is such an urgent one to address, Danielle’s made me realise that we need to be really careful in how we collect that data. What I’m talking about is apps. The way women use technology has undergone a huge shift in the last decade. Period tracking apps, fertility tracking apps, health and fitness apps, smart watches…
Danielle: We innocuously, all day long, share information about ourselves: our search results, our temperatures, our physical symptoms. All the ways that we share information with just daily routine tools – our Fitbit, our period tracking app, our search engines, our phones – they tell a story about our vulnerabilities. They tell a story about some of our chronic diseases, which mean we will likely miss some work. Together, it’s a pretty complete picture of your intimate life.
Caroline, narrating: In the US, your medical records are protected by a national standard called the HIPAA Privacy Rule. But medical information you enter into an app – like the missed period you logged on your menstrual cycle tracker, or a map search you did to find the way to the gynaecologist – isn’t protected by that rule. This is a particular problem for women, because women are just much more likely than men to use health apps. In fact, Danielle says women and girls are 75% more likely to use health apps than men and boys.
Danielle: My understanding is that one-third of all American girls and women are using some form of health app or Fitbit. They’re sharing health information in ways that are super unprotected, and men and boys much less so. Those kinds of numbers are true in the UK and elsewhere.
Caroline, narrating: You might be thinking, “Okay, Caroline, so these apps are collecting my data. So what? That’s how it works. I get to use a free app and they get my data. I’m pretty happy with that trade off. I don’t have anything to hide. I just want to be healthy.”
Danielle: What is interesting and should trouble all of us is that data brokers – there are over 4,000 such companies – their business model is dossiers on each and every one of us, and that includes UK citizens. And they have like 3,000 data points on all of us. What data brokers are tracking and tracing includes things like ‘rape survivor’, ‘miscarriage patient’. There are data brokers devoted solely to health. They score us and rank us based on our likelihood to develop certain chronic diseases.
Caroline: How do they know that I’ve had a miscarriage, for example?
Danielle: Because of your online searches.
Caroline: Right.
Danielle: They’re not sacred. Those searches are sold to marketers and advertisers.
Caroline: I feel kind of sick listening to this because I’ve had a miscarriage and I did do lots of searching when I was trying to figure out if it was happening. And I feel kind of sick. Sorry.
Danielle: No. And it’s the most sacred to you. Like, you think, this is between me and the search engine, and me and the person I love, and my family members, people I’m close with – and no one should know this. And yet a data broker is selling it, for each and every one of us who’ve had health conditions and miscarriages and terminations.
Caroline, narrating: To give you an idea of how accessible this data is, in May this year, a journalist for the US news site, Motherboard, bought a week’s worth of location data for 600 abortion clinics in the United States for the princely sum of $160. This location data covers things like how often people visit, how long they stay there, where they came from and where else they go. It can even point you to where someone lives based on where their phone is overnight. This kind of data has already been used to target specific people. A Catholic publication used location data to out a gay priest. A few years ago, The New York Times ran a couple of special investigations proving it could unmask several individuals, including the US President, using supposedly anonymous location data it had obtained from data brokers. And it’s not just journalists. Employers, insurers, they’re looking at this data too. In fact, pretty much anyone can access this data. You could access it if you wanted to.
Danielle: So there are companies, there are businesses, whose work is finding and sifting through resumes, conducting interviews. And they’re making recommendations to employers, sifting through all these resumes, about who to hire. And law in the United States doesn’t touch those folks. So yes, an employer can’t ask you about your genetic information under the federal GINA, the Genetic Information Nondiscrimination Act. And employers can’t use your credit report against you. But when they outsource employment tools and those employment tools are using your information in ways that have a discriminatory impact, Title VII doesn’t apply to them.
Caroline, narrating: I’ve actually come across these kinds of companies before. When I was researching my book, Invisible Women, I read about a recruitment platform called Gild, which has since been bought by the hedge fund Citadel. Gild’s software combed the internet for applicants’ ‘social capital’. That’s things like what sites they visit and how they contribute to those sites. And they fed this data into a shiny algorithm that promised to find you the best candidate. I’d also become kind of obsessed with a company called ZipRecruiter that I kept hearing about from a podcast ad. It promised its algorithm would ‘find a quality candidate for you in minutes.’ At the time, I was mainly concerned that the gender data gap would result in these companies contributing to discrimination against female applicants. But now I was worried about what happens when they do have data on women: that data, too, could be used to discriminate against female applicants.
I wanted to understand how these companies were using data to make decisions about potential job applicants. So I got on the phone with Hannah, my producer.
Caroline: … and I don’t need to worry about data brokers selling that information to a potential employer who’ll decide that they don’t want to hire me because I’m someone who has miscarriages and I might need to take time off work because I have a miscarriage. But another woman might well have to worry about that.
Hannah Varrall: Yeah, no, that was really shocking, wasn’t it?
Caroline: Yeah. So I’d definitely like to know from Gild and ZipRecruiter if they use health data – if they buy health data. And if they’ll tell us what kind of data they buy.
Caroline, narrating: There is a desperate irony at the heart of this. And it all starts with the gender data gap. The gender data gap means that women are less likely to get the answers they need from their doctors. We know this. Women are systematically underdiagnosed, standard treatments don’t work as well on us for a whole range of diseases, and health conditions that are female-dominated are chronically under-researched. And so women turn elsewhere for answers.
Historically, this might have been the holistic healer on the high street. Today, it’s more likely to be the internet or an app. And here’s where the irony comes in. Many of these apps present themselves as friends to women. They talk about the lack of data on female bodies and tell women, “By inputting your data into this app, you are helping address that problem. You’re contributing to science.” It all feels hugely empowering. And you know, it could be. Many of these apps, like the period tracker Clue, for example, do in fact collaborate with scientists. Our knowledge about the female body is already better as a result of the data they’ve collected on millions of women around the world. But most of these apps are not our friends.
Caroline: Why is fem tech so bad at privacy?
Danielle: Because it has to make money and the model is data.
Caroline: So do you feel like they shouldn’t be collecting this data and we should stop this scientific research until we fix the privacy issue?
Danielle: Yes.
Caroline: Right.
Danielle: And in part because if we’re going to do research, do research. Research is covered by IRBs – institutional review boards. Research is covered by HIPAA protections. So we’re going to research endometriosis? Let’s do it! We can. We have all sorts of medical research going on all the time.
Caroline: Yeah. But they can’t get the funding for endometriosis.
Danielle: Right. I know that these companies are sharing it with medical researchers, but it’s a side hustle. That’s cover. And their real money is being made from the sale of our data, the location, to data brokers, advertisers and marketers. So I am all for medical research, absolutely. But not as cover for exploitation.
Caroline, narrating: Ultimately, the price of advancing female health shouldn’t be our privacy. It isn’t for men, whose bodies are already well catered for by traditional medical research.
Danielle: My girls are in their twenties, early twenties. And so they want to use it because it’s convenient and it helps them figure out when they’re getting their period.
Caroline: I use it.
Danielle: Right? It’s important. It’s helpful. I hate the fact that I have to say to you, “Caroline, don’t use it.” I hate that. I don’t want to have to say that to you. I know that it’s helpful. I see the upside, viscerally. I just hate that it’s fraught with so much risk. I want us to be able to share information in ways that are profoundly helpful for human beings, but shorn of the costs.
Caroline, narrating: All of this is about to get a lot scarier in the US. And that’s because privacy was at the heart of Roe v. Wade.
Danielle: The Supreme Court in 1973 finds for the first time that women have a liberty interest to decide for themselves, in the first trimester of pregnancy, if they want to terminate a pregnancy. That it’s between them and their doctor. And the Court isn’t really clear about what part of the constitution it’s relying on. It mentions the First, the Fourth, the 14th Amendment’s due process clause. But it ultimately says a woman has a right to privacy of decisions, a liberty right, to make decisions about herself, about her family, about her reproductive life.
Caroline, narrating: So the whole right to abortion in the US relies on this concept of the right to privacy, which the Supreme Court has just ruled does not exist.
Danielle: That decision said Roe was wrong on the day it was decided.
Caroline, narrating: The decision she’s talking about is called Dobbs versus Jackson Women’s Health Organisation. I asked data journalist Patricia to give me a rundown of the case.
Patricia: In 2018, Mississippi tried to ban abortion after 15 weeks. And the Jackson Women’s Health Organisation, which is Mississippi’s only abortion clinic, took them to court because they said that there was existing law and it was constitutionally enshrined that women had a right to abortion before 24 weeks. And that went back and forth and kept going up the courts and eventually reached the Supreme Court, and that is the ruling that we all now know about.
So what does that mean? It’s basically overturning two really important abortion rulings. One is Roe v. Wade, which found that women have a constitutional right to privacy and therefore a right to abortion. And the other is Planned Parenthood v. Casey, a 1992 case that sort of underlined that. It said that before 24 weeks, so before more or less foetal viability, it was an undue burden on women to have their rights blocked in that way. But this Supreme Court has found… it is basically a reading of the constitution. They’re saying that’s not a fair reading of the constitution – abortion rights aren’t constitutionally enshrined at all. So we should be reading it as it was written in the constitution, which was in… 1868 was when the 14th amendment was passed.
Caroline: Right. And what exactly did the 14th amendment say?
Patricia: Basically, it’s called the due process clause. So it’s a woman’s right to due process under the law. So it basically means that the state can’t deny you the right to life, liberty, and property without due process.
Caroline: Right. So Roe v. Wade and Casey were saying that the right to abortion comes under that 14th amendment because it is the right to liberty. And the Supreme Court is saying no, because in 1868, that’s not how they would’ve interpreted it.
Patricia: Yeah, exactly.
Caroline: Wow.
Danielle: And the Court said ‘ordered liberty’ wouldn’t have included abortion, because abortion was a crime. We’ve got to look at what was protected in 1868. And you want to know what? In 1868 women couldn’t vote.
Caroline, narrating: Other rights US women didn’t have in 1868 included the right to control their own earnings and property, to practise law, to have equal ownership over their children, and, as we heard earlier, the right to not be raped in marriage and to not be beaten by their husbands. That same year, 1868, the year that the 14th amendment was passed, the year that the Supreme Court wants us to return to, another one of those judgements was handed down where a court decided to prioritise a man’s privacy over the rights of the wife he was beating.
Voice actor: We will not inflict upon society the greater evil of raising the curtain upon domestic privacy to punish the lesser evil of trifling violence.
Caroline, narrating: The Supreme Court of North Carolina acknowledged that the violence would have been classed as battery if the victim was not the defendant’s wife. But since she was his wife, they instead classed it as…
Voice actor: Trivial complaints.
Caroline, narrating: And you wonder why I’ve historically been suspicious of privacy?
I feel like my position on privacy might have to adapt in this strangely old new world. I have a sneaking suspicion that apps with location trackers do not mix well with 19th century attitudes to women. And if that’s the case, then we need to know two things. How easy is it for women’s data to be used against them? And should women really be deleting their period apps?
Caroline: Do you think that they’re right to be concerned?
Danielle: I think absolutely. I think people who are like “Get over yourself” or “You female academics are overreacting” – I’ve seen women from the security perspective saying, “Your fertility tracking app isn’t the problem, it’s your cell phone.” Well, you need a warrant for your cell phone and you need a warrant for the content of your communications. You don’t need a warrant, you don’t need a subpoena, to buy my period tracking information, my location information, from a data broker. So I’ve got to say I’m worried about any collection and promiscuous sharing of our intimate information. And these products and services are built in the United States, so what happens in the US matters for people abroad, even if you have more protections than we do.
Caroline, narrating: As well as asking Hannah to look into those recruitment companies, I’d also tasked her with contacting some data brokers to find out what kind of data they were selling and what policies they had in place to protect sensitive data, especially in light of the Supreme Court’s decision on Roe v. Wade.
Hannah: Hiya, I’ve had a response from Oracle, one of the big data companies that we messaged. I emailed them 10 questions about their data and where they get it from, who they sell it to. Is it health data? What proportion is women and men? Do they have any intention of revising their policies to protect women’s data following the overturning of Roe versus Wade? And I’ve got an email back saying, “Hi Hannah, thanks for your email. I’m afraid we are unable to participate in this podcast. Thanks for your interest in Oracle and good luck with the programme.” Just as if we’d invited them on as a guest to have a nice chat. So that’s an interesting reply.
Caroline, narrating: SafeGraph, the company the Motherboard journalist bought all that data from, sent us a statement about its commitment to data privacy. But, I mean, last year they actually got banned from the Google Play Store. So make of that what you will.
Hannah: So Caroline, I tried contacting Citadel who bought Gild and I’ve just had nothing back from them at all. So I don’t even know if they still use Gild’s technology, if they still purchase data, or what’s going on.
Caroline, narrating: There was also a response from ZipRecruiter, who like Oracle, seem to have got the wrong end of the stick.
Hannah: “Thanks for reaching out. We’re going to pass on this at this time,” as if we’ve invited them for a cup of tea and a chinwag. I think they have a responsibility to answer our questions, but obviously they don’t seem to think so.
Caroline, narrating: Patricia also didn’t really get anywhere with the fem tech companies.
Patricia: I reached out to the top 10 biggest fem tech companies by revenue and user base. I asked them what their policy was on sharing health and or fertility data with third parties. It’s been a week now. No, it’s been longer than a week, it’s been nine days now. And just four have got back to me despite numerous emails.
Caroline, narrating: Clue said that they didn’t sell data to third parties because their business model is based on membership. And that sensitive data is in any case protected under European law. But I spoke to a lawyer about this and they told me that European privacy law, GDPR, couldn’t stop a company from giving data to law enforcement if they had a warrant or subpoena. Fitbit said they were owned by Google. And well, we know that Google’s business model is data.
Patricia: And I think my favourite response was from a company called Glow, which I hadn’t heard of before. But they gave this weird short paragraph saying, “Thank you for reaching out. We will continue to uncompromisingly protect our users’ privacy and personal health information, period.” which I think is a period joke. “Our number one goal is to build the best products for our users, blah, blah, blah.” But didn’t answer any of the questions. When I insisted twice that if they really cared about this… “That’s so great that you care about this, but can you answer my questions?” They didn’t. And I’ve been chasing them for a few days now. So a real mix. Some companies obviously committed, most ignoring the issue. I should add that Flo have also said that they have launched a new feature called anonymous mode, which allows users to use the app without any personal identifying information, which to me suggests that they probably are worried that there might be some risk with their current model.
Caroline, narrating: The day I spoke to Danielle, period tracker apps were engaged in a flurry of social media posts promising women their data was safe.
Danielle: Two reasons why it’s a false promise. A, they say the data is anonymised. That’s just false the moment the words come out of their mouths, because it can easily be connected to other information that’s going to re-identify it, especially the location. You get that person in four hops for location. Because I go home a lot. We all do. Right?
Caroline: Right.
Danielle: So the second thing is you can’t say no to a subpoena or a warrant that is upheld by a court.
Caroline, narrating: If you still think we’re catastrophising, I have bad news for you. I haven’t even begun to explain how dark things could get. Because we haven’t yet discussed the fact that even before Roe was overturned, women in the US were being prosecuted for miscarriage – cases where the state suspects that a miscarriage was in fact an abortion. According to data collected by a US nonprofit, between 1973 and 2005 there were 413 prosecutions for miscarriage in the United States. That’s about 13 per year. Between 2006 and 2020, there were over 1,200 prosecutions for miscarriage. That’s nearly 86 per year – a staggering increase of nearly 600%. And this is only going to get worse now that abortion has either been banned or is under serious threat in more than half of US states.
Danielle: Think about it this way. Angry ex goes to law enforcement and says, “My girlfriend, she got an abortion in violation of the state law. Here’s a text she sent me saying she was going to the doctor.” Then the officer says, “Okay, we’ve got the beginning of an investigation here.” Buys from a data broker her location information that shows that she went to a clinic and then returned home. That is helpful circumstantial evidence. And then you’ve got somebody saying, “And she got her period,” certainly enough for indictment.
Caroline, narrating: In the course of my research, I came across four cases where digital data has already been used to prosecute women for abortions. One of those women, Purvi Patel, was sentenced to 20 years in a US prison for “neglect of a dependent and foeticide” after she took abortion pills that she bought online. I asked Patricia to see if she could find out anything more about this case.
Patricia: So obviously the court ordered all of the text messages to her friend, which are central to the case. But then in addition to that, they’ve got some search history and email history. So they’ve got – she had a customer service email from internationaldrugmart.com, where they found that she’d purchased two drugs that are meant to induce abortion. So all of that was used as evidence against her. But she eventually won her appeal, and had she not, she would’ve been the first woman ever to be charged, convicted and sentenced for foeticide. But I’m interested in, I guess, the precedent it sets, and the fact that it is an example of a woman whose digital footprint was used against her in a case that related to foeticide, even if it was then overturned. I find it quite chilling.
Caroline: No, it absolutely is, especially in the context of a post-Roe v. Wade world. And in the context of a country where we know that women are already being prosecuted for having miscarriages. So this is only going to ramp up. And the idea is really frightening. Like, I had a miscarriage two years ago, and it was absolutely devastating. And to think that I would have had to worry about the state trying to prosecute me after that makes me feel sick.
Patricia: Yeah. It’s completely dystopian.
Caroline: But then there is also the bigger issue of, it’s not just law enforcement who have access to this data, it’s pretty much anyone who wants it. And what we were talking about, for example, bounty hunters…
Caroline, narrating: Yes, I did say bounty hunters there. This is a whole new horror in America’s war on abortion. And it’s all thanks to a new law in Texas which bans abortions after a foetal heartbeat is detected. This usually happens around six weeks, which is before most women even know they’re pregnant.
Roe v. Wade was still in force when this law was passed, but Texan lawmakers got around it by allowing people to bring civil cases against abortion providers, or anyone else who helps a woman get an abortion. Anyone who gives her a lift to the clinic, for example. The person bringing the lawsuit is entitled to at least $10,000 in damages if they win their case. Last year, the Associated Press reported that Texas Right To Life, the state’s largest anti-abortion group, had launched a website where people could post tip-offs. And the group said it had attorneys ready to bring lawsuits. The concern is that people who want a relatively easy 10 grand might buy women’s digital data in order to find victims. And remember, a journalist purchased a whole bundle of data for $160. I’m no financial advisor, but I feel like 10 grand is a pretty solid return on that investment.
So how do we fix all this and stop women’s data being used against them? Some women have tried to confuse the apps by inputting fake data.
Danielle: Doesn’t work. No, it’s a joke, because your phone has an advertising ID and mobile device numbers, so the app knows who you are. So not giving your correct name, lying about your birth date, lying about your phone number and then giving it half information – I can imagine what folks were doing to fool the app. And we’ve all tried that strategy. What you’re trying to do when you don’t tell the truth about your birth date is to make your choice meaningful, because you’re like, “I don’t think you need to know this.” That’s your way of making a choice. I guess my answer is, I want you to try it, but it doesn’t work.
Caroline, narrating: So what does work? Danielle thinks that what we need is a civil right to privacy.
Danielle: Now, a civil right in the American modern civil rights tradition means that we have a moral right to something. It means that this right is so important and these obligations are so important that they can’t be traded away for profits or efficiency. It means that we see this right as fundamental to flourishing.
Caroline, narrating: This sounds good, but I’m not completely convinced because there’s a double standard when it comes to the willingness of law enforcement to even look for digital data in the context of a prosecution. They seem to be all too happy to use it against women, but when it comes to things like so-called revenge porn, suddenly, it’s all…
Danielle: “It’s a misdemeanour. We can’t get a warrant. We can’t figure out who this is. We asked the guy, he said it wasn’t him, sorry.” It’s a computer crime, so they don’t know where to begin. Getting a warrant for an online service provider, going to the ISP – they’re befuddled.
Caroline: I’m sorry to interrupt for a second.
Danielle: Yeah, no, of course.
Caroline: I’m really blown away by this because when it comes to prosecuting women for abortion, they don’t seem to be befuddled-
Danielle: No befuddling.
Caroline: By digital data.
Danielle: That’s right. They have figured that out fast.
Caroline: There is no, “She said it wasn’t her. She wasn’t using her phone.”
Danielle: Yes. Like they don’t say that. They call it the ‘some other dude’ defence, like it wasn’t me, it was somebody else. Like, it wasn’t me. They’re like, they don’t care. They’re like, of course you did it.
Caroline: Yeah. But you’re saying that for online harm perpetrated by men-
Danielle: “I got hacked,” literally is the response. The person says, “It wasn’t me. Someone must have used my computer. Someone hacked me.” And cops are literally like, “Sorry. He said he was hacked. It wasn’t him.”
Caroline, narrating: So if the law doesn’t really care about women, why will more law help? Well, Danielle says she’s not really focusing on the criminal law, in part precisely because of this issue of the unequal enforcement of the law.
Danielle: The changes in law that I want to see are directed to companies. A civil right to intimate privacy would mean that companies and government entities collect less, are forbidden from sharing or selling in certain circumstances, and have duties of loyalty and commitments of anti-discrimination. They would be viewed as the guardians of our intimate information, just in the way that online platforms would.
Caroline: Does the overturning of Roe v. Wade make the rights you are calling for more or less likely to happen?
Danielle: I think it makes the issue front and centre in a way that – I hate that this has happened, but our intimate privacy, the choices that we make about our bodies, including information and access to them, and now also our health, is so on the line that I think there is a moment of reckoning here. And I think it’s an important time to start campaigning around intimate privacy. And we’re starting to do that – the Cyber Civil Rights Initiative, in conjunction with other advocacy groups, bringing to the fore not only decisional privacy or decisional liberty for women, but all of these broader issues of intimate information. So I’m hoping to see the fight for privacy fought in state legislatures and at the federal level.
Caroline: So that makes it sound as if you are optimistic.
Danielle: I’m always a bit of a Pollyanna.
Caroline, narrating: I started off this podcast feeling really determined to close the gender data gap for women by any means necessary. And I’m still determined. Our failure to collect data on women’s bodies and women’s lives is killing us. But Danielle has made me see that in my determination to fix this enormous problem as quickly as possible, I’ve been overlooking the fact that not everyone sees the gender data gap in the same way that I do – as a huge failure of science that needs to be addressed to stop women dying from preventable causes.
Instead, some people see it as an opportunity to exploit women. Some people see it as an opportunity to make money. This makes me angry. I’m angry that anyone would exploit what is a life-threatening and life-limiting reality for women everywhere. I’m angry that women are in this position in the first place. But I also feel grateful to women like Danielle, not only for pointing out these sticky, knotty issues, but also for their commitment to fixing them, so we can get back to closing the data gap for women without having to pay the price of our privacy.
Thanks for listening to this episode of Visible Women from Tortoise. If you’re a Tortoise Plus listener on Apple Podcasts or a Tortoise member, listen out for a bonus episode coming on Friday. Visible Women is taking a short break after this episode, but don’t worry, we’ll be back soon with much, much more.
This episode was written and produced by me, Caroline Criado Perez, alongside Hannah Varrall and Patricia Clarke. The executive producer is Basia Cummings. It features original music by Tom Kinsella.