Culture

Danielle Keats Citron: Tech giants can’t ignore privacy violations

Deepfake videos and non-consensual imagery threaten our privacy—and free speech

November 07, 2022
Danielle Keats Citron. Credit: handout

Danielle Keats Citron, professor of law at the University of Virginia, is an expert on digital privacy. She was one of Prospect’s Top 50 Thinkers in 2015, is a MacArthur fellow and advises governments across the world on cyber hate crime. She is consulted by major technology players including Facebook, Bumble and Spotify, and has been a member of Twitter’s Trust and Safety Task Force since 2009. Her new book, The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age (Chatto & Windus), sets out her manifesto for cyber civil rights. She spoke to Lois Heslop in September about big-tech liability, the future of privacy legislation and her views on the free speech pushback. The conversation has been condensed and edited for clarity.

Lois Heslop: Right now is a pivotal moment for the future of online privacy, with the proposed Online Safety Bill here in the UK and talk of Section 230 reform in the States.

Danielle Keats Citron: What’s so notable about tech companies, content platforms and search engines is the lack of regulation. For any given industry, there may be a period when you have immunity from liability, because the industry is so young and you want to allow for experimentation. We forged this in the United States in 1996, when Congress passed the Communications Decency Act, which provided tech platforms with immunity for user-generated content. That act is now more than 25 years old. The biggest culprits are the dominant global behemoths like Microsoft, Meta [Facebook] and dating apps. They’re often headquartered in the United States, but they’re used everywhere, and they still enjoy immunity from responsibility.

LH: Laws only cover specific jurisdictions, so surely some voluntary self-regulation from big tech is necessary. Is there any incentive for tech companies to introduce their own mechanisms?

DKC: Their incentive, of course, is money. Likes, clicks and shares. On the other hand, Facebook got CCRI (the Cyber Civil Rights Initiative), my colleague Mary Anne Franks and me involved because they wanted their platform to be family-friendly. They were allowing kids 13 and older to use the platform—they’d banned nudity from the start. They’ve been pretty aggressive in helping victims of non-consensual intimate imagery, which has been a huge relief to victims. So we’ve had some progress, and I can’t say it’s because I was so persuasive—I would love to think so. But the truth is that it was in their interest, and that aligned with victims’ interests. I also think it’s good PR. Now that companies are seeing the EU move against hate speech and the Online Safety Bill in the UK, they realise they can’t ignore it.

LH: There is a lot of free speech discourse around the Online Safety Bill—and indeed when any government tries to intervene in online privacy. Do these arguments have merit?

DKC: So first things first: intimate privacy is a precondition to free expression and sexual expression. If we do nothing about intimate privacy violations, we know—from studies in the UK, Australia and the United States—that the impact of non-consensual intimate imagery is to silence people and chase them offline: they shut down all their profiles, they cut themselves off from loved ones, they become silent in their speech and expression online, and also in their relationships. Doing nothing undermines the free speech of intimate privacy victims. And even in the United States, you don’t have a right to hack into my computer. You don’t have a right to hide a camera in my bedroom. Nor do you have a right to hide a camera in a public bathroom.

It’s really important to realise that when we talk about free speech, we’re talking about the values behind free expression: the ability to figure out the society we want to live in. A nude photo of me without my consent? That’s coerced sexual expression. I think it actually opens the conversation to show the free speech cost of intimate privacy violations. I view myself as a defender of free speech.

LH: Do individuals share responsibility for looking after their own digital footprint?

DKC: That line of individual responsibility is fine in a system that holds perpetrators responsible. But it’s important to note that platforms are making money from abuse. They are themselves responsible for deliberately soliciting, encouraging and keeping up abuse that earns them money from our data. Individual responsibility means platform responsibility, in my mind. So why should it be platforms’ responsibility to have duties of care with regard to online harms? You say: why don’t we go after the perpetrators? In my experience of working with governments across the globe—in the United States, the UK, South Korea, Singapore—it’s because intimate privacy violations carry low-level sanctions, and perpetrators are not prosecuted. The criminal law is woefully underenforced all over the globe. As for civil remedies, it’s tough to get a lawyer to represent you when there’s no deep pocket, and victims don’t have a lot of money.

LH: Facebook was recently asked to hand over data relating to an abortion prosecution in Nebraska. Has the overturning of Roe v Wade made protecting privacy more urgent?

DKC: We have seen some market moves in response to the Roe v Wade decision, such as Google announcing that it would not track location around reproductive health centres. My response to that effort was: that’s not good enough. If you keep people’s searches related to abortion and reproductive health care, prosecutors just need to request the search data, so it’s a half measure. You would earn consumer trust by going further—61 per cent of Americans support abortion rights. There are opportunities in this moment, post-Dobbs, when I think we have the world’s attention. It’s even more important here because women could go to jail.

LH: In your book you cover the implications of deepfakes, which are getting ever more realistic and sexually exploitative.

DKC: I have had discussions with the UK Image Abuse Reform Commission about a comprehensive approach to intimate privacy that would include regulating deepfake sex videos. Rana Ayyub—an Indian investigative journalist who was targeted with a deepfake sex video—explained that it’s very hard to erase from her mind the notion that there are millions of eyes on her sexual identity, as if they’ve seen her doing the sex act in the deepfake. It feels life-ruining, especially when there’s a link to it in a Google search of your name. It can be hard to get and keep a job, and to feel safe, afterwards.

LH: How can legislators anticipate such rapid technological development?

DKC: We’ve got to get the conception of intimate privacy right. We often view these issues in isolation: video voyeurism, upskirting, non-consensual intimate imagery, extortion and deepfake sex videos. They’re all part of one phenomenon: the violation of intimate privacy. If you see them separately, you don’t see the harm in the aggregate, or the cost to intimate privacy, and so we need this comprehensive approach around the globe. We need robust protections that cover the corporate and the public spheres, and a comprehensive understanding of intimate privacy and the cost to civil rights and civil liberties, because then lawmakers see the fullness of the problem and the harm.

LH: Where do you think the priorities should lie in this area? When you are talking to legislators and to tech companies, what direction are you driving them towards?

DKC: First, minimise or stop the collection of intimate information. That creates a whole lot of opportunities to prevent abuse. If you don’t collect it, you can’t then be asked to share it with law enforcement, as Meta was in Nebraska, and you don’t create downstream problems like hacking and extortion. That’s the commitment to minimisation: you should never collect intimate information unless you need it to provide a legitimate product or service, and then, once you stop needing it, you delete it. The second is viewing platforms as the guardians of our intimate information, because anyone handling that data should have duties of care. There should be no sale of intimate information. You don’t sell it to advertisers. You don’t sell it to marketers or data brokers. It’s too important for human dignity. It’s too important for identity development.

LH: Critics are quick to shout “nanny state” over online privacy regulation. How do you respond to that, and help to craft laws that are in our legitimate interests?

DKC: We’ve been in a no-state zone for tech platform responsibility since 1996 in the United States, which has impacts across the globe, because anything that’s hosted in the US can be viewed outside the US, and so immunity from responsibility here is immunity from responsibility everywhere. What we’re seeing develop isn’t so much a nanny state; it’s that we have a broken market, and we have a lot of harm, especially with intimate privacy violations and cyber-stalking. There are 9,500 sites whose whole point is the hidden-cam phenomenon, revenge porn and similar non-consensual content, and they’re immune from responsibility. In the UK, if the Online Safety Bill passes, it will introduce liability beyond defamation, covering harms like intimate privacy violations. I don’t think the bill is perfect or effective enough with regard to its injunctive relief, and of course it hasn’t passed yet.

LH: With organisations like the Facebook Oversight Board, we are starting to see social media companies realising their responsibilities as publishers, and not just as platforms.

DKC: I hope that it sparks even more of a commitment to duties of care in terms of combating intimate image abuse. When people reach out to me from different countries, lawmakers and policy folks, I’m like: ‘I’m here, how can I help!’ I do think we’re seeing governments respond. Australia’s got an amazing eSafety Commissioner, Julie Inman Grant. There will be countries that are outliers—Russia may host a lot of intimate image abuse; Russia doesn’t care about free speech. But I think we’ve got some tech company attention, so let’s seize the day. I hope the world comes to look different, protecting intimate privacy as a civil right, rather than the dystopian alternative. It’s one thing when you can see problems—it’s another when algorithms collect the information and you don’t know about the jobs you never get interviewed for, or why your life insurance premiums are rising. Providers are not going to explain to you that it’s because of your intimate information.

LH: Do you feel from your position that it’s all going in the right direction?

DKC: I wish I could say so without question. We’ve been through a period of tumult in which the lives of women, girls and minorities have been less valued, and in which hate speech that fuels violence against women was seen as acceptable in the Trump era. We took a lot of steps backwards. At the same time, we were busy passing laws around the country to criminalise non-consensual pornography, so I have a complicated answer, because I’ve seen complicated countervailing trends. One of these is the social acceptance of hateful abuse and the seeming cultural normalisation of violations online.

The downside is that online privacy violations have increased. That is down to the excesses of our laws around legal immunity. We’ve also seen lawmakers step up to the plate. We have a federal bill in the United States, with bipartisan support, against non-consensual pornography. It’s narrow. It should be broader. It should be an Intimate Privacy Violation Bill. So my answer is: we’re a work in progress, across the globe.