Technology

Laura Bates: Billionaires are hijacking AI—and we should all be furious

The activist and author on her latest book, how powerful algorithms are fuelling the backlash against feminism, and what we can do about it

July 23, 2025

Ellen Halliday, deputy editor: Your new book [The New Age of Sexism] looks at the ways in which misogyny is being embedded in tech and the ways in which technology reveals where misogyny exists and how it operates. Why this book now?

Laura Bates: Because it feels so urgent. We are at a precipice. We are on the edge of an AI revolution that is going to transform our society in ways we can’t quite yet grasp.

This is a unique opportunity to act, to shape what that new world will look like, who it will affect, and how. If we don’t seize this moment for regulation before these [technologies] are rolled out, then we will be in a mess where we are, for decades, trying to gradually unpick the foundations. If the foundations of that future world are systemically misogynistic and racist, it will be really difficult to row back from that later.

Alona Ferber, senior editor: It’s not an easy book to read. Was there anything that surprised you as you were reporting the book?

Laura: In the US, AI tools have been shown to downgrade the number of black patients identified as needing extra care by more than half. That particular tool affects 200 million patients. That was shocking.

A significant proportion of big companies are using AI tools in their recruitment processes, even though those tools have been shown to actively discriminate against women and marginalised groups.

There were also elements of the ways in which technology is being used to enable sexual abuse, stalking in particular, that are really scary as well.

Alona: You open the book with a chapter about deepfakes. Women are being victimised en masse by deepfake pornography and nobody’s talking about it as an urgent issue.

Laura: Yes, there’s a sense that the risk from deepfakes lies in threats to male politicians and to potential democratic erosion, which is, of course, an important future threat to think about.

But it’s so frustrating when you see, for example, the Europol report on how we should police and tackle deepfakes: it’s 22 pages long, mentions women once and has two small paragraphs on sexual deepfakes, when the reality is that 98 per cent of all deepfakes are pornographic in nature and 99 per cent of those feature women, including women in politics.

Thirty female politicians in the UK had hundreds of these images and videos made of them and put on a website where they were viewed 12.6 million times. There’s a threat to democracy that’s happening now.

Ellen: What are big tech companies doing to protect women and minorities?

Laura: There has always been a dramatic gap between the things that these big tech companies say and the things that they do when it comes to safety.

There was a study recently that found that if you start a new TikTok account in the name of a teenage boy, it takes less than 30 minutes before the first piece of extreme misogynistic content will be promoted into your feed. You don’t have to go looking for it. It will come to you. And yet, I’m quite sure that TikTok will say “there is no place for misogyny on our platform”. 

Ellen: Is that the responsibility of government or does it have to come from within these tech companies themselves?

Laura: It’s never going to come from within tech companies themselves. We have seen repeated cases where tech companies have been aware of a problem, they’ve been aware of a potential fix, and they’ve chosen not to do it because, essentially, it would negatively impact engagement, which is the holy grail of every tech company because engagement is what leads to advertising revenue and profit.

The only way that this is going to change is if we have regulation that comes from governments, ideally working within a global framework for regulation.

That doesn’t mean stopping innovation or progress. It just means bringing tech in line with, frankly, every other sector where the idea of common sense regulation at the point of release of products to the public is considered completely normal.

The only reason that we baulk at this is because the tech sector has always played by different rules.

Ellen: Are there hints of any further legislation or discussion around regulation in the UK?

Laura: There are meaningful conversations happening around deepfake abuse specifically. But it’s very disappointing that earlier this year at the AI [Action] Summit in Paris, when around 60 countries signed an agreement saying that AI should be safe and ethical, the US government said that they would refuse to sign it, [and] the UK government swiftly followed suit.

There was a period when the UK government indicated that it would be prepared to consider watering down some of the terms of the Online Safety Act, in order to secure a favourable trade deal with the US.

Trump is currently pushing for a kind of 10-year moratorium on all state regulation of AI in the US. It’s a worrying trend.

Ellen: You work with parents, with teachers, with children. Can you talk a bit about the conversations that you have in schools with young people?

Laura: When I’m working in schools with teachers and parents, one of the challenges is that we are currently experiencing a unique moment in history, where a generation of non-digital natives are parenting and educating a generation of digital natives.

When we talk about online porn, for example, we are not talking about an online version of what might have been the FHM [magazine] or Nuts or Zoo centrefolds of those adults’ teenage lives. We are talking about an online landscape in which pornography is frequently depicting sex as something violent and abusive that is done by men to women, whether they like it or not. When I work in schools, it’s not unusual to hear teenagers say things like “rape is a compliment really”, “it’s not rape if she enjoys it”, “it’s not rape if he’s your boyfriend”.

The backlash against the idea that we should talk to young people about these issues is ludicrous, in light of the Children’s Commissioner’s findings that the average age at which young people first see online porn is 13. We know that young people are bombarded by extreme content online every day, many of them before the age of 12.

Are we going to stick our heads in the sand and refuse to deal with that? Or are we going to give them such a robust, wider understanding, in an age-appropriate way, of what healthy relationships and boundaries look like, so that we take some of the power away from misinformation that they will inevitably come across online?

Alona: Do you feel like things have changed for the better in any way since you started your work?

Laura: There are, of course, glimmers of hope, and there have been positive changes. There’s always been a sense in which progress is followed by backlash: two steps forward and one step back. But never before in history has that backlash been facilitated by immensely powerful algorithms. We’re seeing opinion polling suggesting that the most outdated, regressive, conservative attitudes towards women and their bodies are most likely to be held by the youngest cohort of men surveyed.

Ellen: You say towards the end of this book that writing it made you angry. But what gives you hope about the future?

Laura: Well, I think optimism and hope are linked to the anger. Women’s anger has been stigmatised for centuries. But we are on the brink of something akin to the Industrial Revolution. We should all be absolutely furious that it is being hijacked by this incredibly small number of unelected billionaires, using it as their personal playground to enrich themselves further, at enormous cost to the planet [and] to the most vulnerable communities.

If enough of us get angry enough, I feel hopeful that there is still power in democracy, in collective protest, in calling on our elected representatives to recognise that they have the power to act and that we want them to regulate now before it’s too late.

And seeing the efforts of incredible frontline feminist organisations, academics, campaigners, young women who are fighting so hard to give voice to that anger in a very righteous way, that gives me hope.

This is an extract from an interview on the Prospect Podcast. It has been edited for length and clarity.