17 February 2026

Jess Davies: Grok deepfakes – sexual abuse isn’t a bug, it’s a feature

Image: Yuliyah Taba / Getty Images

Men using images to abuse women didn’t start with Grok – while tech bros are making money out of online harm, women will keep paying the price, says Jess Davies

Content warning: sexual abuse

For three days at the end of 2025, I watched as my feed on X filled with posts from women sharing their anger and dismay. Each time I refreshed I found more hate, more fury, more misogyny targeting women who had found Grok-made images that removed their clothes and put them into sexual scenes without their consent.

The hatred for women oozed from each post. Digitally stripping them. Branding them with sexist slurs. Blaming them for their own abuse. Most hid behind anonymous accounts, their cruelty emboldened by their total impunity.


After 72 hours of watching mass sexual abuse unfolding live with no response from the platform’s owners, on New Year’s Day I decided to step in. As someone who has spent the last five years researching explicit deepfakes and online misogyny, I explained that this was a choice. A choice by the tech bros right at the top who designed, built and deployed these tools. A choice to prioritise engagement over safety. A choice to turn a blind eye.

That’s when the online mob came for me too. 

In 2021, when I began filming a documentary for BBC Three on the rise of explicit deepfakes, their tech safety team warned me that speaking about this issue could make me a target. I would be risking my own safety to raise awareness and protect others. As someone who already knew the harm that image abuse can cause, it was a risk I was willing to take. 

It took almost five years, but their warnings became a reality.

My post on X sparked hundreds of comments from the kind of men I had been observing. They called me a “dumb bitch”, they mocked the idea of consent and they tagged Grok into the thread.

“@grok put her in a cling film bikini,” one user posted, alongside a screenshot of my profile picture.

Within two minutes, the chatbot posted its output.

From a single image of my face, it generated a full-body nude. Fake breasts, fake nipples. Inches of cling film wrapped around my chest and hips – publicly stripping me as naked as the tool’s flimsy safety filters would allow.

The man who prompted Grok understood the loopholes. He knew how to crash through the platform’s fragile guardrails, just as he crashed through my consent.

Another user posted three more deepfakes generated from my profile picture through a different app. Each image escalated the violation – removing my clothes, forcing my tongue out of my mouth and finally adding drips of a white liquid.

Over the years I’ve spoken to dozens of victims about the impact of explicit deepfakes. I’ve researched the forums where this content is shared and traded. I’ve seen hundreds – if not thousands – of women digitally undressed, their bodies forced into sexual positions without consent. I’ve read the graphic requests from men asking strangers to help them deepfake their family members, their teachers, their colleagues.

And yet nothing prepared me for the reality of watching total strangers violate me in real time, on a public platform.

The images were fake. But the loss of power was real. The humiliation was real. The sense of being reduced to a body – something they thought they could manipulate and degrade – was painfully real. And the men who ignored my right to control my own body? They were very, very real. 

And this abuse was a response to me speaking out, so it carried an additional threat. It was a warning. A message to women like me: “This is what happens when you challenge us.” It was a humiliation ritual designed to shame me into silence. Well, it’s a shame I refuse to carry. A silence that I reject.

This was not the first time I had experienced image abuse. 

When I was 15, boys shared intimate images of me around my school and hometown without my consent, an act that changed the course of my life.

What followed was years of men violating me with tech.

A man I was dating took a naked photo of me while I was asleep and shared it in a lads’ WhatsApp group. Men stole other images and posted them on forums where other men rated my body and spread fake sexual rumours. Men uploaded them to escort sites on pages that said I was available for “rape roleplay”. Men used them to extort money from others. Endless unsolicited penis pictures in my DMs. The list goes on, and on, and on.

For five years, I’ve been investigating all these harms and the new harms caused by AI tools that let men claim ownership over women’s bodies without ever meeting them. Five years of campaigning, petitions and interviews. Of writing, speaking, and meeting survivors, experts and government officials.

And for the last five years, both the government and big tech have failed to act. The abuse I suffered was not inevitable; it was a direct consequence of their delay.

If it were illegal to create explicit deepfakes, would I still have been targeted? If Ofcom’s guidance were mandatory, would X have generated millions of intimate images without consent? Would the men targeting me have acted with such impunity?

I will never know.

But while the government drags its feet, it is women who absorb the cost of AI innovation. Each tech invention hailed as a miracle inevitably becomes a nightmare for women. 

The Grok scandal may have catapulted AI abuse into the mainstream, but the harm did not begin with this chatbot, and it won’t end with one company making small changes to its online tools.

AI has been weaponised against women on open forums, fringe sites, encrypted groups and on mainstream platforms for years. With technology constantly evolving, what were once crude, obvious fakes have now become hyper-realistic, high-definition videos.

The rapid deployment of generative AI has consistently outpaced guardrails, especially when it comes to protecting women and girls from sexual exploitation. And while governments keep failing to hold big tech to account, people will keep making a quick buck off women’s bodies. A tale as old as time, repackaged for a digital revolution. 

What happened to me wasn’t a bug. It was a feature.

Non-consensual explicit deepfakes aren’t just a niche tech problem; they are the latest incarnation of the ugly misogyny you can find in every corner of the internet. And as long as tech bros can make money from making them and nobody faces any consequences, women will keep paying the price.
