We must do everything possible to tackle the devastating crime of image based abuse

Kirith Entwistle ©House of Commons/Roger Harris
Every day, women and girls across the UK feel violated and harassed by strangers and by people in their lives. Their intimate photos are taken and shared online, and they have no way to protect themselves. Or their selfies are run through AI software and fake explicit images are shared. Some women have described the experience as “digital rape”, and I’m worried that our laws and our justice system are not fit to protect them.

Last month I had a really interesting conversation with the End Violence Against Women Coalition. They have a membership of more than 150 specialist women’s organisations, researchers, and experts working on joined-up approaches to end violence against women and girls. I came away wanting to have a much bigger conversation.

What we talked about was the very real abuse being suffered every day by people across the country in the form of Non-consensual Intimate Image (NCII) abuse. That isn’t new; it’s something we’ve been talking about since everyone started carrying phones with cameras. What is new is our understanding of the scale of the problem and its potential to get so much worse.

Let’s use the Revenge Porn Hotline as an example. Since 2015 they have handled more than 24,000 direct cases by phone or email, with 3,500 in this year alone. They also run an online chatbot, which has held 35,000 sessions with victims of NCII abuse. They have helped remove 330,000 images from the internet, but they know of 30,000 reported non-consensual intimate images that remain online because of gaps in the law and international boundaries.

Georgia Harrison is a courageous campaigner who shared her story with the Women and Equalities Committee. Georgia’s images were distributed without her consent, leading to years of harassment, scrutiny and anguish. Even after her abuser was convicted, Georgia continued to see her images circulate online.

That is why I decided to lead a debate in Parliament on image-based abuse. We need to create opportunities to hear the stories of those who have been affected, and we need our lawmakers to have the chance to get their heads around the challenges we face and find ways to protect people. Incremental change is not enough.

It feels like at any time in my life the phrase “the digital world is changing rapidly” has been true, and the pace of change only seems to be accelerating. As exciting as that pace of change can be, if we’re too slow to respond it will pose serious risks. When it comes to protecting women and girls, I’m worried that we are already failing to keep up.

Earlier this year, sexually explicit AI-generated deepfake images of Taylor Swift were spread widely across social media sites like X (Twitter). They were freely available without any filtering or censorship, with nothing to protect the many young people who would be searching her name, and nothing to protect Ms Swift from this horrendous act. Eventually X blocked people from searching her name as a temporary measure, but by then the images had been seen by millions; one post was reported to have been viewed over 47 million times before its eventual removal.

When people think of “deepfakes” they might only think of it as a celebrity problem, but the abuse is already spreading. Only three weeks ago a man from my constituency was sentenced to 18 years in prison for creating child abuse images using AI technology and real pictures of children. People would supply him with photos and pay him to manipulate them into horrendous images of abuse. It is good to see that he has been brought to justice, but this is rare. What happens when AI products become so widespread and easy to use that people don’t need to search the darkest parts of the internet to find the one person willing and able to create these images? They can simply make their own pictures of anyone they like, without consent. Everyone with photos in digital spaces becomes a potential victim of these acts.

I want to be able to tell survivors that this Government are doing everything possible to support them. I want to reassure them that our Ministers are responding in real time to the scale and urgency of the crisis. With every day we delay, more women and girls are thrust into cycles of harm without the protections that they urgently need and deserve.

Kirith Entwistle MP

Kirith Entwistle is the Labour MP for Bolton North East, and was elected in July 2024.