Sharing Images Without Consent UK: Your Legal Rights Explained (2025 Guide)

Sharing images without consent in the UK is now a priority offence under the Online Safety Act, recognising the profound impact this form of abuse has on survivors’ safety and wellbeing. Since 31st January 2024, the law has specifically made sharing AI-generated intimate images without consent (deepfakes) illegal. This legislative change acknowledges a disturbing reality: the most commonly shared deepfake pictures on the internet are non-consensual sexual depictions of women.

Even though these images themselves may not be real, the harm they cause certainly is. Unfortunately, despite stronger legal protections, charging rates remain “pitifully low” in such cases, largely because the offence is relatively new and complex and police are still failing to investigate it properly or collect sufficient evidence. Women consistently report experiencing more harm online, and greater concern about the internet, than men, and the rise of generative AI has made it easier than ever to create fake images of women and girls as a tool of abuse.

In this comprehensive guide, you’ll learn exactly what constitutes image-sharing without consent under UK law, your legal rights as of 2025, the challenges in enforcement, and the practical steps to take if you’ve been affected. Whether you are dealing with real photos shared without permission or AI-generated images, understanding the law is essential to protecting yourself and seeking justice.

Intimate Image Sharing and Tech-Facilitated Coercive Control

Digital abuse has become a significant factor within coercive control. One area that continues to escalate is the use of intimate images to threaten, coerce, or manipulate a partner. This behaviour is not about sexual gratification. It is a tactic used to create fear, compliance, and dependence.

How this appears in abusive relationships

Perpetrators may:

  • Pressure someone to send intimate images, often using emotional blackmail.
  • Record or photograph intimate moments without consent.
  • Threaten to share images with friends, family, colleagues, or online audiences.
  • Use the fear of exposure to stop a victim from leaving the relationship.
  • Release images to cause humiliation or social isolation.

This behaviour is recognised as tech-facilitated coercive control because technology extends the reach of abuse and removes safe spaces for the victim.

Impact on victims

Victims often describe:

  • Fear and anxiety about exposure.
  • A sense of losing control over their body and identity.
  • Shame and distress linked to the risk of public or workplace discovery.
  • Isolation from social circles and online communities.
  • Ongoing manipulation long after the relationship ends due to continued threats.

The Online Safety Act 2023 made sharing intimate images without consent a criminal offence in the UK, including deepfake content. However, reporting routes, enforcement, and support structures remain inconsistent. This reinforces the need for employers to understand that digital abuse can directly affect employee wellbeing, performance, and attendance.

Why workplaces must be aware

Tech-facilitated abuse often follows the victim into work. Worry about exposure, the risk of colleagues receiving manipulated content, or fears that an employer may discover the images can create overwhelming distress. This makes workplace awareness essential.

Domestic Abuse Education supports organisations to recognise these behaviours as part of modern patterns of coercive control. Our CPD-accredited training covers digital abuse, online image threats, AI-generated content, and the psychological impact of these tactics.
Sharon’s Policy also includes guidance on tech-facilitated abuse, with practical employer steps for confidentiality, reporting routes, and trauma-informed response.

What counts as sharing images without consent in the UK

Under UK law, the definition of what constitutes sharing images without consent has expanded considerably. At DAE, we regularly see cases where survivors are unsure whether their experiences amount to image-based abuse.

The law defines “intimate images” as materials showing:

  • Someone in a sexual act or doing something sexual
  • Exposed genitals, buttocks or breasts (including when visible through wet/transparent clothing or covered only by underwear)
  • Someone during urination, defecation, or personal care related to these acts
  • A person breastfeeding

Furthermore, these protections cover various formats, including digital images, screenshots, printed photos, and electronically stored files. Notably, the law also addresses digitally manipulated content, including “deepfakes” where AI technology creates non-consensual sexual imagery.

Research reveals alarming statistics: 1 in 14 adults in England and Wales have experienced threats to have their intimate images shared, with women particularly vulnerable (1 in 7 women aged 18-34).

Additionally, the law now recognises previously overlooked abuses such as “downblousing” (images taken looking down a person’s top) and “upskirting” (images taken underneath a person’s clothing).

The crucial factor is consent: sharing intimate images without permission constitutes a criminal offence regardless of whether they were originally taken consensually or whether the relationship has since ended.

What the law says in 2025: Your rights explained

The legal landscape for image-based abuse has been completely transformed by the Online Safety Act 2023, giving you stronger protections than ever before.

In January 2024, the Act introduced several distinct offences, all now fully in force in 2025:

  1. A “base” offence of sharing intimate images without consent (maximum 6 months imprisonment)
  2. Sharing intimate photos with the intent to cause alarm, distress or humiliation (up to 2 years imprisonment)
  3. Sharing for sexual gratification (up to 2 years imprisonment)
  4. Threatening to share intimate images

Importantly, these laws protect you regardless of whether the image is real, altered, or entirely AI-generated “deepfake” content. Moreover, for threats to share images, prosecutors don’t need to prove the image actually exists.

Offenders may also face registration as sex offenders, and 2025 has brought further strengthening of enforcement. In September 2025, cyberflashing was made a “priority offence”, compelling tech platforms to implement proactive measures such as automated detection tools.

For online platforms, the stakes are high. Under Ofcom’s enforcement powers, companies failing to protect users face fines up to 10% of their worldwide revenue or potential UK service blocks. This represents a fundamental shift from reactive to preventative approaches.

We’re now starting to see real impact from these changes. The first successful cyberflashing prosecution occurred in February 2024 against a registered sex offender who sent explicit images to both a 15-year-old girl and a woman in her 60s.

Challenges in enforcement and what needs to change

Despite stronger laws, the enforcement of intimate image abuse remains problematic across the UK. Most concerning is the extremely low charging rate: only 4% of cases reported to the police result in the perpetrator being charged. At DAE, we frequently hear from survivors who encounter significant barriers when seeking justice.

Police responses often prove inadequate, with officers lacking understanding of tech-facilitated abuse. Many survivors describe police failing to gather digital evidence or to apply the law effectively.

Another critical issue involves non-compliant platforms hosting abusive content. While the Revenge Porn Helpline achieves approximately 90% success in content removal, the remaining 10%, often hosted on overseas sites, can remain online indefinitely, causing ongoing trauma for survivors.

The immediate priority for victims is the urgent removal of content. As one survivor explained: “It is really like a house fire, the quicker you can put it out, the quicker you can stop it”. Unfortunately, crucial support services like the Revenge Porn Helpline face funding challenges.

Looking forward, meaningful improvement requires:

  • Consistent trauma-informed training across all sections of the criminal justice system
  • Adequate resources for police to investigate online crimes and gather digital evidence efficiently
  • Powers to compel perpetrators to delete non-consensual images from their devices
  • Long-term funding increases for specialist support services to meet growing demand

Conclusion

Understanding your rights regarding image sharing without consent proves essential in today’s digital world. The UK’s strengthened legal framework through the Online Safety Act now offers significant protections against all forms of intimate image abuse, whether involving real photographs or AI-generated deepfakes. Nevertheless, the gap between legal protection and practical enforcement remains troublingly wide.

Despite these comprehensive laws, many survivors still face an uphill battle when seeking justice. Low conviction rates, inadequate police responses, and persistent victim-blaming attitudes create significant barriers. Additionally, even when content removal succeeds on mainstream platforms, overseas websites may continue hosting abusive images indefinitely, causing ongoing trauma for those affected.

The path forward requires both individual awareness and systemic change. First and foremost, knowing your specific rights empowers you to take appropriate action if you become a victim. Equally important, pressure must continue for better-trained police officers, improved digital evidence collection, and long-term funding for specialist support services like the Revenge Porn Helpline.

Though progress has been made, true protection requires vigilance from all corners – lawmakers, law enforcement, technology companies, and society at large. The expanded legal definitions and increased penalties represent significant steps forward, yet much work remains. Until enforcement matches the strength of the law, survivors will continue needing comprehensive support and advocacy from organisations like DAE as they navigate the aftermath of this profoundly violating form of abuse.

FAQs

Q1. Is it illegal to share someone’s photos without their permission in the UK? If the images are intimate, yes. Sharing intimate images or videos of someone without their consent is illegal in the UK. This includes uploading them to websites, sending them to others, or threatening to do so. The law covers various types of intimate content, including sexual images and those showing exposed private body parts.

Q2. What are the penalties for sharing intimate images without consent in the UK? The penalties vary depending on the specific offence. The base offence carries a maximum sentence of 6 months imprisonment. However, if the sharing is done with intent to cause alarm, distress, humiliation, or for sexual gratification, the maximum sentence increases to 2 years imprisonment. Offenders may also face registration as sex offenders.

Q3. Are AI-generated intimate images (deepfakes) covered by UK law? Yes, as of January 2024, UK law specifically addresses AI-generated intimate images or ‘deepfakes’. Sharing these without consent is now illegal, recognising that while the images themselves may not be real, the harm they cause is significant.

Q4. What should I do if someone threatens to share my intimate images? Threatening to share intimate images is now a distinct offence under UK law. You should report this to the police immediately. Importantly, prosecutors don’t need to prove that the image actually exists to pursue a case against the person making the threat.

Q5. How effective is law enforcement in dealing with image-based abuse cases? Unfortunately, despite stronger legal protections, enforcement remains challenging. Only about 4% of reported cases result in charges against perpetrators. This is often due to inadequate police responses, difficulties in gathering digital evidence, and a lack of understanding about tech-facilitated abuse. However, efforts are being made to improve training and resources for investigating these crimes.
