
Urgent legal changes needed to protect victims of revenge porn and image abuse, academics say

Victim says she attempted suicide after being told police could do nothing

Lizzie Dearden
Home Affairs Correspondent
Monday 01 July 2019 05:43 EDT
Some victims became 'obsessed' with checking the internet for new posts (Getty/iStock)

Urgent legal changes are needed to protect the victims of revenge porn and other image-based sexual abuse, victims have said.

One woman told how she attempted to kill herself after being coerced into making explicit videos that were shared across the internet, then told that police could not help.

Others said they were “living in fear” over the possibility that loved ones and work colleagues would see images. They said they isolated themselves or obsessively checked the internet to see what had been posted and where.

An academic report found that despite the creation of laws making revenge porn and upskirting illegal, police were being hampered by restrictions and emerging offences were not covered.

For example, there are no specific laws to tackle “cyberflashing”, where victims are sent unsolicited sexual photos, and “deepfake” pornography that sees victims’ faces added to explicit videos.

A two-year review of existing legislation was started by the Law Commission last week, but academics said the proposals were “not enough” to tackle wide-ranging abuses.

“While it’s welcome the government has recognised the need for comprehensive law reform, for women and men being victimised right now, this is justice delayed,” said author Clare McGlynn, a Durham University professor. “Image-based sexual abuse shatters lives.”

The report found that “immediate legal changes” were needed to grant victims anonymity and enable them to report offences without fear of drawing more harassment.

The study, which was co-authored by Professor Erika Rackley, from the University of Kent, and Dr Kelly Johnson of Durham University, called for image-based abuses to be reclassified as sexual offences.

It warned that current laws were a “patchwork of piecemeal and out-of-date provisions with many significant gaps”, with some requiring proof of sexual gratification or distress as motives for a charge.

Academics said many police forces were providing an “inadequate” response because of the lack of appropriate laws, not viewing the offences as a priority or having insufficient resources to investigate them fully.

A BBC investigation found that only 7 per cent of cases involving the non-consensual distribution of images reported to police last year resulted in a charge, and that one third of victims withdrew their support for prosecution.

Victims who gave accounts to the academic study reported being told that it was “too difficult” to investigate their case or prove an offence, that there were not enough resources, that police did not have the relevant technology, or that the abuse was not covered by the law.

Some officers resorted to informal measures, calling perpetrators and warning them to delete images and posts but leaving them without a formal reprimand or criminal record, the report said.

It called current laws “inconsistent, outdated and confusing”, with different thresholds for prosecution.


The revenge porn law introduced in England and Wales in 2016 made it an offence to share a private, sexual image without consent, but only if it can be proven that the perpetrator intended to cause distress to the victim.

The new upskirting law also requires a motive of distress or sexual gratification, leaving room for claims that photos were taken as a joke or for financial gain to be used as a defence.

Anna*, whose abusive former partner shared explicit videos online after they broke up, said police “weren’t interested” when she contacted them and told her it would be too difficult to take the videos down.

She hired a lawyer to continue the efforts but the videos “kept re-appearing”, forcing her to change her appearance to avoid being recognised.

“I couldn’t go out and I couldn’t go to school to pick up my children, I couldn’t go shopping. I was completely withdrawn from the world,” Anna said. “I attempted suicide at one time … it’s just something that’s never ending.”

Academics found the term “revenge porn” reductive, with their research showing a “strong relationship between image-based sexual abuse, domestic abuse and coercive control”.

The report called for better education in schools, universities and workplaces to stop victims of image-based sexual abuse being punished or stigmatised.

It also found that while most social media and internet companies have processes to remove harmful images, they are “often slow and complicated”.

Academics called for the government to create an Office for Online Safety to provide assistance for victims wanting to remove images of themselves, alongside specialist emotional and legal support.

Professor Rackley said image-based sexual abuse was “constant, ongoing and relentless”.

She added: “It not only shatters their lives, but also the lives of those who love and support them.”

Maria Miller, chair of the Women and Equalities Select Committee, told the Press Association the law needed to catch up with technology.

She said: “What this report does is provide even more evidence of the impact that image-based sexual abuse can have on individuals.

“The law is supposed to treat crimes that happen online in the same way it would treat that offence happening on the street, say, so one hopes that the government is taking note that things need to change. There needs to be a specific offence for image-based sexual abuse.”

A government spokesperson said: “We are completely committed to ensuring the internet is as safe as it can be for UK citizens and to clamp down on online abuse.

“We are clear that if it's unacceptable offline, it's unacceptable online. This is why we’ve set out world-leading plans in our White Paper to put a new duty of care on online platforms towards their users, overseen by an independent regulator with teeth.

“On top of that we have asked the Law Commission to review the current law on image-based sexual abuse to ensure it is fit for purpose.”

*names have been changed to protect anonymity
