People who make deepfake sex videos to humiliate others should be convicted of a sex crime and placed on the sex offenders register, former culture secretary Maria Miller said.
The Tory MP said the trauma suffered by victims who have explicit fake digital images made of them was so severe that it needed to be treated the same as offline sex offences.
Ms Miller also expressed concern over the spread of “deepfake” technology, which allows users to edit photos so that people appear to be naked or performing sexual acts.
Her call comes as deepfake technology becomes so advanced that tech giants such as Adobe – the company behind the popular Photoshop editing software – have launched a website where people can check whether photographs online have been manipulated.
“Deeply distressing for anyone”
Speaking in an interview with The Sunday Telegraph, Ms Miller said: “When victims talk about the impact it has on them, they compare it to sexual abuse in the offline world, and the fact that the law does not treat it that way shows it is an important problem to be solved.
“The fact that it’s not clearly against the law means more people are likely to look into this and potentially get involved unless we have the kind of penalties that are needed.”
While sharing someone’s intimate images online, sometimes referred to as “revenge porn”, is a crime, creating and posting deepfake versions is a legal grey area.
The Law Commission recommended earlier this year that the government strengthen the law on the abuse of intimate images, including digitally manipulated images.
Ms Miller said she wanted to go one step further and make it a sex crime to create and distribute explicit deepfake images, or to use nudification software to create nude images.
This would mean that those convicted would have to sign the sex offenders register, as is required for any sex offence committed against a child or an adult.
Widely targeted at women
Ms Miller said: “Having a sexually explicit image of yourself broadcast around the world without your consent is deeply distressing for anyone.
“What I’m asking for is something that will stop the perpetual catch-up we have to play with the tech industry, where the law is lagging behind the imaginations of IT entrepreneurs who seem to want to cash in on the humiliation of, in particular, women.”
Ms Miller, who served as David Cameron’s Culture Secretary and Minister for Women and Equalities between 2012 and 2014, was first alarmed at the rise in intimate image abuse after hearing of traumatic cases from constituents.
In recent years, she has become a champion of strengthening the law on the practice, warning that it overwhelmingly targets women.
Another trauma that victims of deepfake abuse face is the struggle to have the image removed from the internet.
Victims often have to ask a patchwork of tech companies to remove the material, knowing that copies can be reposted at any time.
Fines that amount to billions
However, the government is currently drafting new duty of care laws, which The Telegraph has been campaigning for since 2018, meaning tech companies will have to remove illegal material from their sites.
If explicit deepfakes are criminalized, social media giants could face fines of billions if they don’t quickly remove these images.
Ms Miller said she also wanted some of the money the Treasury is expected to make from the new regime, which will be overseen by Ofcom, to go to supporting victims of intimate image abuse.
She added: “What I would suggest is that some of this money does not go directly to the Treasury, but goes directly to charities that do an incredible job supporting victims.”