The creation of sexually explicit deepfake content is set to become a criminal offense in England and Wales under a newly announced draft law, which would make it illegal to create such images or videos without the subject’s consent. The move comes amid growing concern over the use of artificial intelligence to exploit and harass women. Sharing such deepfakes without the subject’s consent is already illegal in England and Wales, with jail time a possible consequence for perpetrators. Deepfakes are manipulated images or videos that give the impression that someone has said or done something they have not.

Laura Farris, the UK’s Minister for Victims and Safeguarding, stated that England and Wales would likely be the first jurisdictions worldwide to outlaw the creation of sexually explicit deepfakes. The draft law would cover both pornographic images and nude deepfakes, regardless of whether the subject is depicted engaging in erotic behavior. Scotland and Northern Ireland would need to pass their own legislation for similar rules to apply there. Northern Ireland is exploring options in response to the Ministry of Justice’s proposals, while Scotland has not yet responded to inquiries about introducing similar rules.

The new offense in England and Wales will be introduced through an amendment to the Criminal Justice Bill, currently making its way through parliament. Last year, changes under the Online Safety Act already made it a crime in England and Wales to share deepfake sexual images. The creation of deepfakes has included superimposing women’s faces onto sexually explicit images without their consent, with singer Taylor Swift being a high-profile victim of this practice. In the US, lawmakers have introduced a draft law that would allow victims of sexually explicit deepfakes to sue those who create and share such content without consent.

The European Union has also proposed a directive criminalizing the creation of sexually explicit deepfakes, which, if passed, would require member states to create corresponding national laws. UK Minister Farris emphasized that deepfakes are used to degrade and dehumanize others, especially women, and described the new offense as a clear message that creating such material is immoral, often misogynistic, and criminal. Meta, the owner of Facebook and Instagram, announced that it would review how it handles deepfake pornography after explicit AI-generated images of female public figures circulated on its platforms. The Meta Oversight Board acknowledged that deepfake pornography is a growing cause of gender-based harassment online.
