AI-Generated Fake Nudes Crisis Exposes Legal System's Shortcomings

Artificial intelligence (AI) tools capable of creating realistic fake nude images are fueling an increasingly widespread problem, with victims and experts warning that current laws are inadequate to address the growing threat.

Recent data shows the scale of the issue is expanding rapidly, with one major website dedicated to AI-generated nude images receiving approximately 14 million monthly visits. A survey by Internet Matters found that 13% of teenagers have already encountered nude deepfakes.

Former Love Island contestant Cally Jane Beech experienced this firsthand when her professional underwear modeling photo was transformed into an explicit nude image using AI software. "It looked so realistic, like nobody but me would know," Beech told reporters. When she reported the incident to police, she found they were ill-equipped to handle the case.

The problem is particularly acute in schools. A recent survey of teachers found that 7% had dealt with incidents of students creating fake sexually explicit images of classmates in the past year. The NSPCC warns these images are being used for grooming, blackmail, and bullying.

Law enforcement officials acknowledge serious gaps in their ability to respond. Assistant Chief Constable Samantha Miller told MPs that "the system is failing," noting that of 450 victims, only two reported a positive experience with how police handled their cases.

The UK government has promised new legislation in 2024 to criminalize the generation of AI nudes, but victims say more comprehensive laws are needed: while sharing such images may already be illegal, requesting their creation is not.

One victim, identified as "Jodie," discovered her social media photos had been manipulated into explicit content and posted on pornographic websites. Despite helping secure a conviction in her case, the perpetrator received only a suspended sentence and minimal fines.

Professor Clare McGlynn, an expert in online harms, points to the troubling normalization of these technologies: "These nudify apps are easy to get from the app store, they're advertised on TikTok. We've normalized the use of these nudify apps."

As victims await stronger legal protections, concerns remain about whether upcoming legislation will effectively address image removal and the solicitation of fake nude content. With AI technology becoming more accessible and sophisticated, advocates stress the urgent need for robust legal frameworks to combat this growing form of digital abuse.