The use of artificial intelligence to generate fake nude images is a growing concern, and YouTube has become a popular platform for promoting such tools. Forbes found more than 100 videos on YouTube, attracting millions of views, that advertised AI apps and websites able to remove clothes from images of women. Some of these videos provided tutorials on how to use the apps, and there were reports of high school students using them to generate nudes of their female classmates, leading to bullying and public shaming.

One website showcased in these videos was cited in court documents in a case in which a child psychiatrist was sentenced to 40 years in prison for using AI to create child sexual abuse images. The psychiatrist was accused of using the tool to remove the clothes from images of his high school girlfriend taken when she was underage. The victim testified in court about the horror of having innocent pictures of herself twisted, without her consent, for illegal and disgusting purposes.

Google’s AI nudifier problem is not limited to YouTube: Forbes also identified three Android apps offering to remove clothes from photos. These apps had millions of downloads and allowed users to generate fake nude images easily. There were also advertisements promoting “deep nude” services, some openly offering to create fake nude photos of celebrities such as Taylor Swift. After Forbes raised concerns that these apps and ads violated Google policies, Google removed the ads and the channels promoting these services.

The National Center on Sexual Exploitation (NCOSE) criticized Google for profiting from these nudify apps by accepting advertising money from their developers and taking a cut of ad revenue. NCOSE pointed out that Apple had been quick to remove nudifier apps from the App Store, while Google had been slow to address the issue. The rise of AI-generated deepfake porn, including of children, is a major concern: the National Center for Missing and Exploited Children received thousands of reports of AI-generated child sexual abuse material over the past year.

The use of AI to create fake nude images has real-life consequences for victims, as seen in the case of the convicted child psychiatrist who used AI to undress his victims in their childhood photos. Multiple victims testified in court about the ongoing trauma caused by these AI-generated images, expressing fear that the images could be accessed by pedophiles and others. They emphasized the need for responsible practices and policies around the spread of image-based sexual abuse, with calls for Google to take action on this growing issue.
