Tuesday, Dec 26, 2023

As fake nudes become latest AI headache, experts call for regulation and awareness

Deepfake AI Technology: Hundreds of “undressing” websites have sprung up over the past few months, along with, in some cases, mobile apps distributed as Android Application Packages (APKs) that can be installed outside the Google Play Store.

Deepfake AI. (Credit: AI-generated image using Microsoft Bing)

The small Spanish town of Almendralejo was shaken when dozens of schoolgirls reported that their nude images, generated using an “undress” app employing Artificial Intelligence, were circulated and shared on everyone’s phones at school. 

In another case, a New Jersey high school made headlines when a student created deepfake pornographic images of his female classmates, again using AI. 

Though they happened thousands of miles apart, at roughly the same time, both cases were connected by the abuse of AI to create deepfakes of unsuspecting victims. In September 2023 alone, 24 million users visited fake-nude websites, according to a study by social media analytics firm Graphika. And while hundreds of such “undressing” websites have sprung up over the past few months, search engines like Google have, sadly, not restricted access to them in any way. 


Most of these sites (and, in some cases, mobile apps distributed as APKs installable outside the Google Play Store) generate a nude from any photo uploaded to them, some so believable that it is hard to tell they are synthetic. Deepfakes are not a new problem, but new-age artificial intelligence tools are making it easier for anyone to produce AI-generated nude images of ordinary people. This easy accessibility makes everyone vulnerable, especially women and minors, who are easy targets of deepfake AI porn. 

“There is no sensitivity because you are not doing it manually. You’re just technically doing it or using the technology. So all those emotions attached to doing an act are lost. I think these [AI] are making it easy to commit a crime,” psychologist Anjali Mahalke said in an interview. “When people create any sort of pornography or commit a crime, there is a shame. Underneath them, there is a struggle… some sort of trauma, or it’s mostly rooted in shame, and that shame becomes like a narcissistic wound. In the psyche, there is no guilt, remorse, or any of those negative spectrum of emotions,” she continued. 


Even more problematic is the fact that these websites operate as digital products, pushed to consumers in the hope of acquiring users and then monetising them through extra services. Several AI-generated deepfake nude image websites use a freemium model: users can initially generate pictures for free, after which they must purchase extra credits to access advanced features (for instance, age and body customisations), paying through mainstream platforms like PayPal as well as cryptocurrency platforms like Coinbase. 

These websites claim they exist for “entertainment purposes and not aimed to insult anyone”. Mahalke countered this claim: “90% of deepfake content is about pornography, and 99% of the time, it involves women. It’s quite concerning how women are mostly the subjects of such content.” 


While listicles and how-to guides about this new technology are easy to find, the fact remains that search engines like Google and Bing have started indexing such pages rather than filtering them out, despite the obvious issues with the technology being accessible to everyone. Google did not comment when asked for a response. 

“Essentially, these are websites. You have the takedown provision under the Information Technology Act and rules to direct the Internet Service Provider (ISP) and Telecom Service Provider (TSP) to do so,” said Abhishek Malhotra, Managing Partner at TMT Law Practice. “Google may now be hesitant to do so; either they will do it, or they will say, ‘Go and get an order from a court of law.'”

While there are obvious advantages to AI technologies being accessible to everyone, there is a need to prevent easy access to certain aspects of this new technology, especially where there is a clear case of misuse. The potential for abuse is greater with AI-generated deepfake nudes because all it takes is one photo of a person, plus a website or mobile application, to morph someone’s image. The fact remains that every image created using a so-called “nudify tool” is potentially a criminal act infringing upon someone else’s dignity and privacy. Shockingly, some of the websites themselves label their services as “non-consensual intimate imagery”. 


“At the end of the day, there is an entity in the background, a company or an individual that has created this AI and given it a particular purpose—the algorithm that will be fed into the AI and the language models (LLMs) that it will be reading. The language material, where AI is ingesting information to perform its intended function, will contain learning modules for the AI to execute tasks such as morphing of photographs. So, the person who has written the algorithm or fed the LLMs to the AI, or the entity involved, is responsible for the activity that is taking place,” said Malhotra. 

Fake nudes existed even before AI images became popular, but now it takes no technical skill to create one. Across the world, even minors are accessing these websites for fun, ending up with images that could be used to threaten or harass someone. These fakes are most often created from regular images pulled from people’s social media posts, and could end up back on those same platforms, with disastrous results. To make matters worse, social media platforms have not put in place technologies to flag such content, compounding their broader failure to police explicit material.  

Mahalke said 98% of these deepfake websites have a dedicated purpose. “So, if they are dedicated sites, then certainly the government can establish mechanisms.” She also suggested that schools could help create awareness about the perils of this new technology. 

It is time for the world to wake up to this new menace and enact strong laws that plug the entire pipeline for AI-generated nudes that are created and shared without consent. Labeling and watermarking AI-generated content, to help differentiate authentic material from that made by software, might be a start. The onus is also on big tech companies such as Meta and Google to add deepfake detectors to their platforms so that people cannot upload sexually explicit photos and videos. Legal experts are also demanding the regulation of AI and the formation of laws that protect those whose photos or videos were used to create sexually explicit content shared online. 
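To illustrate how metadata-based labeling could feed such detectors, here is a minimal sketch in Python (using the Pillow imaging library). It assumes AI tools embed a provenance label in the image file's metadata; the marker key names below are hypothetical illustrations, and production systems would instead verify cryptographically signed manifests under standards like C2PA rather than trust plain metadata, which can be stripped or forged.

```python
# Sketch: flag images that carry an AI-provenance label in their metadata.
# Assumption: the generator wrote a marker key into the file (hypothetical
# key names below; real provenance uses signed C2PA manifests).
from PIL import Image, PngImagePlugin

AI_MARKER_KEYS = {"ai_generated", "c2pa_manifest", "digitalsourcetype"}

def looks_ai_labeled(path: str) -> bool:
    """Return True if the image's metadata contains a known AI marker key."""
    with Image.open(path) as img:
        keys = {k.lower() for k in img.info}  # PNG text chunks and similar
    return bool(keys & AI_MARKER_KEYS)

# Write a tiny PNG carrying such a label, then check it.
meta = PngImagePlugin.PngInfo()
meta.add_text("ai_generated", "true")
Image.new("RGB", (8, 8)).save("labeled.png", pnginfo=meta)
print(looks_ai_labeled("labeled.png"))
```

A check like this only works if the label survives re-encoding and screenshots, which is exactly why experts push for robust watermarking and platform-side detection rather than relying on removable metadata alone.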

“We do have legal regulations, especially in the POCSO Act. Sections 13 to 16 address the punishment of any person, regardless of how they spread child pornography or whether the actual child is present in the pornographic act. We have covered it in child laws, but we have not covered it in adult laws,” explained Mahalke. That said, Mahalke pointed out, there is only a 1% conviction rate in child pornography cases.

First published on: 21-12-2023 at 10:00 IST