In the ever-expanding landscape of artificial intelligence (AI), technological marvels continue to redefine the boundaries of what is possible. But as with all powerful tools, AI's potential to do good is often mirrored by its potential to cause immense harm. One such deeply disturbing and insidious application is the rise of AI-driven image manipulation tools that digitally strip clothing from images, usually of women, to produce hyper-realistic deepfake nudes.
What was once the realm of dystopian fiction is now a horrifying reality for thousands across the globe, and India is no exception.
The emergence and spread of undressing AI tools have sparked not only widespread ethical concern but also an urgent debate about digital safety, consent, and the psychological violence inflicted by virtual abuse.
Deepfakes are being misused to create non-consensual explicit content, exposing victims and challenging laws, ethics, and digital safety.
The Rise Of Digital Undressing
There are several platforms that create deepfakes by deploying sophisticated neural networks that can "predict" what a person might look like without clothes.
Users upload an image, usually of a fully clothed woman, and the software renders a synthetic nude version of the person in the image.
While the results may be technically fake, the emotional, psychological, and reputational damage caused to the victims is chillingly real.
These tools, often masked behind seemingly harmless names and user-friendly interfaces, are easily accessible through the internet, mobile apps, and even messaging app groups.
The fact that these apps can operate anonymously adds another layer of complexity, making it difficult to trace perpetrators.
India’s Wake-Up Call: Real-Life Incidents
India, with its vast youth population, increasing smartphone penetration, and growing access to AI tools, has already seen the devastating effects of deepfakes and undressing AI technologies.
Perhaps the most alarming aspect is how this technology is being used not by sophisticated hackers, but by school students and everyday internet users.
In October 2023, Delhi Police arrested a 17-year-old schoolboy for using an undressing AI tool to create fake nude images of his female classmates.
According to the police report, the boy shared the images among WhatsApp and Telegram groups for “fun”, without fully understanding the trauma it could cause.
The victims—aged between 14 and 16—suffered severe emotional distress, with some even refusing to attend school for weeks.
In a similar case from Mumbai in early 2024, a 22-year-old college student created AI-generated nude images of a girl who had rejected his romantic advances.
The manipulated images were circulated online and sent to her relatives, leading to family tension, humiliation, and the girl being forced to leave college temporarily.
While the accused was arrested under BNS sections related to cybercrime and sexual harassment, the incident reflects a disturbing trend: AI is becoming a weapon of digital vengeance.
A similar incident was reported from Lucknow, in which the Special Task Force arrested a man accused of creating nude images of 36 girls using AI and deepfake techniques.
He sourced photos from Instagram profiles and used them to fabricate explicit images, which he then used for blackmail.
Another shocking incident occurred in Guwahati, where a 19-year-old girl found deepfake images of herself being circulated on social media.
The original photo was a harmless selfie posted on Instagram, which had been downloaded and altered without her knowledge.
The images went viral within hours, shared on pages with thousands of followers, with lewd captions and offensive comments.
She later spoke to local news, saying, “The images were fake, but the shame, fear, and pain I felt were absolutely real.”
A Modern Weapon Of Misogyny
The misuse of AI to digitally strip women of their clothes exposes a deep-rooted societal sickness. These tools are not just a technological novelty; they reflect centuries-old misogynistic tendencies now given a 21st-century mask.
In a country already grappling with issues of gender-based violence, consent, and moral policing, undressing AI apps are simply the latest tools of oppression.
While women are overwhelmingly the primary targets of these attacks, there have been cases where men and LGBTQ+ individuals have also been victimised.
However, the disproportionate focus on women highlights how such tools perpetuate and amplify patriarchal control over women’s bodies, turning the digital world into yet another unsafe space.
Legal Grey Areas And Loopholes
Indian law still lacks a dedicated legal framework for handling AI-generated deepfake nudity. Most such cases are now prosecuted under relevant provisions of the Bharatiya Nyaya Sanhita, 2023 (BNS) and the Information Technology Act, 2000.
Such cases are typically booked under BNS provisions covering sexual harassment, stalking, defamation, and criminal intimidation, alongside the IT Act's sections dealing with violation of privacy and the publication of obscene or sexually explicit material.
However, the law is playing catch-up with the pace of technology. Legal experts have repeatedly called for more robust, AI-specific legislation to tackle the new challenges posed by synthetic media.
The Role Of Social Media And Tech Platforms
Social media platforms play a significant role in either amplifying or curbing the spread of such content.
Despite their community guidelines, Facebook, Instagram, and X (formerly Twitter) often fall short when it comes to quickly identifying and taking down deepfake content.
Algorithms designed to promote engaging or shocking content often end up prioritising such malicious uploads.
Telegram, in particular, has emerged as a hotspot for distributing AI-generated nudes. Indian Telegram groups with thousands of members freely share such content, often monetising it by charging access fees.
Despite repeated reports, the platform’s lax moderation policies make it a breeding ground for digital sexual abuse.
Psychological Fallout: A Hidden Epidemic
What often goes unspoken in discussions about deepfake abuse is the long-term psychological trauma that victims endure.
The victims may not have been physically assaulted, but the impact is comparable. Feelings of shame, violation, anxiety, depression, and even suicidal thoughts are common.
Mental health professionals in India are now seeing an uptick in cases related to “digital trauma”. Victims, especially young women, find themselves in a state of constant anxiety—unsure of who has seen the image, where it might appear next, or whether it might affect their education, relationships, or employment.
A 2024 study by the Centre for Internet and Society (CIS), Bangalore, found that over 40% of young women in metros have either experienced or know someone who has been targeted by deepfake or AI nudity tools.
More alarmingly, nearly 60% did not report the incident due to fear of shame or backlash from family and society.
A Call To Action
The threat posed by deepfake tools is not just a technological issue—it is a social and moral crisis. If we are to preserve human dignity in the digital age, India must treat this as a serious offence rather than a novelty or prank.
What Can Be Done?
AI is undoubtedly one of the most transformative technologies of our time. But in the absence of ethics, compassion, and accountability, it can also become one of the most dangerous. The use of AI to generate deepfakes shows us that the fight for women's dignity and safety is far from over; only now, the battlefield has moved online.
As a society, we must rise to the occasion—not just by banning such tools but by addressing the culture that fuels their popularity. The future of India’s digital integrity depends on it.
The Story Mug is a Guwahati-based Blogzine. Here, we believe in doing stories beyond the normal.