
Beyond POCSO: The Legal Vacuum in Protecting Young Women from AI-Driven Exploitation

A sharp legal analysis of AI-enabled exploitation and regulatory gaps in India — By Advocate Aishwarya Srivastava
Indian Masterminds Stories

The architecture of harm in the digital age has shifted from physical proximity to algorithmic reach. Artificial intelligence now enables the creation of hyper-realistic images, voices, and videos that can be deployed to exploit, extort, and silence. Young women, particularly those in the 18 to 25 age group, occupy a precarious space in this ecosystem. They are legally adults, yet socially and psychologically vulnerable, and therefore fall outside the protective canopy of child-centric legislation such as the Protection of Children from Sexual Offences Act, 2012, even as they remain prime targets of technology-enabled abuse.

Indian law has responded to digital exploitation through a layered but fragmented framework. The POCSO Act marked a decisive intervention by criminalising sexual offences against minors, including those facilitated through digital means. Section 15 penalises storage and dissemination of child sexual abuse material, and the Supreme Court has clarified that even viewing or possessing such material attracts penal consequences. In Just Rights for Children Alliance v. S. Harish (2024), the Court held that digital possession of child sexual abuse material is not a passive act but one that perpetuates exploitation, thereby attracting criminal liability. The Act’s definitional breadth includes computer-generated imagery involving minors, indicating legislative foresight in addressing evolving technological threats.

Parallel provisions exist under the Information Technology Act, 2000. Sections 67 and 67A criminalise the publication and transmission of obscene and sexually explicit material in electronic form. Section 67B specifically addresses child sexual abuse material. Sections 66C and 66D deal with identity theft and impersonation, while Section 66E addresses violations of privacy in relation to images of private areas. These provisions, though significant, are technology-neutral. They were not designed with artificial intelligence in mind and therefore operate through judicial interpretation rather than explicit statutory design.

The Bharatiya Nyaya Sanhita, 2023, continues to criminalise the circulation of obscene material, including in digital contexts, but does not fundamentally alter the legal position regarding AI-generated exploitation. General penal provisions relating to voyeurism, stalking, intimidation, and defamation supplement this framework, yet they remain reactive and case-specific. The constitutional guarantee of dignity under Article 21, as interpreted in Justice K.S. Puttaswamy v. Union of India (2017), provides a normative foundation for protecting informational privacy and bodily autonomy in digital spaces. However, constitutional principles require statutory articulation for effective enforcement.

The central legal vacuum lies in the absence of a dedicated framework addressing AI-generated sexual content and synthetic identity manipulation. Deepfake pornography, non-consensual image morphing, and AI-driven sextortion are prosecuted by stretching existing provisions. This creates doctrinal uncertainty and inconsistent enforcement. The protective shield of POCSO ceases at the age of eighteen, creating a legal cliff for young women who remain equally susceptible to exploitation. Indian law also lacks a clear recognition of non-consensual image-based abuse as a distinct offence, leading to fragmented prosecution under obscenity, privacy, or defamation provisions.

The vulnerability of young women is not incidental but structurally embedded. The digital economy incentivises visibility. Social media platforms encourage the sharing of images, videos, and personal information, which become raw material for AI-based manipulation. Grooming techniques have evolved into sophisticated psychological operations that exploit emotional dependence, validation seeking, and fear of social stigma. The reluctance to report such crimes is amplified by reputational concerns and the slow pace of investigative processes. Law enforcement agencies, though increasingly equipped, still face challenges in digital forensics, cross-border evidence collection, and platform cooperation.

Patterns of AI-enabled crime reveal a transition from isolated offenders to organised digital networks. Synthetic identities are used to establish trust with victims before deploying extortion tactics. Real-time manipulation technologies can superimpose faces onto video streams, creating false but convincing visual evidence. Automated sextortion networks deploy bots to target thousands of individuals simultaneously. Dark web marketplaces facilitate the trade of both real and AI-generated exploitative material. These developments blur the distinction between reality and fabrication, making evidentiary standards more complex and enforcement more challenging.

Comparative jurisdictions have begun to respond with greater specificity. Several states in the United States have enacted laws criminalising the creation and distribution of deepfake pornography without consent. The United Kingdom has expanded the scope of its Online Safety framework to include harmful synthetic content. The European Union’s evolving regulatory approach to artificial intelligence emphasises transparency, accountability, and risk classification. These developments underscore the need for India to move beyond interpretative application of existing laws and adopt a more explicit legislative approach.

A coherent response requires a multi-layered reform strategy. First, there is an urgent need for a dedicated statutory framework addressing digital sexual exploitation. Such a law must define deepfakes and synthetic media, criminalise their malicious creation and dissemination, and provide civil and criminal remedies tailored to the nature of harm. Second, the protective principles underlying POCSO should be extended through a calibrated framework to cover young adults who remain vulnerable in digital environments. Third, non-consensual image-based abuse must be recognised as a standalone offence, with clear elements of consent, intent, and harm.

Platform accountability must be strengthened through enforceable obligations. Intermediaries should be required to deploy AI-based detection tools, implement robust grievance redressal mechanisms, and ensure time-bound removal of harmful content. The due diligence obligations under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, need to be supplemented with proactive monitoring requirements for high-risk categories of content. At the same time, safeguards must be built to prevent overreach and protect freedom of expression under Article 19(1)(a), subject to reasonable restrictions under Article 19(2).

Digital forensics and investigative capacity must be enhanced. Specialised cyber units, trained in AI-related evidence, are essential. Evidence preservation protocols must be standardised to ensure admissibility under Section 65B of the Indian Evidence Act, 1872. Cross-border cooperation mechanisms, including Mutual Legal Assistance Treaties, must be streamlined to address jurisdictional challenges. Victim protection frameworks must prioritise confidentiality, psychological support, and legal assistance, recognising the long-term impact of digital sexual exploitation.

Regulation of artificial intelligence itself is an integral part of the solution. Mandatory watermarking of AI-generated content, traceability mechanisms, and accountability standards for developers can create deterrence without stifling innovation. Ethical guidelines must be translated into enforceable norms, ensuring that technological advancement does not come at the cost of human dignity.

The law has historically evolved in response to social change, but the pace of technological transformation demands anticipatory governance. AI-driven exploitation is not merely a legal issue; it is a challenge to the very notion of identity, consent, and autonomy. Young women, situated at the intersection of visibility and vulnerability, bear the brunt of this transformation.

POCSO remains a landmark in protecting children, but its protective logic cannot remain confined within rigid age boundaries in a borderless digital world. The task ahead is to construct a legal framework that recognises the continuum of vulnerability and responds with clarity, precision, and compassion. The legitimacy of the legal system will ultimately be measured by its ability to protect those who are most exposed to harm in spaces that are increasingly invisible yet profoundly real.
