New Childlight research reveals widespread sexual exploitation risks to children across South Asia

  • Millions report rape or sexual assault before age 18

  • 1,325% rise in AI-generated abuse material

  • Survivors call for greater child protection

New academic research has uncovered a “human tragedy” of sexual exploitation and abuse “behind closed doors”, with millions of children across South Asia at risk.

The Childlight Global Child Safety Institute said that data from the region, while limited, indicates around one in eight children in South Asia report rape or sexual assault before the age of 18.

It found that three of the region’s eight countries (India, Nepal and Sri Lanka) had representative survey data, with 12.5% of children reporting having been raped or sexually assaulted before turning 18 (14.5% of girls and 11.5% of boys). That would equate to about 54 million children in those three countries alone.*

Childlight is a data institute based at the University of Edinburgh in the UK and the University of New South Wales in Australia. It highlights the scale and nature of child sexual abuse globally to build support for greater action to safeguard children.

Its new Into The Light report, published today (Tuesday), also provides fresh insights into the scale of technology-facilitated sexual exploitation and abuse of children online. Among the findings is a 1,325% rise in harmful AI-generated abuse material in the past year, including “deepfakes” that place real children’s faces onto sexual images.

The research is being presented in India this week at a trust and safety summit in New Delhi and at c0c0n in Kerala, Asia’s largest cyber security conference, where Childlight is working with police across India to use technology to identify abusers and safeguard children.

Within South Asia, India, Bangladesh and Pakistan host the highest volumes of child sexual abuse material (CSAM) in the region, as tracked by the global monitoring bodies the National Center for Missing and Exploited Children (NCMEC) and INHOPE. Together, the three countries account for nearly all reports in the region.

In 2024, NCMEC was alerted to 2,252,986 cases of material reported or hosted in India, 1,112,861 cases in Bangladesh and 1,036,608 cases in Pakistan.

Childlight’s report calculates a CSAM availability rate from this INHOPE and NCMEC data and each country’s population size. The highest rate in South Asia was in the Maldives, at 94 cases per 10,000 people, followed by Bangladesh (64.1) and Pakistan (41.3). Bhutan had the fourth-highest rate (41), followed by Afghanistan (28.9), Sri Lanka (27.8) and Nepal (19.4). India had the lowest rate, at 15.5 cases per 10,000 people. (See Table 1 below.)
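
To illustrate the calculation: the rate is the number of reported cases divided by the country’s population, scaled to 10,000 people. Taking India’s 2,252,986 reports against a population of roughly 1.45 billion (an approximate figure used here for illustration, not a number published in the report) gives (2,252,986 ÷ 1,450,000,000) × 10,000 ≈ 15.5 cases per 10,000 people, consistent with the rate above.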

Table 1. Calculated CSAM rate per 10,000 people for countries in the UNICEF-classified region of South Asia, 2023–2024

Country        Rate per 10,000 people
Maldives       94
Bangladesh     64.1
Pakistan       41.3
Bhutan         41
Afghanistan    28.9
Sri Lanka      27.8
Nepal          19.4
India          15.5

Childlight CEO Paul Stanfield said: “Abuse is closer than people think. Globally, we are seeing a human tragedy of millions of young lives devastated by physical and online sexual abuse, which are often closely related.

“This is preventable, and all of us can and must help stop it. When we prevent abuse, we enable healthier lives – better mental health, better physical health, and a greater chance for children to do well in school and in their adult lives, with all the personal, societal and economic gains that brings for everyone.”

Anil Raghuvanshi, founder and president of ChildSafeNet, based in Nepal, said: “The evidence is clear. Children face heightened risks of abuse and exploitation online. Governments and technology companies must act now with concrete safeguards and adequate funding to keep children safe online. Failing to protect children is failing to fulfil their responsibilities. Time is ticking and every second counts. We need action, not promises.”

Saanika Kodial from Mumbai was 14 when she became a victim of digital sexual abuse on social media. Today, she campaigns for Brave Movement to end childhood sexual violence.

She said: "I was an innocent victim of something I had no control over. Survivors are often made to feel embarrassed and guilty by people around them, by the perpetrators, and even by the law. They must understand that speaking up does not make them the ‘bad guy.’ There will always be people who will believe them, relate with them, and empathise with their experiences."

Childlight, which is co-hosting the event, supports robust legislation, proactive CSAM detection and rapid removal by tech companies, as well as educational programmes to support children and those working with them in curbing the problem.

There are large information gaps in South Asia due to a lack of representative survey data, including on abuse committed by perpetrators who are related to their victims. However, Childlight commended India for the breadth and depth of the data it publishes on child sexual exploitation and abuse (CSEA), including police data. This enables stakeholders to monitor patterns, assess responses and identify gaps in protection systems.

Indian statistics point to a rising number of police-recorded CSEA cases, up from 54,359 in 2021 to 64,469 in 2022, as well as an increasing prosecution rate of over 90%. Data for Pakistan show a near doubling of police-recorded CSEA cases, from 1,546 to 2,954.

Stanfield, a former director of INTERPOL, the world’s largest international police organisation, said: “While the scale of this issue is concerning, the visibility of the data offers an opportunity to focus efforts on encouraging reporting and prioritising law enforcement responses.”

Rhiannon-Faye McDonald was abused at the age of 13 after being approached online by a perpetrator posing as a teenager. Soon afterwards, he turned up at her home and abused her in person. Today, she campaigns for better online safety through the Marie Collins Foundation.

She said: “For too long, technology companies have favoured profit over safety. A rising number of children being abused is a direct result. For most victims and survivors, even with the right support, the impacts are significant and long-lasting. We live with misplaced self-blame and the fear of being recognised by those who have seen the images or videos of our abuse. For anybody who believes that it’s ‘just a photo’, this couldn’t be further from the truth.”