
Big numbers make headlines, but without context, they can distort the truth.
As someone who has worked at the intersection of digital safety and child protection for over a decade, I believe the latest CyberTipline data from the National Center for Missing & Exploited Children (NCMEC) demands both attention and caution. India has been flagged as the highest reporter of suspected child sexual abuse material (CSAM) in 2024, with over 2.25 million reports, more than any other country in the world.
At face value, these numbers are horrifying. The real story is far more complex — and that’s exactly why we need more rigorous, context-driven research before forming conclusions or designing policy responses.
First, these reports are not generated by governments or law enforcement. They come primarily from U.S.-based electronic service providers (ESPs), which are legally obligated under 18 U.S.C. § 2258A to report suspected CSAM on their platforms, regardless of where the user is located.
With India home to one of the largest digital populations in the world, it's no surprise that platforms are flagging content that appears to be uploaded or accessed from here.
But here’s the critical nuance: these numbers do not confirm the presence of unique child victims or establish the origin of the abusive material. A large proportion could include re-shared images (sometimes by horrified or unaware users), duplicates, or content routed through Indian IP addresses via proxies or anonymizers. Much of the material might have been created elsewhere, and many reports could be linked to a small number of files being shared multiple times.
Yet in the absence of clear explanation, these numbers risk misleading public discourse and reinforcing harmful narratives — suggesting, falsely, that India is uniquely unsafe for children online. This is not just inaccurate; it’s dangerous.
At the Centre for Social Research, we’ve worked with children, parents, educators, and survivors across the country, building digital literacy and advocating for safety-by-design systems. We’ve seen first-hand that India’s online safety landscape is shaped by multiple factors — low awareness, limited safeguards on platforms, weak enforcement, and cultural stigma around reporting sexual abuse.
To address this crisis, we need data that’s contextual, disaggregated, and localised. We need to know:
- How many of these reports involve Indian children?
- What are the patterns of abuse or sharing?
- What is the role of platforms in detecting and preventing harm?
- How many victims ever receive justice, care, or protection?
- Are digital safety tools and reporting mechanisms accessible in local languages and rural contexts?
- Where is the content being generated, and where is it being consumed or redistributed?
- How many of these reports lead to timely law enforcement action?
- What legal or regulatory gaps are enabling repeat circulation of abusive content?
- How are survivors and their families consulted in shaping policy or platform responses?
Without answers to these questions, we are designing solutions in the dark.
This is a call for research — independent, transparent, and intersectional. We need partnerships between platforms, civil society, academic institutions, and government to build an evidence base that informs targeted action, rather than reactionary blame.
India may top the charts in raw numbers. But unless we decode what’s underneath those numbers, we risk misdiagnosing the crisis — and failing the very children we seek to protect.
Share this blog. Ask your platforms how they handle CSAM. Talk to your school about online safety education. Change begins with informed voices, like yours.
