
CCTV and Surveillance: Privacy Rights in Indian Public Spaces

India's streets are filling up with CCTV cameras faster than anyone can count them, yet there's barely a rulebook for who watches, who stores the footage, or how long they keep it. Here's what that means for your privacy.

Sneha Reddy
·13 min read

Okay so here's something that probably won't surprise you: India is now one of the most camera-covered countries on the planet. Walk through any mid-size city and you'll spot dome cameras on traffic poles, bullet cameras outside shops, little blinking units tucked into the corners of metro stations. They're everywhere. In early 2025, one estimate from the Comparitech surveillance index put India's total CCTV count above 80 million, with some projections heading toward 100 million by late 2026. That's a staggering number, and it raises a question nobody in power seems eager to answer: who's actually watching all of this, and under what rules?

I've been thinking about this for a while now, and the more you dig into it the stranger the picture gets. We've got cameras going up at a pace that would make London or Beijing do a double-take, but the legal scaffolding holding the whole thing together is, to put it gently, thin. Maybe nonexistent in places. There's a genuine tension between the promise of public safety and the reality of what mass recording does to ordinary people just living their lives. Let's walk through the whole mess.

How We Got to Millions of Cameras

India's camera boom didn't happen overnight, but it did accelerate sharply in the last five or six years. The Safe City programme, backed by the Ministry of Home Affairs, kicked off around 2018 with an initial rollout in eight cities: Ahmedabad, Bengaluru, Chennai, Delhi, Hyderabad, Kolkata, Lucknow, and Mumbai. The stated goal was to make public spaces safer, especially for women, by deploying CCTV networks linked to AI-powered command centres. Each of these cities received hundreds of crores in central funding to buy hardware, build fibre networks, and install monitoring rooms staffed around the clock.

Hyderabad probably went furthest. The Telangana government claims the city runs over 600,000 cameras, many of them integrated with a centralized Command and Control Centre. Police there say they can track a vehicle across the city in near real-time using automatic number plate recognition. Delhi, meanwhile, took a slightly different path. The AAP government launched a scheme to install CCTV cameras in residential colonies, handing the job to Resident Welfare Associations and offering free installation. By mid-2024, reports suggested more than 300,000 cameras had gone up across the capital under this scheme alone, on top of the tens of thousands already operated by Delhi Police and the central government.

What's interesting, and somewhat unsettling, is that private cameras now likely outnumber government ones. Shops, apartment complexes, offices, malls, schools, hospitals. It seems like every building with a front door has at least one camera pointed at it. The cost of a basic four-camera DVR setup has dropped below Rs 10,000, which puts surveillance technology within reach of practically any business or middle-class household. That democratization of surveillance means the sheer volume of footage being generated every day across India is almost impossible to calculate.

The Safe City Machine and What It Actually Does

The Safe City projects aren't just about cameras. They're about connecting cameras to software that can think, or at least sort data faster than a human operator ever could. Facial recognition is part of the package in several cities. Chennai's system, for instance, was reported to include facial recognition capabilities as early as 2019. Hyderabad Police have openly talked about using it to identify suspects in crowds during festivals and public gatherings.

Automatic number plate recognition is another big piece. ANPR cameras mounted at intersections and toll plazas can log every vehicle that passes. String enough of these together and you've got a city-wide vehicle tracking network. In theory, this helps catch stolen cars and trace getaway vehicles. In practice, it also means that your daily commute, your weekend drives, where you park and for how long, all of that gets logged and stored somewhere.
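The privacy implication of "string enough of these together" is easy to show concretely. Here's a minimal sketch of how a movement trail falls out of ANPR logs once they're centralized. The data, locations, and log format are entirely made up for illustration; real deployments use proprietary systems.

```python
from datetime import datetime

# Hypothetical ANPR log entries: (plate, camera location, timestamp).
# Purely illustrative -- not any real system's schema.
sightings = [
    ("KA01AB1234", "Silk Board junction", "2025-01-10 08:05"),
    ("KA01AB1234", "Koramangala 80ft Rd", "2025-01-10 08:22"),
    ("DL3CAB9999", "ITO crossing", "2025-01-10 08:30"),
    ("KA01AB1234", "Indiranagar 100ft Rd", "2025-01-10 08:41"),
]

def route_for(plate, log):
    """Return one vehicle's sightings in time order -- a de facto movement trail."""
    hits = [(datetime.strptime(ts, "%Y-%m-%d %H:%M"), loc)
            for p, loc, ts in log if p == plate]
    return [loc for _, loc in sorted(hits)]

# Three camera hits are already enough to reconstruct a commute.
print(route_for("KA01AB1234", sightings))
```

No single camera knows your route. The trail only exists because the logs are joined in one place, which is exactly what a city-wide command centre does at scale.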

Crowd behaviour analytics add another layer. Some systems flag "unusual" movement patterns, like someone standing still for too long in a busy area or a group forming quickly. The underlying AI models were mostly trained on datasets from other countries, which raises its own set of questions about accuracy and bias when applied to Indian crowds, Indian clothing, Indian complexions. I haven't seen any published audit of how well these systems actually perform here. We're sort of taking the vendors' word for it, and the vendors are selling a product.
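To make the "standing still for too long" flag less abstract, here's a toy version of that rule. Real systems use learned models on video, not hand-written geometry like this; the thresholds and track data below are invented for illustration.

```python
def flags_loitering(track, radius_m=2.0, dwell_s=120):
    """track: list of (t_seconds, x_m, y_m) positions for one tracked person.
    True if they stay within radius_m of some earlier point for dwell_s or more.
    Toy rule-based stand-in for commercial 'unusual behaviour' analytics."""
    for i, (t0, x0, y0) in enumerate(track):
        for t1, x1, y1 in track[i + 1:]:
            if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > radius_m:
                break  # moved away from this anchor point; try the next one
            if t1 - t0 >= dwell_s:
                return True
    return False

waiting = [(t, 0.0, 0.0) for t in range(0, 180, 10)]       # stands still for 3 min
walking = [(t, t * 1.4, 0.0) for t in range(0, 180, 10)]   # walks at ~1.4 m/s

print(flags_loitering(waiting), flags_loitering(walking))  # True False
```

Notice what the flag actually measures: waiting. Someone waiting for a friend, a bus, or a delivery trips the same rule as the behaviour it's supposedly catching, which is why threshold choices and audits matter so much.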

Where's the Law?

This is where things get genuinely weird. India, a country with 80-plus million surveillance cameras, does not have a single dedicated law that governs CCTV use. There's no statute that spells out where cameras can and can't be placed, how long footage can be kept, who gets to access it, or what happens when the data is misused. Just... nothing specific. It's a regulatory vacuum that would probably shock anyone used to the EU's approach, where CCTV in public spaces has to comply with GDPR's strict requirements around lawful basis, data minimisation, storage limitation, and transparency.

What we do have are scattered provisions in other laws. The Information Technology Act, 2000 has some broad language about data protection under Section 43A and the SPDI Rules of 2011, but it wasn't written with CCTV in mind. The Digital Personal Data Protection Act (DPDPA) of 2023 is the more recent entry, and it does cover "digital personal data," which could theoretically include digitized CCTV footage. But the Act gives wide exemptions to the government, especially when processing data in the interest of sovereignty, security, or public order. If the state wants to run surveillance cameras, the DPDPA, as currently written, probably won't stop it.

There are also state-level police regulations and municipal orders that sometimes reference CCTV, but these tend to be operational guidelines rather than enforceable privacy rules. A few city police commissionerates have issued circulars requiring shops and businesses to install cameras and share footage on demand. Think about that for a second: the government is both mandating that private parties create surveillance infrastructure and then claiming access to the data it produces, all without any clear legal framework governing how that data is handled afterward.

Puttaswamy and the Privacy Promise

The big legal anchor here, and really the only one with any teeth, is the Supreme Court's 2017 ruling in K.S. Puttaswamy v. Union of India. A nine-judge bench unanimously declared that the right to privacy is a fundamental right, protected under Article 21 of the Constitution. Justice D.Y. Chandrachud, writing the lead opinion, laid out a three-part test that any state intrusion into privacy must satisfy. The intrusion has to be sanctioned by law. It has to serve a legitimate aim. And it has to be proportionate, meaning the measure shouldn't be more invasive than what's strictly needed to achieve that aim.

On paper, this should be a powerful check against mass surveillance. If you run every large-scale CCTV deployment through the Puttaswamy test, most of them look shaky. Are they sanctioned by law? Not by any CCTV-specific statute. Do they serve a legitimate aim? Sure, public safety counts, probably. But are they proportionate? That's where the argument falls apart. Recording everyone, everywhere, all the time, storing footage indefinitely, layering on facial recognition that can identify individuals in a crowd, none of that looks like a "proportionate" response to the general goal of making streets safer. A targeted camera at a known crime hotspot is proportionate. Blanketing an entire city with AI-linked surveillance is something else entirely.

The trouble is, Puttaswamy sets a principle, not a regulation. Someone has to actually challenge specific surveillance programmes in court for the test to be applied, and litigation in India moves slowly. The Internet Freedom Foundation (IFF) and other civil liberties groups have filed petitions against specific instances of facial recognition use, but a broad challenge to CCTV surveillance itself hasn't made it to a decisive judgment yet. So the privacy right exists in theory while the cameras keep multiplying in reality.

Facial Recognition: The Part That Should Worry You Most

If plain CCTV is a privacy concern, facial recognition technology (FRT) is that concern multiplied by a hundred. Regular cameras record what happens. FRT identifies who's there. It turns a passive recording into an active identification system, and India has been deploying it without any specific legal authorization, without public debate, and without independent oversight.

The National Crime Records Bureau (NCRB) has been developing a National Automated Facial Recognition System (AFRS) since at least 2019. The plan, based on procurement documents that became public through RTI requests, is to build a centralized database that can match faces captured by CCTV cameras against photos from passports, driving licences, and mugshot databases. State police departments were expected to feed into this system and draw from it. The NCRB's original Request for Proposal described a system capable of handling massive volumes of image data and matching faces in "near real-time."
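The AFRS's internals aren't public, but the general technique behind any face-matching system is well understood: convert each face into a numeric embedding, then compare embeddings with a similarity score against a threshold. Here's a sketch of that general idea with tiny made-up vectors; the gallery names, vector sizes, and threshold are all hypothetical.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

probe = [0.9, 0.1, 0.4]                           # embedding from a CCTV frame
gallery = {"licence_001": [0.88, 0.12, 0.41],     # enrolled database photos
           "licence_002": [0.10, 0.90, 0.20]}     # (hypothetical records)

THRESHOLD = 0.95  # where this is set decides the false-match rate

for record, vec in gallery.items():
    score = cosine(probe, vec)
    print(record, round(score, 3), "MATCH" if score >= THRESHOLD else "no match")
```

The single most consequential line is `THRESHOLD`. Set it low and the system "finds" people who were never there; set it high and it misses real matches. Who picks that number, and against what tested error rates, is precisely the kind of detail no Indian deployment has disclosed.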

Delhi Police used FRT during the 2020 anti-CAA protests, reportedly to identify participants from CCTV footage. That's the scenario that chills civil liberties advocates: surveillance technology designed for catching criminals being turned on political protesters. And there's very little stopping that from happening again, because there's no law that restricts what FRT can be used for or requires judicial authorization before it's deployed.

The accuracy problem is also real. Facial recognition systems trained primarily on lighter-skinned populations have well-documented higher error rates on darker skin tones. India's population is enormously diverse, and I'm not aware of any publicly available testing data showing how the systems being deployed here perform across different demographics. Misidentification isn't just an inconvenience. It could mean wrongful detention, harassment, or worse. The Gender Shades study by researchers at the MIT Media Lab found that some commercial facial recognition systems had error rates above 30% for darker-skinned women. We should probably be asking whether the systems Indian police are buying perform any better.

Data Retention: The Black Hole Nobody Talks About

Here's a question I haven't been able to get a satisfying answer to from any government source: how long is CCTV footage stored? The honest answer seems to be "it depends, and nobody's really checking." Some police systems reportedly retain footage for 30 days before overwriting. Others claim to keep it for 90 days. Private installations might keep footage for a week or a year depending on storage capacity. There's no mandated standard.
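The "it depends on storage capacity" part is just arithmetic: most DVRs record in a loop and overwrite the oldest footage when the disk fills. Here's a back-of-the-envelope sketch; the disk size, camera count, and bitrate are illustrative assumptions, not any specific product's behaviour.

```python
def retention_days(disk_tb, cameras, bitrate_mbps):
    """Roughly how many days of footage a loop-recording DVR holds
    before it starts overwriting. Illustrative arithmetic only."""
    bytes_per_day = cameras * bitrate_mbps * 1e6 / 8 * 86_400
    return disk_tb * 1e12 / bytes_per_day

# A typical budget setup: 4 cameras at ~2 Mbps each on a 1 TB drive.
print(f"~{retention_days(1, 4, 2):.0f} days, then the oldest footage is gone")
```

So a small shop's cameras might self-erase in under two weeks, while a well-funded command centre with petabytes of storage could keep everything for months or years. Without a mandated standard, retention is set by hard-drive budgets, not by law.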

When you layer on facial recognition and analytics, the storage question becomes even more pointed. A face template, the mathematical representation of someone's facial features extracted by FRT software, takes up very little storage compared to raw video. You could feasibly store millions of face templates indefinitely on a modest server setup. So even if the video gets overwritten, the biometric data derived from it might persist for years. Nobody's disclosed whether that's happening, and there's no audit mechanism to find out.
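The asymmetry between video and templates is worth putting in numbers. The figures below are rough assumptions (a 512-float embedding, a 2 Mbps camera stream), not measurements of any deployed system, but the orders of magnitude are the point.

```python
# Back-of-the-envelope comparison -- assumed figures, not vendor specs:
TEMPLATE_BYTES = 512 * 4       # one face embedding: ~512 floats x 4 bytes = 2 KB
VIDEO_BITRATE_BPS = 2_000_000  # one modest 1080p camera stream at 2 Mbps

video_per_day_gb = VIDEO_BITRATE_BPS / 8 * 86_400 / 1e9  # bytes/day -> GB
templates_per_tb = 1e12 / TEMPLATE_BYTES                  # templates on 1 TB

print(f"One camera, one day of raw video: ~{video_per_day_gb:.0f} GB")
print(f"Face templates that fit in one terabyte: ~{templates_per_tb / 1e6:.0f} million")
```

Under these assumptions, a single terabyte that holds about six weeks of one camera's video could instead hold hundreds of millions of face templates. Overwriting the video while quietly keeping the templates is cheap, easy, and completely invisible from the outside.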

The European approach here is instructive, even if India isn't bound by it. Under the GDPR, CCTV operators must define and publish retention periods, delete footage when it's no longer needed for the stated purpose, and justify any extended retention. Data Protection Impact Assessments are required for large-scale surveillance. In India, there's no equivalent requirement. A government body or private business can hold onto footage, and anything derived from it, for as long as they want, and you'd never know.

What It Feels Like on the Ground

I talked to a few people in Delhi and Bengaluru about their experience with CCTV cameras in their neighbourhoods. Most of them didn't think about it much, which is itself part of the problem. Surveillance becomes normalized so quickly that people stop noticing the cameras or questioning why they're there. One RWA president in a South Delhi colony told me that residents initially resisted cameras pointing at individual homes, but "everyone got used to it after a few months." He said the footage was stored on a hard drive in the colony guard's room and "anyone on the RWA committee could come watch it."

That level of access, informal, unregulated, controlled by whoever happens to be in charge of a housing society, should concern us. Domestic abuse cases where a controlling partner monitors a spouse's movements through colony CCTV, neighbourhood disputes where footage is weaponized, stalking enabled by knowing someone's routine from camera recordings. These aren't hypothetical scenarios. They've been reported in Indian media, though they rarely get connected to the bigger policy conversation about surveillance governance.

A software engineer in Bengaluru told me something that stuck with me. "I just assume I'm being recorded everywhere," she said. "At work, on the road, at the mall, in my apartment parking lot. I've stopped caring, I guess." That kind of resigned acceptance is what surveillance studies scholars call the "chilling effect" in action. People don't change their behaviour because they made a conscious choice. They just gradually stop doing things that might look unusual on camera. They self-censor without realizing it.

How the EU Does Things Differently

It's worth looking at the GDPR model, not because India should copy it wholesale, but because it shows what a regulated version of public surveillance looks like. In the EU, any CCTV system processing personal data must have a lawful basis, typically "legitimate interest" for private operators or a specific legal provision for law enforcement. Signage is mandatory. You have to know you're being filmed, and you have to be able to find out who's filming you and why.

The EU's approach to facial recognition is even stricter. The EU AI Act, which entered into force in 2024, places real-time biometric identification in public spaces in its "prohibited" category with only narrow exceptions for law enforcement, and those exceptions require prior judicial authorization. That's a world apart from India, where police departments can deploy FRT on a procurement budget without asking a court, a legislature, or the public.

Data Protection Impact Assessments under the GDPR force organizations to think before they install cameras. They have to document what they're capturing, why, who'll access it, how long they'll keep it, and what safeguards are in place. If the assessment reveals high risk, the supervisory authority (the local data protection regulator) gets consulted. None of this exists in India. Cameras go up because someone decided they should, and that's the end of the analysis.

Things You Can Actually Do About It

I don't want to end on pure pessimism, because individuals aren't completely powerless here. There are concrete steps you can take, even within the messy legal framework we've got.

First, use the Right to Information Act. RTI is still one of the sharpest tools Indian citizens have. File applications with your municipal corporation, police commissionerate, or the relevant ministry asking how many cameras are installed in your area, what technology they use (including whether FRT is active), who has access to the footage, what the retention policy is, and whether any Data Protection Impact Assessment was conducted. The questions themselves force accountability, and the answers, or the refusal to answer, can be revealing.

Second, know what the DPDPA gives you, even in its current limited form. The Act includes a right to access your personal data and a right to erasure in certain circumstances. If you're captured on a CCTV system that falls under the Act's scope, you could theoretically request a copy of footage containing your image or ask for it to be deleted. Whether that'll actually work in practice with a police department or a government surveillance centre is another question, but asserting the right creates a record and builds pressure for clearer rules.

Third, pay attention to what's happening in your neighbourhood. If your RWA is installing cameras, ask about the data retention policy. Ask who has access to the footage. Ask whether cameras will be pointed at individual front doors or private balconies, and push back if they are. These conversations happen at the colony meeting level, and that's exactly where they should happen. Surveillance governance doesn't have to start in Parliament. It can start in your apartment complex.

Fourth, support the organizations doing the legal heavy lifting. The Internet Freedom Foundation has been at the forefront of challenging facial recognition and surveillance overreach through litigation and public advocacy. The Software Freedom Law Centre is another. These groups file the RTIs, bring the court challenges, and publish the reports that make this conversation possible. They run on donations and volunteer energy.

Fifth, and this one might sound small, but it matters: talk about it. The biggest advantage surveillance has is that people don't think about it. When you point out a camera cluster to a friend and say "did you know there's no law governing how that footage is used?", you've started something. Public awareness is the precondition for political will.

Where This Goes From Here

There's been some noise in policy circles about a dedicated surveillance reform bill, but nothing concrete has emerged as of early 2026. The Parliamentary Standing Committee on Home Affairs asked pointed questions about facial recognition during a hearing in late 2024, and some MPs raised concerns about the absence of safeguards. But hearings aren't legislation, and there's no draft bill in circulation that I'm aware of.

The tension isn't going away. If anything, it's getting sharper. Camera prices keep falling. AI analytics keep getting more capable. Smart city projects keep expanding. Every new installation creates more data, more potential for misuse, and more pressure on a legal framework that was never designed to handle any of it. The Puttaswamy judgment gave us the right to privacy. What we haven't built yet is the infrastructure of rules and institutions that would make that right actually mean something when you're walking down a street lined with cameras.

So the question that keeps nagging at me is this: at what point does a city go from "monitored for safety" to "surveilled by default"? And when it does, who gets to decide that the trade-off was worth it: the people being watched, or the people doing the watching?


Written by

Sneha Reddy

Digital Rights Advocate

Sneha Reddy is a digital rights advocate focused on internet freedom and surveillance in India. She works at the intersection of technology and policy, helping citizens understand their digital rights under Indian law.
