
Understanding Biometric Data Protection in India

Your fingerprints can't be reset like a password. India holds biometric data on 1.4 billion people through Aadhaar alone, and the legal protections around that data remain thinner than most citizens realize.

Rajesh Kumar
· 13 min read

A fingerprint isn't a password. You can't change it after a breach. You can't generate a new one and update your records. Once compromised, it stays compromised — permanently, irrevocably, for the rest of your biological life. The same goes for iris patterns, facial geometry, and voiceprints. This fundamental characteristic makes biometric data different from every other category of personal information, and it's the reason why its protection demands a level of seriousness that India's legal system has only partially delivered.

India operates the largest biometric database in human history. The Aadhaar system, administered by the Unique Identification Authority of India, holds fingerprints and iris scans of over 1.4 billion residents. That sentence is easy to read and hard to fully absorb. No country, no corporation, no entity of any kind has ever assembled a biometric repository of that scale. The closest comparison would be China's social credit-linked surveillance infrastructure, but even that doesn't have a single centralized biometric database comparable to UIDAI's. The sheer concentration of sensitive data in one system creates a risk profile that's difficult to reason about, because there's no historical precedent for a breach at that scale.

The Expanding Footprint of Biometric Collection

Aadhaar was the beginning, but biometric collection in India has spread well beyond it. DigiYatra, the facial recognition system for airport boarding, was operational at 24 Indian airports by the end of 2025. The stated purpose was convenience — skip the line by scanning your face instead of showing a boarding pass. That framing is appealing, and it seems like many travelers have adopted it without much hesitation. But DigiYatra captures and stores facial geometry data linked to your Aadhaar and flight itinerary. The data retention policies, as of early 2026, remain somewhat opaque. Civil liberties groups like the Internet Freedom Foundation have filed RTI requests seeking clarity on how long DigiYatra retains facial data and who has access to it. The responses have been incomplete.

Workplace biometric attendance systems represent another massive collection point that rarely gets discussed. Walk into any mid-sized Indian company's office or a government building and you'll probably find a fingerprint or face recognition terminal at the entrance. These systems are manufactured by dozens of companies — ZKTeco, Realtime Biometrics, BioMax, Matrix — and deployed with minimal regulatory oversight. The IT manager at a company with 500 employees now holds biometric data on all of them, stored on a local server or sometimes a cloud service, with security practices that vary wildly. There's no mandatory standard for how these systems store data, how long they retain it, or what happens to the biometric records when an employee leaves the organization. In theory, the DPDPA's consent requirements apply. In practice, refusing to use the biometric attendance system means you can't mark yourself present at work. The "consent" is about as voluntary as agreeing to your employer's dress code.
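Since no mandatory standard governs retention or deletion of these attendance templates, what a responsible policy might look like is worth spelling out. The sketch below is a hypothetical illustration, not any vendor's actual implementation: templates are purged automatically once a grace period after offboarding expires. The class name, field names, and the 30-day window are all assumptions chosen for the example.

```python
from datetime import date, timedelta

# Hypothetical grace period; not a legal standard — the DPDPA sets no such number.
RETENTION_AFTER_EXIT = timedelta(days=30)

class AttendanceStore:
    """Illustrative retention sketch: purge biometric templates after offboarding."""

    def __init__(self):
        self._templates = {}   # employee_id -> biometric template bytes
        self._exit_dates = {}  # employee_id -> date the employee left

    def enroll(self, employee_id: str, template: bytes) -> None:
        self._templates[employee_id] = template

    def offboard(self, employee_id: str, exit_date: date) -> None:
        self._exit_dates[employee_id] = exit_date

    def purge_expired(self, today: date) -> list:
        # Delete templates whose retention window has passed.
        purged = [e for e, d in self._exit_dates.items()
                  if today - d >= RETENTION_AFTER_EXIT and e in self._templates]
        for e in purged:
            del self._templates[e]
        return purged
```

Run as a scheduled job, a routine like `purge_expired` would make deletion the default rather than something that happens only if an IT manager remembers. Nothing in current Indian law requires it.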

Banking and financial services have woven biometric authentication deep into their operations. Aadhaar-based eKYC lets you open a bank account, get a SIM card, or complete insurance paperwork by providing a fingerprint scan that's verified against UIDAI's database. Micro-ATMs in rural areas use fingerprint authentication for cash withdrawals. This has been genuinely useful for financial inclusion — millions of people who couldn't provide traditional documentation now have access to banking services. But the authentication system's reliability has been questioned. Fingerprint scanners in rural micro-ATMs sometimes fail for manual laborers whose fingerprints have been worn down by years of physical work. Iris scanners fail for people with cataracts, a common condition in older rural populations. These failures aren't just inconveniences — they can lock people out of their own money.

Law enforcement's use of biometrics deserves its own scrutiny. The National Automated Fingerprint Identification System, or NAFIS, launched by the National Crime Records Bureau, aims to create a searchable database of fingerprints from crime scenes and arrested individuals across India. The Criminal Procedure (Identification) Act of 2022 expanded police powers to collect biometric data — including fingerprints, iris scans, and biological samples — from convicted persons, arrested persons, and even those detained under preventive detention laws. Critics have pointed out that this means biometric data can be collected from people who haven't been convicted of anything, and the Act's provisions for data deletion after acquittal are vaguely worded. City-level CCTV networks with facial recognition capabilities are operational in Hyderabad, Chennai, Delhi, and several other cities. Hyderabad's system, built in partnership with companies linked to Chinese surveillance technology, has been particularly controversial.

What the Law Actually Says — And What It Doesn't

The Digital Personal Data Protection Act of 2023 treats biometric data as personal data subject to consent requirements and purpose limitation. That's a baseline, not a ceiling. The problem is that the DPDPA doesn't create a special category for biometric data the way the EU's GDPR does. Under GDPR Article 9, biometric data used for identification is classified as "special category data" that requires explicit consent and is subject to stricter processing conditions. The DPDPA makes no such distinction — your shopping preferences and your iris scans receive, at least on paper, the same level of protection. Given the irreversible nature of biometric compromise, this feels like an oversight. It may well be an intentional one, because creating stricter protections for biometrics would impose constraints on Aadhaar and government surveillance programs.

The Aadhaar Act of 2016 provides some specific protections for biometric data collected under the Aadhaar system. Section 29 prohibits UIDAI from sharing core biometric information — fingerprints and iris scans — with anyone for any reason. They can't be used as evidence in court, can't be shared with police, and aren't supposed to be accessible even under a court order. This is a genuinely strong protection, at least as written. UIDAI has also implemented technical measures: biometric data is encrypted at the point of capture, stored in encrypted form, and matched through a one-way process that doesn't expose the raw biometric template. The Virtual ID system, introduced in 2018, lets you generate a temporary 16-digit number linked to your Aadhaar that can be used for authentication without revealing your actual Aadhaar number. These are thoughtful safeguards, and they probably represent the strongest biometric protections in Indian law. The limitation is that they only apply to Aadhaar data — not to the biometrics collected by DigiYatra, your employer's attendance system, your bank's eKYC process, or a housing society's facial recognition gate.
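The Virtual ID mechanism is, at its core, token indirection: a temporary random number stands in for the real identifier, and only UIDAI holds the mapping. UIDAI's actual implementation is internal and not public, so the following is a minimal conceptual sketch of the idea, with all class and method names invented for illustration.

```python
import secrets

class VirtualIDService:
    """Conceptual sketch of Virtual ID token indirection — not UIDAI's actual code.

    A temporary 16-digit VID maps to the real identifier server-side, so a
    relying party (bank, telecom) authenticates against the VID and never
    sees the underlying Aadhaar number.
    """

    def __init__(self):
        self._vid_to_id = {}  # VID -> real identifier, held server-side only

    def generate_vid(self, real_id: str) -> str:
        # Generating a new VID invalidates any earlier VID for the same identity.
        self._vid_to_id = {v: r for v, r in self._vid_to_id.items() if r != real_id}
        vid = "".join(str(secrets.randbelow(10)) for _ in range(16))
        self._vid_to_id[vid] = real_id
        return vid

    def authenticate(self, vid: str, biometric_ok: bool) -> bool:
        # The relying party submits only the VID; resolution to the real
        # identifier happens inside this service and is never returned.
        return vid in self._vid_to_id and biometric_ok
```

The privacy property comes from the one-way flow: the relying party's records contain only a disposable token, so a breach on their side exposes nothing that maps back to the Aadhaar number once the VID is regenerated.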

The Supreme Court's Puttaswamy judgment from 2017, which established privacy as a fundamental right under Article 21, provides the constitutional foundation for biometric protection. Justice D.Y. Chandrachud's opinion emphasized that informational privacy includes the right to control the dissemination of personal data, and that any state intrusion must satisfy tests of legality, necessity, and proportionality. The subsequent Aadhaar judgment in 2018 upheld the system's constitutional validity but imposed restrictions — Aadhaar couldn't be mandatory for bank accounts, mobile connections, or school admissions. Only government subsidy disbursement and income tax filing could require it. In practice, these restrictions have been unevenly followed. Several private entities continued to demand Aadhaar for services well after the judgment, relying on the practical reality that most Indians won't challenge such demands.

There's a gap in the legal framework that becomes apparent when you look at private sector biometric use. A housing society in Gurugram that installs facial recognition cameras at its entrance, a gym chain that uses fingerprint scanning for member check-in, a retail store that deploys facial recognition to track customer movement patterns — none of these are governed by the Aadhaar Act, and the DPDPA's general provisions, while applicable in theory, haven't been tested through enforcement actions. The consent mechanism is usually a clause buried in a terms-of-service agreement that nobody reads. Even if someone did read it and objected, the practical alternatives — "Don't live in this society," "Don't use this gym" — make meaningful consent fictional.

What Individuals Can Do — And What Requires Systemic Change

Individual protective measures exist but they're limited. Locking your Aadhaar biometrics through the UIDAI portal or the mAadhaar app is something every Indian should do. When biometrics are locked, they can't be used for authentication, which means if someone tries to use a cloned fingerprint or compromised biometric data to authenticate as you, the attempt will fail. You can temporarily unlock them when you need to complete an Aadhaar authentication and lock them again immediately afterward. It takes about two minutes. Using the Aadhaar Virtual ID system whenever possible reduces the exposure of your actual Aadhaar number. These are small steps, but they're the only ones fully within individual control.
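Conceptually, the biometric lock is a server-side flag checked before any matching is attempted: while the flag is set, an authentication request fails regardless of whether the presented biometric would have matched. The sketch below illustrates that check; it is an assumption-laden simplification, not UIDAI's implementation.

```python
class BiometricLockRegistry:
    """Conceptual sketch of the Aadhaar biometric lock — not UIDAI's actual code.

    While an identity is locked, authentication is rejected before matching
    is even attempted, so a cloned fingerprint is useless to an attacker.
    """

    def __init__(self):
        self._locked = set()

    def lock(self, identity_ref: str) -> None:
        self._locked.add(identity_ref)

    def unlock(self, identity_ref: str) -> None:
        self._locked.discard(identity_ref)

    def authenticate(self, identity_ref: str, biometric_matches: bool) -> bool:
        if identity_ref in self._locked:
            # Rejected up front, whether or not the biometric would match.
            return False
        return biometric_matches
```

This is why the lock-by-default, unlock-briefly-when-needed habit works: the window in which a stolen or cloned biometric is usable shrinks to the minutes you deliberately open it.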

The broader changes need to come from legislation and enforcement. India needs a biometric-specific regulatory framework — something that goes beyond the DPDPA's general provisions and addresses the unique risks of biometric data. This framework would ideally cover: mandatory data protection impact assessments before deploying any biometric system; strict purpose limitation preventing function creep; maximum retention periods with automatic deletion; security standards for biometric data storage; and meaningful penalties for breaches involving biometric data that reflect the irreversible nature of the harm. Jurisdictions like the US state of Illinois, with its Biometric Information Privacy Act, offer models worth studying — though India's context, with Aadhaar as national infrastructure, creates complications those models don't address.

The biometric data market in India also has a supply chain problem that's rarely examined. The fingerprint scanners, iris readers, and facial recognition cameras deployed across India are manufactured by a mix of Indian, Chinese, and Korean companies. Many of these devices have firmware that communicates with manufacturer servers for updates and diagnostics. The question of whether biometric data captured by a Chinese-manufactured scanner could be transmitted to servers outside India isn't hypothetical — it's a supply chain security concern that applies to thousands of devices installed in government offices, banks, and corporate buildings. India doesn't currently have certification standards for biometric hardware that address data exfiltration risks. The Bureau of Indian Standards has specifications for biometric device accuracy but not for data security at the hardware level. That gap is concerning given the sensitivity of what these devices capture.

The Data Protection Board of India, now that it's been fully constituted, will need to establish how it handles biometric-related complaints. The first few cases involving biometric data misuse will set important precedents. Whether the Board treats biometric complaints with heightened urgency, whether it imposes penalties that actually deter large entities, whether it takes on government programs or limits itself to private sector actions — these early signals will determine whether India's biometric protection regime has teeth or is merely decorative.

The international comparison is instructive, even if imperfect. The EU's GDPR classifies biometric data as "special category" under Article 9, subjecting it to the strictest processing requirements — explicit consent, documented necessity, and data protection impact assessments are all mandatory before any biometric system can be deployed. Illinois, in the United States, passed the Biometric Information Privacy Act in 2008, which requires informed written consent before biometric collection and creates a private right of action — meaning individuals can sue companies directly for violations. That private right of action has been enormously effective; settlements under BIPA have run into hundreds of millions of dollars, creating a financial deterrent that generic data protection rules simply don't provide. India has neither a biometric-specific consent requirement beyond the DPDPA's general provisions nor a private right of action for data protection violations. The DPDPA channels all complaints through the Data Protection Board, which may or may not prove to be an effective enforcement body. No individual can directly sue a company for biometric data misuse under current Indian law — they have to go through the Board or pursue general civil remedies through regular courts, which is slow and expensive.

The emerging use of biometrics in housing societies and gated communities across Indian cities raises questions that sit uncomfortably between privacy and security. Residential complexes in Bangalore, Hyderabad, Gurugram, and Noida have installed facial recognition systems at entry gates, marketed as security measures to prevent unauthorized access. Residents are required to register their faces; visitors are photographed and logged. The data is typically managed by the security vendor — a private company that may store facial recognition data on its own servers, with its own retention policies, accessible to its own employees. Residents who object often find that opting out means they can't use the automated gate and must wait for manual verification, creating a practical penalty for exercising privacy preferences. The Resident Welfare Associations that approve these systems rarely conduct due diligence on the vendor's data practices. Nobody asks where the data goes after you move out of the apartment, or what happens if the security vendor gets hacked, or who's liable if facial recognition data from a housing society ends up being sold or misused.

Voice recognition systems are an emerging biometric category that's growing in India without much public notice. Banks are increasingly using voiceprint analysis for phone-based customer authentication — when you call your bank's helpline and speak to verify your identity, your voice characteristics may be analyzed and stored as a biometric identifier. Telecom companies are exploring similar technology for customer service calls. Unlike fingerprint or iris scanning, voice biometric collection often happens without the person being clearly aware that a biometric is being captured. You think you're just talking to a customer service agent; the system is quietly building a voiceprint profile.

Schools in India are beginning to experiment with biometric attendance for students, particularly in government schools where monitoring attendance is tied to funding and mid-day meal program allocation. Children's biometric data is being collected without meaningful parental consent — parents are "informed" through circulars or school apps, but refusing isn't practically an option when attendance is mandatory and the biometric system is the only way to record it. The DPDPA's provisions on children's data, once the implementing rules are finalized, will presumably apply to these systems. But the gap between a legal requirement and its enforcement in a government school in rural Madhya Pradesh or Uttar Pradesh is significant. The children whose biometric data is being collected today will have that data in government databases for potentially their entire lives, collected at an age when they had no capacity to consent and their parents had no practical ability to refuse.

I think about this topic differently since something that happened to my father last year. He went to a government office in Lucknow to update his pension records. The clerk asked him to provide a fingerprint scan for Aadhaar verification. His fingerprints didn't register — he's 73, and years of manual work in his younger days left his prints faded and difficult for scanners to read. The clerk told him to come back another day, as if the failure of the technology to accommodate a common condition in elderly Indians was my father's problem to solve. He made three trips before someone suggested trying iris authentication instead, which worked. No one at that office apologized or acknowledged the flaw. The system assumed his body would conform to its requirements, and when it didn't, the system simply refused him. That small indignity, multiplied across millions of elderly and working-class Indians, is perhaps the most concrete illustration of why biometric governance matters — not just for privacy, but for basic dignity and access.


Written by

Rajesh Kumar

Founder & Chief Editor

Rajesh Kumar is a cybersecurity expert with over 12 years of experience in digital privacy and data protection. He has worked with CERT-In and various Indian enterprises to strengthen their data security practices. He founded PrivacyTechIndia to make privacy awareness accessible to every Indian.
