
World Down Syndrome Day and Digital Accessibility: Inclusive Privacy

I've been thinking about who gets left out when we design privacy tools and consent systems. On World Down Syndrome Day, that question feels more pressing than usual.

Sneha Reddy

I've been thinking a lot about who we mean when we say "users" in conversations about digital privacy. We mean people who can read dense legal text. We mean people who understand what "data processing" and "third-party sharing" refer to. We mean people who can work through settings menus, compare options, and make informed decisions about abstract trade-offs between convenience and personal data. We mean, implicitly, people without cognitive disabilities. And that assumption is so baked into how we build privacy systems that most people never question it.

March 21 is World Down Syndrome Day, chosen because Down syndrome involves a third copy of the 21st chromosome — 3/21, the 21st of March. It's a day meant to raise awareness about the rights, inclusion, and dignity of people with Down syndrome and, more broadly, people with intellectual disabilities. I want to use this as a starting point for a conversation that privacy circles rarely have: what happens to your right to privacy when the systems built to protect it weren't designed for the way your mind works?

India has an estimated 32,000 children born with Down syndrome each year, and over a million people living with the condition. Intellectual disabilities more broadly affect a much larger population. As more aspects of daily life move online — banking, healthcare, government services, education — these individuals are required to engage with digital systems that assume a specific level of cognitive processing. When the system's assumption and the user's reality don't match, the gap isn't a minor inconvenience. It's a fundamental failure of the framework we claim protects everyone equally.

Privacy law, including India's DPDPA 2023, is built on consent. You agree to let a company process your data. You choose what to share. You control your information. The entire architecture rests on the idea that consent is meaningful — that the person clicking "I agree" understands what they're agreeing to.

Now consider what a typical privacy policy looks like. Even the shorter ones run several thousand words of legal language. They reference concepts like "data fiduciary," "processing purposes," "legitimate interests," "data retention periods," and "cross-border transfers." I write about privacy for a living and I sometimes have to read these documents twice. For a person with Down syndrome or another intellectual disability — someone who may have difficulty with abstract reasoning, complex sentence structures, or the concept of future consequences from present choices — these documents are meaningless. Not difficult. Meaningless.

When consent is meaningless, it's not consent. It's a formality. The person taps "Agree" because the button is there and they want to use the app, not because they've made an informed decision about data processing. And the legal system treats that tap the same as it treats an informed decision by someone who read and understood every clause. That's a structural injustice, and I don't think that's too strong a word for it.

The DPDPA recognises that children can't provide meaningful consent, which is why it requires parental consent for minors. But for adults with intellectual disabilities, the situation is murkier. An adult with Down syndrome might live independently, hold a job, and manage many aspects of their own life. They're not a child. They have autonomy and deserve to have it respected. Requiring guardian consent for everything would be paternalistic and would deny their agency. Accepting their "consent" to incomprehensible legal text would be performative and would deny their dignity in a different way. The answer lies somewhere in between, in what disability rights advocates call supported decision-making — helping someone understand their choices without making those choices for them. But I haven't seen a single Indian app or platform implement anything resembling this.

Assistive Technology and the Privacy Tax

People with intellectual and developmental disabilities often rely on assistive technologies to interact with digital systems. Screen readers for those with visual impairments are well-known, but the assistive technology picture is broader than that. There are simplified interface apps that reduce cognitive load, alternative keyboards for people with motor or cognitive differences, communication apps for non-verbal individuals, and routine management apps that help structure daily activities.

Each of these tools typically requires elevated permissions on the device. A simplified launcher might need access to all installed apps. A communication app might need microphone access, contact list access, and the ability to stay visible over other apps. A routine management app might need location services, calendar access, and notification control. These permissions are legitimate — the tools genuinely need them to function. But they also create a broader attack surface and more opportunities for data collection. The person using these tools ends up sharing more data with more apps than a typical user, not because they chose to, but because the tools they depend on require it.

There's a term I've seen used in disability and technology circles: the privacy tax. People with disabilities pay it in the form of extra data they must share just to access the same services that able-bodied users access without those costs. A person who can type their own messages doesn't need a predictive communication app tracking their word patterns and sending them to a cloud server. A person who relies on such an app has no choice — the cloud processing is part of how the tool works. The privacy cost is inherent in the accommodation, and that cost isn't distributed equally.

Caregivers, Shared Accounts, and Blurred Boundaries

Many individuals with intellectual disabilities have caregivers — parents, siblings, professional support workers — who help manage their digital lives. Sometimes this means the caregiver has access to the individual's phone, their passwords, their email. Sometimes multiple people share a single device or a single set of credentials. From a privacy perspective, this creates a tangle of questions that our current frameworks aren't equipped to answer.

Whose data is it when a caregiver creates and manages an Aadhaar-linked account for a person with Down syndrome? The account belongs to the individual, legally. But the caregiver controls it, practically. If the caregiver shares health data with a support organisation — say, to enrol the person in a therapy programme — was that the individual's decision? Did they understand what was being shared? Did anyone ask? In many cases, the answer is that practical necessities override formal consent processes, and the individual's data gets shared in ways that technically violate data protection principles but happen anyway because there's no workable alternative.

This isn't a problem that can be solved by writing a better privacy policy. It requires rethinking how consent works in the context of supported living. Some models worth exploring: co-consent, where both the individual and a trusted supporter confirm a data sharing decision together; tiered access, where a caregiver can manage certain aspects of a digital account without having access to everything; and audit trails visible to the individual (or their advocate) showing what data has been accessed and by whom. None of these are standard practice anywhere, but they represent directions that could make the system more just.
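None of these models requires exotic technology. As a purely illustrative sketch — every class, field, and method name here is my own invention, not any existing standard or library — a co-consent record with a built-in audit trail could be as simple as this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: a data-sharing decision takes effect only when BOTH
# the individual and a trusted supporter confirm it, and every confirmation
# is logged so the individual (or their advocate) can audit what was shared.

@dataclass
class ConsentEvent:
    actor: str          # "individual" or "supporter"
    action: str         # "confirmed" or "revoked"
    timestamp: datetime

@dataclass
class CoConsentDecision:
    purpose: str                # plain-language description of the sharing
    data_categories: list[str]  # e.g. ["health records"]
    events: list[ConsentEvent] = field(default_factory=list)

    def confirm(self, actor: str) -> None:
        self.events.append(
            ConsentEvent(actor, "confirmed", datetime.now(timezone.utc)))

    def is_granted(self) -> bool:
        # Granted only when both parties have confirmed and neither revoked.
        confirmed = {e.actor for e in self.events if e.action == "confirmed"}
        revoked = {e.actor for e in self.events if e.action == "revoked"}
        return {"individual", "supporter"} <= confirmed and not revoked

    def audit_trail(self) -> list[str]:
        # Human-readable log the individual or their advocate can review.
        return [f"{e.timestamp.isoformat()}: {e.actor} {e.action} "
                f"sharing of {', '.join(self.data_categories)}"
                for e in self.events]

decision = CoConsentDecision(
    purpose="Enrol in weekly speech therapy programme",
    data_categories=["health records"])
decision.confirm("individual")
print(decision.is_granted())   # False: supporter has not confirmed yet
decision.confirm("supporter")
print(decision.is_granted())   # True: both parties have confirmed
```

The point of the sketch is the shape of the data, not the code itself: the decision belongs to the individual, the supporter's confirmation is additive rather than substitutive, and the record of who did what stays visible to the person it concerns.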

What Inclusive Privacy Design Could Look Like

Inclusive privacy isn't about dumbing things down. It's about designing systems that communicate clearly to a wider range of minds. The solutions already exist in other fields — we just haven't applied them to privacy.

Easy Read is a format developed for people with intellectual disabilities that uses short sentences, common words, and pictures alongside text. It's been mandated in some contexts in the UK and Europe. An Easy Read version of a privacy policy would replace "We may process your personal data for the purpose of personalised content delivery" with something like "We look at what you do in this app to show you things we think you'll like" next to a simple illustration. This isn't a hypothetical — organisations like Inclusion Europe have published guidelines for creating Easy Read documents, and some European government agencies already publish Easy Read versions of policy documents.

No Indian company that I'm aware of offers an Easy Read version of their privacy policy. The concept doesn't appear in the DPDPA or its draft rules. It should. The law mandates that consent be informed, but it doesn't require that information be communicated in accessible formats. That's a gap, and an easy one to close if regulators cared to.

Visual consent mechanisms could replace text-heavy consent forms with icon-based representations. Instead of paragraphs about data categories, show icons: a camera icon for photo access, a location pin for location data, a phone icon for call logs, with clear visual indicators (green for allowed, red for denied, a lock for encrypted). The Privacy Icons project proposed something like this years ago. Some apps have started using visual permission explanations — Apple's iOS does this reasonably well in its permission dialogs — but privacy consent for data processing (as opposed to device permissions) remains stubbornly text-based.
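To make the idea concrete, here is a hypothetical sketch of a consent screen that pairs each permission with an icon, a colour-coded state, and an Easy Read sentence. The icons, states, and wording are illustrative inventions, not drawn from the Privacy Icons project or any existing standard:

```python
# Hypothetical mapping from permissions to visual consent elements:
# an icon, a colour-coded state, and a plain-language one-liner,
# instead of a paragraph of legal text.

EASY_READ_PERMISSIONS = {
    "camera":   {"icon": "📷", "easy_read": "This app can take photos."},
    "location": {"icon": "📍", "easy_read": "This app knows where you are."},
    "contacts": {"icon": "📇", "easy_read": "This app can see your contacts."},
}

# Green for allowed, red for denied, a lock for encrypted data.
STATE_COLOURS = {"allowed": "green", "denied": "red", "encrypted": "lock"}

def render_consent_row(permission: str, state: str) -> str:
    """Build one visual consent row: icon + colour cue + Easy Read sentence."""
    entry = EASY_READ_PERMISSIONS[permission]
    return f"{entry['icon']} [{STATE_COLOURS[state]}] {entry['easy_read']}"

for perm, state in [("camera", "allowed"), ("location", "denied")]:
    print(render_consent_row(perm, state))
```

A real implementation would render actual icons and colours rather than text tokens, and the Easy Read sentences would need testing with cognitively diverse users — but the structure is this simple: one permission, one picture, one sentence, one clear state.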

Biometric access for password managers and security tools helps people who struggle with text-based passwords. If unlocking Bitwarden requires only a fingerprint instead of typing a master passphrase, it becomes accessible to someone who can't manage complex text entry. Most password managers already support biometric unlock, so this is less a design change and more an awareness point for caregivers setting up devices for people with disabilities.

2FA that doesn't depend on reading SMS codes matters too. App-based 2FA with a simple "approve/deny" button is easier than reading a six-digit code from a text message and typing it into another app. Push-notification-based authentication (like Google's "Is this you? Tap yes to sign in") is perhaps the most accessible form, as it reduces the action to a single tap on a clear question. These alternatives exist but aren't always offered as the primary option.

The Rights of Persons with Disabilities Act, 2016 (RPwD Act) mandates equal opportunity, non-discrimination, and accessibility in all aspects of life. It recognises intellectual disabilities. It requires that information and communication technologies be made accessible. Read alongside the DPDPA, the RPwD Act should compel platforms to make their data protection mechanisms accessible to people with cognitive disabilities. In practice, the two laws operate in separate regulatory silos — the Data Protection Board handles the DPDPA, the Chief Commissioner for Persons with Disabilities handles the RPwD Act — and nobody's connecting the dots.

I keep thinking about this intersection because it reveals something about how we design systems. When engineers build a consent flow, they think about legal compliance — does this checkbox meet the DPDPA's requirements? When designers build it, they think about conversion rates — how many users complete the flow without dropping off? Neither group typically asks: can a person with Down syndrome understand what this screen is asking them to do? And until someone asks that question during the design process, the answer will keep being no.

A few things that would make a real difference. The DPDPA's rules (which are still being finalised) should require that consent mechanisms be available in accessible formats, including Easy Read and visual formats. Platforms should be required to offer simplified privacy settings — not buried in menus, but surfaced during onboarding with clear, visual explanations. The Data Protection Board, once operational, should include disability rights expertise among its members or advisors. Accessibility audits of privacy interfaces should be part of compliance assessments.

Will any of this happen soon? I'd like to think so, but I'm honestly not sure. The disability rights community in India has won significant legal protections, but enforcement remains patchy. The privacy community in India is growing but is still dominated by the concerns of able-bodied, tech-literate users. The intersection — people thinking about both disability rights and digital privacy — is a small community, and small communities struggle to move policy.

What Other Countries Are Doing (and What India Could Learn)

The UK's Age Appropriate Design Code (also known as the Children's Code) requires that digital services likely to be accessed by children provide privacy settings that are "easy to understand and use." While it focuses on children rather than adults with intellectual disabilities, the principle extends naturally. If you design for the least text-literate user, everyone benefits. The concept is sometimes called the "curb cut effect" — the ramps cut into pavements for wheelchair users also help people with prams, delivery workers with trolleys, and anyone pulling a suitcase. Accessible design serves a wider audience than the one it was intended for.

The European Accessibility Act, which takes effect in 2025, mandates accessibility in a range of digital products and services. It's not perfect, and enforcement will be the real test, but it establishes the principle that accessibility is a legal requirement, not a charitable add-on. India's RPwD Act already contains similar principles on paper. What's missing is the connection between those principles and the specific domain of data protection. Nobody's drawn the line between "digital services must be accessible" and "data protection mechanisms within those services must also be accessible." The line seems obvious once you state it, but it's not yet reflected in regulation or practice.

Some specific initiatives are worth watching. The Global Accessibility Awareness Day (GAAD) community has been pushing for privacy-related accessibility audits as part of standard accessibility testing. A few European organisations — particularly banks and government services — have begun testing their consent mechanisms with cognitively diverse users and redesigning based on the results. India hasn't joined this movement yet, but it could. The technology exists. The design frameworks exist. What's needed is the institutional will to implement them.

Coming Back to the Question

I started this piece by asking who we mean when we say "users" in privacy conversations. The honest answer, right now, is that we mean people like us — people who can read, reason about abstractions, move through menus, and make decisions based on text. We build systems that work for our minds and assume they work for everyone's.

They don't. A privacy right that requires you to read and understand a 5,000-word legal document isn't a right for someone with an intellectual disability — it's a formality. A consent button that leads to outcomes the person tapping it can't predict isn't consent — it's ceremony. A data protection law that doesn't require accessible communication isn't protecting everyone — it's protecting some people and leaving others exposed.

On World Down Syndrome Day, that gap between intention and reality feels especially sharp. The people who experience it most acutely are also the people with the least power to demand change. They didn't design these systems. They didn't write these laws. They didn't choose to live in a world that's moving online at a pace that outstrips anyone's ability to keep up, let alone the ability of someone whose mind processes information differently.

Privacy, if it means anything, means everyone gets to control their personal information. Not just people who read fast. Not just people who understand legal language. Not just people whose brains work the way a software designer assumed they would. Everyone. We're nowhere close to that, and recognising the distance is, I think, the first step toward closing it. The question I keep returning to — and the one I started with — is who we're building for. Until the answer genuinely includes people with intellectual disabilities, we haven't built a privacy framework. We've built a privacy framework for some people. And "for some people" is a qualifier that should make anyone uncomfortable.

Written by

Sneha Reddy

Digital Rights Advocate

Sneha Reddy is a digital rights advocate focused on internet freedom and surveillance in India. She works at the intersection of technology and policy, helping citizens understand their digital rights under Indian law.
