Data Protection

How Indian Healthcare Apps Handle Your Medical Records

Practo knows your prescriptions. 1mg knows your lab results. PharmEasy knows what you're treating. Apollo 247 has your vitals. And the Ayushman Bharat Health ID might soon tie it all together. Who else is looking at your medical records?

Rajesh Kumar · 12 min read

— because the thing that nobody seems to want to say plainly is that Indian healthcare apps treat your medical data with roughly the same care a street vendor treats a paper napkin. Not all of them, sure. Some are better than others. But the baseline is disturbingly low for an industry that's handling prescriptions, lab results, mental health consultations, and HIV test reports.

What Healthcare Apps Actually Collect

Think about what a healthcare app actually knows. Not in the abstract. Specifically. Practo, if you've used it for a few years, probably has a record of every doctor you've visited, every symptom you searched for, every prescription uploaded to your profile, and every appointment you booked. If you used the teleconsultation feature, there's a record of those conversations too. 1mg has your medicine orders — which means they know what conditions you're being treated for, how long you've been on a medication, and when your refills are due. PharmEasy holds similar data. MFine, before its pivot, had video consultation recordings. Apollo 247 tracks vitals if you've used their health monitoring features, and stores lab reports you upload for doctor consultations.

Collectively, these platforms hold a portrait of your physical and mental health that's more detailed than what your family doctor might have. And unlike your family doctor, who's bound by medical ethics codes and has a face you can look at across a desk, these are tech companies. Their primary obligation is to shareholders, not patients. That's not cynicism. It's corporate law.

What do they collect? The list is long. Start with the obvious stuff: your name, phone number, email, date of birth, and address. That's standard for any app. Then you get into the medical layer. Prescriptions — either typed in during a consultation or photographed and uploaded. Lab reports and diagnostic images. Consultation notes from doctors. Responses to symptom checkers and self-assessment quizzes. Medication reminders and adherence data (did you take your pill today? the app knows if you tapped "yes" or "no"). Some apps pull in data from fitness trackers and wearables — heart rate, sleep patterns, step counts, blood oxygen levels. If you've linked your insurance, the app has your policy number, claim history, and potentially your insurer's assessment of your health risk profile.

Then there's the stuff you didn't actively provide. Location data — most of these apps request GPS access, ostensibly for finding nearby pharmacies or clinics. Device information. Browsing behavior within the app (which conditions did you search for? which articles did you read? did you spend four minutes on the depression screening page?). Some apps track your activity across sessions in ways that build a behavioral health profile without you ever filling out a questionnaire.

Vague Privacy Policies and Data Sharing

I pulled up Practo's privacy policy last month. It's about 4,000 words. Somewhere around paragraph twelve, it mentions that Practo may share your data with "partners" for purposes of "improving services" and "relevant offerings." That language is vague enough to cover almost anything. It could mean sharing anonymized aggregate statistics with a research institution. It could also mean handing your prescription history to a pharmaceutical company's marketing division so they can target you with ads for competitor drugs. The policy doesn't distinguish between these scenarios in any meaningful way.

1mg (now Tata 1mg, after Tata Digital's acquisition) has a more detailed policy but with similar ambiguities. They state that personal health information "may be shared with authorized third parties" including "health insurance companies, pharmaceutical companies, and other entities." That "other entities" is doing a lot of work. When a privacy policy uses catch-all phrases like that, it's generally because the company wants maximum flexibility in how they monetize or distribute your data, and they don't want to update the policy every time they sign a new data-sharing partnership.

PharmEasy's approach is similar. A 2025 investigation by an Indian digital rights organization found that the PharmEasy Android app was communicating with at least 14 third-party tracking domains, including advertising and analytics services owned by Google and Facebook. That means your medicine purchase data — or at least metadata about your app usage patterns — was being shared with advertising networks. The company didn't dispute the technical finding but argued that "no personally identifiable health information" was shared. Whether that's reassuring depends on how much you trust the distinction between "personally identifiable" and "effectively identifiable when combined with other data points."

Apollo 247 stands out slightly because Apollo Hospitals is a traditional healthcare provider that expanded into digital, rather than a tech startup that moved into healthcare. Their data practices might benefit from institutional familiarity with medical ethics. Or they might not — institutional size doesn't guarantee good privacy practices, as anyone who's dealt with a hospital's billing department can attest. Apollo's privacy policy does mention compliance with the Information Technology Act and "applicable data protection laws," which presumably now includes the DPDPA. But the specifics of how consultation recordings, vitals data, and lab reports are stored, encrypted, and retained aren't detailed in the user-facing policy.

Data Sharing with Insurance Companies

The question of data sharing with insurers deserves its own examination because the incentives there are genuinely alarming. Health insurance companies want to assess risk. Your medical history is the single most valuable input for that assessment. If an insurer knows you were treated for hypertension, that you searched for diabetes symptoms on a health app, that you ordered cholesterol medication last month, and that your step count has been declining — that's a risk profile that could affect your premiums or your ability to get coverage at all.

Is this happening? Almost certainly, to some degree. In mid-2025, a report by the Internet Freedom Foundation documented cases where Indian health insurance companies appeared to have access to applicants' pharmacy purchase histories during the underwriting process. The mechanism wasn't entirely clear — it could have been direct data sharing from pharmacy apps, or it could have been through data brokers who aggregate health-adjacent purchase data. Either way, the result is the same: your private health decisions influencing your insurance terms without your explicit knowledge or consent.

The DPDPA should theoretically prevent this kind of opaque data sharing. Consent under the Act must be "free, specific, informed, and unconditional" (Section 6). Sharing prescription data with an insurance company requires separate, purpose-specific consent — you can't bury it in a 4,000-word privacy policy that nobody reads and call that "informed." But enforcement has been slow. The Data Protection Board of India is still building its operational capacity, and healthcare data practices haven't been a priority enforcement area so far. That might change, but as of early 2026, the gap between what the law says and what companies actually do remains wide.

Ayushman Bharat Health ID and Centralized Records

Then there's the government's piece: the Ayushman Bharat Digital Mission and the ABHA (Ayushman Bharat Health Account) ID. The vision is ambitious. Every Indian citizen gets a unique 14-digit health ID. Medical records from any provider — hospitals, clinics, labs, pharmacies — are linked to this ID and stored in a federated system. You control who sees what through a consent manager. Your records follow you from doctor to doctor, city to city, without needing to carry paper files or repeat tests.

On paper, it's a genuine improvement over the fragmented, paper-heavy system India has now. In practice, the privacy implications make me nervous. A centralized (or even federated) health data system is a massive target. If compromised, the breach wouldn't expose a single app's data — it could expose the medical histories of hundreds of millions of people. The Health Data Management Policy, published by the National Health Authority in 2022, includes provisions for encryption, consent-based access, and data minimization. But the policy is a guideline, not a law. Compliance for private health information providers participating in the ABDM ecosystem is technically mandatory, but auditing and enforcement mechanisms are still being developed.

The consent manager is the centerpiece of the ABDM's privacy architecture. When a doctor or hospital requests access to your records through the ABDM system, the request goes through a Health Information Exchange and Consent Manager (HIE-CM). You're supposed to receive a notification, review what's being requested, and grant or deny access. In theory, this is patient-controlled data sharing at its best. In practice, consent fatigue is a real concern. If you're visiting a new doctor and they need your records, you'll probably tap "approve" without reading the details — the same way you tap "accept" on cookie notices. The system also assumes consistent smartphone access and digital literacy, which is far from universal in India. For the hundreds of millions of Indians who access healthcare through public hospitals and primary health centres, the digital consent flow may be handled by a hospital clerk, not by the patient.
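The HIE-CM flow described above can be sketched as a small data model: a time-bound, purpose-specific consent artefact that gates every record request. This is a hypothetical illustration of the flow's shape only; the real ABDM APIs, field names, and consent artefact format are different.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of an HIE-CM-style consent check. All names and
# record categories here are illustrative, not the real ABDM schema.

@dataclass
class ConsentRequest:
    requester: str            # e.g. a hospital's health facility ID
    purpose: str              # stated purpose of access
    record_types: frozenset   # which record categories are requested
    expires: date             # consent artefacts are time-bound

class ConsentManager:
    def __init__(self):
        self._grants = []

    def grant(self, request):
        # In the real flow this happens only after the patient approves
        # a notification describing requester, purpose, and scope.
        self._grants.append(request)

    def may_access(self, requester, record_type, on):
        # Access is allowed only under a matching, unexpired grant.
        return any(
            g.requester == requester
            and record_type in g.record_types
            and on <= g.expires
            for g in self._grants
        )

cm = ConsentManager()
cm.grant(ConsentRequest("hospital-A", "follow-up consultation",
                        frozenset({"lab_reports"}), date(2026, 6, 30)))

print(cm.may_access("hospital-A", "lab_reports", date(2026, 3, 1)))    # True
print(cm.may_access("hospital-A", "prescriptions", date(2026, 3, 1)))  # False: out of scope
print(cm.may_access("hospital-A", "lab_reports", date(2026, 7, 1)))    # False: expired
```

The point of the sketch is the default-deny shape: without a specific, unexpired grant covering that record type, the request fails. Consent fatigue undermines exactly this, because a reflexive "approve" tap turns the scoped check into a rubber stamp.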

There's a concern I haven't seen discussed much: what happens to your ABDM health data if you die? Medical records don't stop being sensitive when someone passes away. Can family members access the deceased's health ID records? Can insurers? Can researchers? The Health Data Management Policy is surprisingly quiet on this. It's the kind of edge case that matters enormously to real people and gets almost no attention in policy discussions.

Mental Health Data and Special Risks

Mental health data deserves a separate callout because of the particular stigma it carries in Indian society. Several of these apps offer teleconsultation for psychiatric and psychological services. Practo lists psychiatrists and psychologists alongside general physicians. MFine offered mental health consultations before scaling back. Newer platforms like Amaha (formerly InnerHour) and MindPeers focus specifically on mental health. The data generated in these consultations — session notes, screening questionnaire results, diagnoses, medication details — is extraordinarily sensitive. In India, where mental health conditions still carry significant social stigma, a leak of therapy records could damage someone's marriage prospects, employment opportunities, and family relationships.

Yet the apps handling this data don't always treat it with heightened protection. I've seen mental health platforms that store session notes in the same database and with the same access controls as general appointment records. There's no segregation. No additional encryption layer. No restricted access policy limiting which employees can view psychiatric versus general medical data. The DPDPA classifies health data as personal data but doesn't create a special sub-category for mental health data with enhanced protections, which is a gap the law probably should have addressed.

Pharma Marketing and Wearable Data

Pharma marketing represents yet another pipeline for healthcare data leakage. India's pharmaceutical market was worth over $50 billion in 2025. Drug companies compete fiercely for prescribers' attention and patients' awareness. If a pharma company can identify patients who've been prescribed a competitor's drug — through data obtained from an app or a data broker — they can target those patients with information about their own alternative. This isn't speculation. It's an established marketing practice in the US healthcare system, and there's growing evidence that similar practices are emerging in India. The mechanism might look something like this: a pharmacy app shares aggregated but insufficiently anonymized purchase data with a third-party analytics firm, which cross-references it with demographic data purchased from another broker, and the resulting patient profiles end up in a pharma company's marketing database. No single step in that chain might violate the DPDPA in an obvious way. But the end result is that your private prescription choices are influencing targeted advertising you didn't consent to.
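The cross-referencing step above is mechanically trivial, which is worth seeing. This toy example (with made-up data and a hypothetical broker dataset) shows how "anonymized" purchase records that keep quasi-identifiers like pincode, age, and gender re-link to names the moment a second dataset shares those fields:

```python
# Toy re-identification sketch with invented data. The pharmacy export
# has no names, yet joining on quasi-identifiers restores them.

purchases = [
    {"pincode": "560001", "age": 34, "gender": "F", "drug": "metformin"},
    {"pincode": "560001", "age": 51, "gender": "M", "drug": "atorvastatin"},
]

# Demographic data from a hypothetical broker: names attached to the
# same quasi-identifier fields.
broker = [
    {"name": "Asha",   "pincode": "560001", "age": 34, "gender": "F"},
    {"name": "Vikram", "pincode": "560001", "age": 51, "gender": "M"},
]

def reidentify(purchases, broker):
    """Join the two datasets on (pincode, age, gender)."""
    index = {(p["pincode"], p["age"], p["gender"]): p["name"] for p in broker}
    matches = []
    for rec in purchases:
        key = (rec["pincode"], rec["age"], rec["gender"])
        if key in index:
            matches.append((index[key], rec["drug"]))
    return matches

print(reidentify(purchases, broker))
# → [('Asha', 'metformin'), ('Vikram', 'atorvastatin')]
```

When a quasi-identifier combination is unique in both datasets, every "anonymous" record re-links to a name, which is why "no personally identifiable health information" is such a thin assurance.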

The emerging wearable ecosystem adds another dimension. With Apple Health, Google Fit, and dedicated devices like Fitbit, Samsung Galaxy Watch, and dozens of Indian wearable brands pushing continuous health monitoring, the volume of real-time health data being generated is exploding. Some of this data feeds into healthcare apps — when you share your Apple Health data with Practo, for instance, or when Apollo 247 pulls step count and heart rate data from your fitness tracker. The consent model for this data sharing is usually a single toggle: on or off. There's no granularity. You can't say "share my step count but not my heart rate" or "share last week's data but not today's." And once the healthcare app has that data, it's subject to the app's privacy policy, not the wearable's. The original device manufacturer's protections no longer apply.
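The missing granularity is not hard to build. Here is a minimal sketch, with hypothetical field names, of what per-metric, time-scoped consent could look like: the app filters readings against the user's choices before anything leaves the device, instead of honoring one all-or-nothing toggle.

```python
# Hypothetical sketch of granular wearable-data consent: filter per
# metric and per date before sharing, rather than a single on/off toggle.

from datetime import date

consent = {
    "steps": True,        # share step count
    "heart_rate": False,  # but not heart rate
}
share_since = date(2026, 1, 8)  # only share readings from this date onward

readings = [
    {"metric": "steps",      "day": date(2026, 1, 10), "value": 8200},
    {"metric": "heart_rate", "day": date(2026, 1, 10), "value": 71},
    {"metric": "steps",      "day": date(2026, 1, 1),  "value": 9100},
]

def filter_for_sharing(readings, consent, since):
    # Default-deny: metrics absent from the consent map are never shared.
    return [r for r in readings
            if consent.get(r["metric"], False) and r["day"] >= since]

for r in filter_for_sharing(readings, consent, share_since):
    print(r["metric"], r["value"])
# → steps 8200
```

Only the in-scope steps reading survives the filter; the heart-rate reading and the pre-cutoff data never leave. Nothing about current app architectures prevents this, which makes the single toggle a product decision, not a technical constraint.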

Weak Security and Insider Access

Security is the other half of this equation, and it's where the picture gets particularly grim for Indian health-tech startups. A penetration testing firm based in Bengaluru published results from audits of 22 Indian health-tech applications conducted between January and September 2025. The findings: nearly 40% did not encrypt medical records at rest (meaning the data sits unencrypted on servers). Over half had API endpoints that were accessible without proper authentication — meaning an attacker who found the right URL could potentially pull medical records without logging in. Several apps transmitted sensitive data over unencrypted HTTP connections during at least some user flows. One app stored complete prescription images in a publicly accessible cloud storage bucket that was indexed by search engines.
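The unauthenticated-endpoint finding is worth making concrete. The missing control is small: verify a signed token before serving a record, and verify the token's owner matches the record's owner so one patient can't enumerate another's data. This is a simplified sketch with hypothetical identifiers, not any real app's API:

```python
# Minimal sketch of the control the audited endpoints lacked: a signed
# token checked before any record is returned. SECRET, IDs, and the
# token format are illustrative only.

import hashlib
import hmac

SECRET = b"server-side-secret"  # illustrative; real keys live in a KMS

def make_token(user_id):
    # Sign the user ID so the server can verify it later.
    sig = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return user_id + ":" + sig

def get_record(record_owner, token):
    """Return a record only for a valid token belonging to its owner."""
    if token is None:
        return 401, "missing token"          # the flaw: skipping this check
    user_id, _, sig = token.partition(":")
    expected = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return 401, "bad token"
    if user_id != record_owner:
        return 403, "not your record"        # blocks cross-patient enumeration
    return 200, "prescription data for " + user_id

print(get_record("patient-7", None)[0])                     # 401
print(get_record("patient-7", make_token("patient-9"))[0])  # 403
print(get_record("patient-7", make_token("patient-7"))[0])  # 200
```

An endpoint that skips the first check is the "right URL pulls records without logging in" scenario; one that skips the ownership check is the quieter bug where any logged-in user can walk through everyone else's record IDs.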

These aren't theoretical vulnerabilities. They're the kind of gaps that script kiddies can exploit, let alone motivated attackers or state actors. And unlike a leaked password or a compromised credit card number, a leaked medical record can't be changed. Your diagnosis history is permanent. Your genetic test results are permanent. Your mental health treatment records are permanent. Once that data is out, it's out forever, and it can follow you through job applications, insurance underwriting, marriage prospects (in Indian social contexts where health conditions carry stigma), and more.

Employee access controls are another weak point that rarely gets mentioned. In most Indian health-tech companies, customer support staff need access to user accounts to resolve complaints. That means a 22-year-old support executive making Rs 25,000 a month can pull up your complete medical history, your prescriptions, your consultation notes. How many of those employees have been background checked? How many have signed NDAs that would actually hold up in court? How many would notice if a colleague was accessing records they had no business viewing? Most health-tech companies don't have the internal monitoring tools to detect unauthorized access by their own staff. A rogue employee — or one who's been bribed or coerced — could exfiltrate thousands of patient records without triggering any alarm. The healthcare industry in the US learned this lesson years ago, which is why HIPAA mandates detailed access logs and automatic flagging of suspicious access patterns. India has no equivalent requirement for healthcare data access monitoring.
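The kind of monitoring HIPAA-style rules mandate doesn't require sophisticated tooling. A first-pass version is just counting distinct patient records per employee and flagging outliers against the team baseline. The log format, employee IDs, and threshold below are hypothetical:

```python
# Sketch of insider-access monitoring: flag support staff who touch far
# more distinct patient records than their peers. Data is invented.

from collections import defaultdict

access_log = [
    ("emp_01", "patient_a"), ("emp_01", "patient_b"),
    ("emp_02", "patient_c"),
    ("emp_03", "patient_d"), ("emp_03", "patient_e"),
    ("emp_03", "patient_f"), ("emp_03", "patient_g"),
    ("emp_03", "patient_h"), ("emp_03", "patient_i"),
    ("emp_03", "patient_j"),
]

def flag_outliers(log, multiplier=2.0):
    """Flag employees whose distinct-record count exceeds
    `multiplier` times the team mean."""
    per_emp = defaultdict(set)
    for emp, patient in log:
        per_emp[emp].add(patient)
    counts = {e: len(p) for e, p in per_emp.items()}
    mean = sum(counts.values()) / len(counts)
    return sorted(e for e, c in counts.items() if c > multiplier * mean)

print(flag_outliers(access_log))
# → ['emp_03']
```

A real system would segment by role and shift and feed flags to a human reviewer, but even this crude baseline would catch the bulk-exfiltration case described above, which today would trigger no alarm at all.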

Data localization is an unresolved question too. Some Indian healthcare apps store data on servers outside India — AWS regions in Singapore or Ireland, for instance. The DPDPA allows cross-border data transfer except to countries specifically blacklisted by the government, but the blacklist hasn't been published yet as of early 2026. In the interim, your lab reports could be sitting on a server in Frankfurt governed by EU data protection law, or in Singapore governed by Singapore's PDPA. That's not necessarily worse than local storage — in some cases, international cloud providers have better security practices than Indian hosting companies. But it adds complexity. If there's a breach, which country's regulators have jurisdiction? If you file a deletion request under the DPDPA, does it apply to data stored on foreign servers? These questions don't have settled answers yet.

Telemedicine adds its own wrinkle. When you have a video consultation through Practo or Apollo 247, that call is either recorded or it isn't. Most platforms don't clearly state which. If it is recorded, where's the recording stored? For how long? Who can access it? If it isn't recorded, is there a transcript? The Telemedicine Practice Guidelines issued by the Board of Governors of the Medical Council of India in 2020 say that teleconsultations should be documented, but they don't specify the format, storage requirements, or retention period in meaningful detail. A video recording of a medical consultation is arguably among the most sensitive data types imaginable. It contains your face, your voice, your symptoms described in your own words, and a doctor's assessment. If that recording leaks, you can't un-ring the bell.

I wanted to end this with a neat conclusion, some kind of reassuring call to action or optimistic outlook. But the honest truth is that Indian healthcare data protection is a mess. The apps aren't transparent enough. The consent mechanisms are too weak. The security standards are too low. The government's ABDM framework is promising but underbaked on enforcement. And the DPDPA, while a step forward, hasn't yet produced the kind of regulatory pressure that would force the health-tech industry to change its behavior.

Your medical records are sitting in databases you can't inspect, governed by privacy policies you didn't read, protected by security measures you can't verify, and potentially shared with companies you've never heard of. That's where things stand.


Written by

Rajesh Kumar

Founder & Chief Editor

Rajesh Kumar is a cybersecurity expert with over 12 years of experience in digital privacy and data protection. He has worked with CERT-In and various Indian enterprises to strengthen their data security practices. He founded PrivacyTechIndia to make privacy awareness accessible to every Indian.
