The Tricks They Pull On You Every Day

That app you just downloaded? It is designed to trick you.

Not in some vague, conspiracy-theory way. In a specific, tested, measured, and intentionally deployed way. Teams of designers, product managers, and growth hackers sit in rooms and discuss how to make you hand over more data, spend more money, and agree to things you would never agree to if someone asked you plainly in conversation. The buttons are the wrong colour on purpose. The text is small on purpose. The “no” option is hidden on purpose. None of this is an accident. None of it is bad design by overworked junior developers. It is manipulation — and it has a name.

They are called dark patterns.

The term was coined by a UX designer named Harry Brignull back in 2010. He started cataloguing the tricks that websites and apps use to get people to do things they did not intend to do. He gave them names. The names stuck. And the tricks got worse, because now there was an entire industry of people who knew how to deploy them at scale.

Open your phone right now. Open any popular Indian app. Within the first three screens of signing up or using it, you will run into at least one of the following patterns. I am going to go through them one by one because I think most people experience these tricks every day without realising they are being manipulated, and once you see the pattern, you cannot unsee it.

Pre-checked consent boxes. You download a shopping app. You sign up with your email and phone number. At the bottom of the registration screen, below the “Create Account” button, in grey text on a white background, there is a checkbox. It is already ticked. It says something like “I agree to receive promotional offers via SMS, email, and WhatsApp from [Company] and its trusted partners.” You did not check that box. The app checked it for you. You would have to scroll down, notice it, and actively uncheck it to stop it from applying. Most people never see it. The box is designed to be invisible. The text is designed to be boring. The tick is designed to look like it belongs there. And just like that, without making a single active choice, you have given a company permission to flood your inbox and your WhatsApp with ads for the next three years. This is manipulation.
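The mechanics are simple enough to sketch in a few lines of Python. Everything here is hypothetical — the field names and defaults are invented for illustration — but it shows why a pre-checked box counts as "consent" without any action from you: whatever the user never touches keeps the company-friendly default.

```python
# Hypothetical sign-up form defaults; every name here is illustrative
signup_defaults = {
    "email": "",
    "phone": "",
    "marketing_consent": True,  # pre-checked: "agreement" without any action
}

def submit(user_input: dict) -> dict:
    # Fields the user never touched silently keep their defaults
    return {**signup_defaults, **user_input}

# A user who fills only the visible fields has "consented" to marketing
result = submit({"email": "a@example.com", "phone": "9800000000"})
print(result["marketing_consent"])  # True
```

The design insight being exploited is that defaults are sticky: the company chooses the default, and inattention does the rest.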

Unnecessary permission demands. A bill payment app asks for access to your contacts, your camera, your location, and your storage before it will let you pay your electricity bill. None of those have anything to do with paying a bill. Paying a bill requires an internet connection and your payment details. That is it. But the app shows a full-screen message saying it needs all these permissions “for the best experience” or “to function properly.” If you deny even one permission, the app refuses to open or shows a persistent nag screen every time you use it. Some lending apps in India go even further. They demand access to your SMS inbox, your call logs, your photo gallery, and your complete contact list. They call it “credit assessment.” Security researchers have found apps in this category uploading entire contact lists to remote servers. If you default on a loan from one of these apps, they have been known to message your contacts to shame you into paying. This has been documented in multiple news reports and RBI complaints. This is not credit assessment. This is coercion disguised as a feature.

A bill payment app does not need your contacts. A news app does not need your camera. A shopping app does not need your location tracked around the clock. If an app refuses to work without permissions that have nothing to do with its stated purpose, that tells you what the company actually wants from you. It is not your business. It is your data.

Confirm shaming. This one makes my blood boil. A pop-up appears asking if you want to enable notifications. The “yes” button is large, green, friendly, and says “Yes, I want great deals!” The “no” option is tiny grey text that reads: “No thanks, I prefer paying full price.” Read that again. The company is framing the act of declining spam notifications as a personal failing. You are not just saying no. You are admitting that you prefer to waste money. Other examples I have personally seen on Indian apps: “No, I do not care about my privacy” when declining data collection. “No, I am okay with a worse experience” when refusing location tracking. “No, I will miss out on savings” when trying to close a promotional banner. The wording is calculated to trigger a tiny pang of guilt or stupidity. Just enough to make you hesitate. Just enough to make you tap “yes” against your own interest. This is manipulation.

Subscription traps and forced continuity. “Try Premium free for 7 days!” The button is bright and inviting. You tap it. You enter your card details or your UPI ID. No confirmation that you are signing up for a paid subscription after the trial. No reminder on day six. No notification on day seven. On day eight, 499 rupees disappears from your account. You go looking for the cancel button. It is not on the main screen. It is not in Settings. It is hidden under Account, then Subscription, then Manage, then a tiny “Cancel” link at the bottom of a page full of text about everything you will lose if you cancel. You tap Cancel. “Are you sure?” Yes, I am sure. “We can offer you 50% off for the next month.” No. “Please tell us why you are leaving” with a mandatory survey. No, just cancel. “Your cancellation will take effect at the end of the billing cycle.” Signing up took one tap and five seconds. Cancelling took four screens, two offers, one survey, and about three minutes of your time — if you even found the cancel button at all. Indian streaming platforms, fitness apps, language-learning apps, and ed-tech platforms all do this. This is manipulation.

Basket sneaking. You add a flight ticket to your cart on a travel booking site. When you reach the payment page, there is an extra line item: travel insurance for 349 rupees. You did not add it. The site added it for you. The option to remove it is a small grey checkbox that says “I do not want travel protection,” left unchecked by default so the insurance stays in your total unless you find it and tick it. Some booking sites add meal plans, priority boarding, lounge access, or donation add-ons the same way. The total on the payment page is always higher than the price you saw on the search results page. You have to actively go hunting for the things that were sneaked into your basket and remove them one by one. This is manipulation.

Fake urgency and fake scarcity. “Only 2 left in stock!” “47 people are looking at this right now!” “This deal expires in 04:59!” with a countdown timer ticking on screen. These numbers are often fabricated. The two items left in stock have been two items left in stock for the past three weeks. The 47 people looking at the product is a randomly generated number that changes each time you refresh the page. The countdown timer resets to five minutes every time you reload. The purpose is to make you feel like you have to buy now or you will miss out. The urgency is artificial. The scarcity is fake. But the feeling of panic it creates in you is real, and that is what they are counting on.

Bundled consent. An app shows you one screen with one “I Agree” button. Behind that single button sits the terms of service, the privacy policy, consent to data sharing with third-party advertisers, marketing consent, and analytics tracking. All of it bundled into one agreement. You cannot accept the terms you need (to use the app) without also agreeing to the data sharing you do not want. The choice presented to you is: accept everything, or do not use the app. There is no middle ground. No granular control. No ability to say “yes to the service, no to the ads.” This is manipulation.

Confusing double-negative language. “Opt out of not receiving promotional messages.” Stop and parse that sentence. Which action results in you getting spam? Which action stops it? The answer is not immediately obvious, and that confusion is the point. If you are not sure what you are agreeing to, you are more likely to leave the default setting in place and move on. The default setting always favours the company.
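The double negative can be worked through mechanically. Modelling the checkbox as a boolean (a hypothetical model, not any real app's code) shows that ticking the box is what signs you up for the messages:

```python
def receives_promos(box_ticked: bool) -> bool:
    """Checkbox label: 'Opt out of not receiving promotional messages'."""
    not_receiving = not box_ticked  # ticking opts you OUT of the spam-free state
    return not not_receiving        # the two negations cancel: ticked == spammed

print(receives_promos(True))   # True  -- ticking the box gets you the spam
print(receives_promos(False))  # False -- leaving it alone keeps you opted out
```

Few users will do this parsing on a sign-up screen, which is exactly what the wording relies on.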

Nagging. You decline a permission or a notification request. The app asks again. And again. Every time you open it, a pop-up appears asking for the same permission you already denied. After the fifth or sixth time, you give in just to make it stop. This is a war of attrition, and the app has more patience than you do.

The pattern behind all patterns: every time an app makes the privacy-invasive choice easy, prominent, and rewarding while making the privacy-preserving choice hidden, confusingly worded, or guilt-inducing, that is a dark pattern. It is designed. It is tested. It is deployed on purpose. The company profits from your confusion and your hurry.

Indian Law Is Catching Up (Slowly)

India now has laws that specifically target dark patterns. They exist on paper. Enforcement is another matter, but the legal framework is real and it is getting stronger.

In November 2023, the Central Consumer Protection Authority (CCPA) under the Ministry of Consumer Affairs published the “Guidelines for Prevention and Regulation of Dark Patterns.” These guidelines define dark patterns as “any practice or deceptive design pattern using UI/UX interactions on any platform, designed to mislead or trick users to do something they originally did not intend or want to do, by subverting or impairing the consumer autonomy, decision making or choice.” That definition is broad on purpose. It is meant to catch anything a company does to manipulate user behaviour through design.

The CCPA guidelines list thirteen specific types of dark patterns by name. These are:

1. False urgency: creating a fake sense of time pressure to push a purchase or decision. Countdown timers that reset, “limited time” offers that never actually expire.
2. Basket sneaking: adding products, services, or charges to your cart without your consent during the checkout process.
3. Confirm shaming: using guilt, fear, or shame to steer you toward a particular choice. The “No, I prefer paying full price” buttons.
4. Forced action: forcing you to take an action you do not want to take in order to access a feature you do want. Buy a subscription to cancel a subscription. Share your contacts to use a calculator.
5. Subscription trap: making it easy to subscribe and difficult to cancel. One-tap sign-up, twenty-step cancellation.
6. Interface interference: manipulating the user interface to highlight certain actions and obscure others. Making the “Accept All” button green and large while the “Manage Preferences” link is grey and tiny.
7. Bait and switch: advertising one thing and delivering another. A product shown at one price on the listing page appears at a higher price on the checkout page.
8. Drip pricing / hidden costs: showing a low price initially and then adding fees, taxes, and charges at the final payment step so the total is much higher than expected.
9. Disguised advertisement: designing ads to look like regular content or search results so users click on them thinking they are organic results.
10. Nagging: repeatedly asking users to do something, like enabling notifications or granting a permission, even after they have declined.
11. Trick question: using confusing, double-negative, or ambiguous language in consent requests so users make choices they did not intend.
12. SaaS billing: generating recurring charges for a service the user signed up for once and may no longer use, with no easy cancellation option.
13. Rogue malware: using scare tactics, like fake virus warnings, to trick users into installing apps or paying for unnecessary services.

Every one of these thirteen patterns is used by Indian apps right now. Not small, obscure apps. Major platforms with tens of millions of users. Travel booking sites that sneak insurance into your cart. E-commerce apps that show fake countdown timers. Ed-tech apps that trap students in auto-renewing subscriptions. Lending apps that force access to your entire phone. The CCPA guidelines apply to “all platforms, including advertisers and sellers, systematically offering goods or services in India.” Violations are punishable under the Consumer Protection Act, 2019, which allows for fines, product recalls, and cease-and-desist orders.

Then there is the Digital Personal Data Protection Act, 2023 (DPDP Act). The DPDP Act says that consent for processing personal data must be “free, specific, informed, unconditional, and unambiguous.” Take those five words and hold them up against what apps actually do. Pre-checked consent boxes? That is not “free” consent, because the choice was made for you. Bundled consent where one button covers everything? That is not “specific,” because you could not agree to one thing without agreeing to twenty other things. Double-negative language in consent forms? That is not “informed,” because a reasonable person cannot understand what they are agreeing to. A sign-up that takes thirty seconds and a deletion process that takes an email, a phone call, and a fourteen-day “cooling off period”? The DPDP Act says withdrawing consent must be as easy as giving it. If it is not, the company is violating the law.

The enforcement gap is still wide, though. The EU has fined Amazon 746 million euros for GDPR consent violations. The US Federal Trade Commission ordered Epic Games to pay 245 million dollars in refunds for dark patterns that tricked players into unintended purchases, part of a larger 520-million-dollar settlement that also covered children's privacy violations. In India, the Data Protection Board is operational but is still working through its early caseload. Consumer forums have started hearing complaints about deceptive design. Media coverage is increasing. The National Consumer Helpline has a functioning complaint portal. But we have not yet seen the kind of fine that would make a major Indian tech company rethink its entire onboarding flow overnight.

That will change. It always changes eventually. But it changes faster when people actually file complaints. Every complaint on record adds to the pressure. Ten complaints about the same app and the same pattern, filed from ten different cities, create a paper trail that regulators can use. The law is there. The rules are there. What is needed now is people who are angry enough to use them.

Fight Back: Practical Steps Right Now

You cannot avoid dark patterns entirely because they are in too many apps. But you can make them fail on you more often than they succeed. And you can make the companies that use them feel the consequences.

Slow down during sign-up. Speed is the weapon dark patterns depend on. The pre-checked box works because you tapped “Next” without scrolling down. The bundled consent works because you hit “I Agree” without reading what was behind the button. The free trial trap works because you entered your card details without checking the cancellation policy. All of these patterns fail if you take thirty extra seconds. Thirty seconds. Scroll to the bottom of every registration screen. Read the checkboxes. Look for pre-checked options and uncheck them. Read what the “I Agree” button is actually agreeing to. Look for a “Manage Preferences” link, even if it is small and grey and placed where you would not normally look. Thirty seconds of attention defeats most dark patterns, because they are designed for people who are not paying attention.

Find the hidden option. There is almost always a way to decline, skip, or manage preferences. The app just does not want you to see it. On consent pop-ups, look for a tiny “x” in the corner, or a “Skip” link at the bottom. On permission requests, remember that on Android you can always deny and then grant permissions later if you actually need them. On cookie consent banners, look for “Manage Settings” or “Reject All” rather than just hitting “Accept All.” The harder a company makes it to find the opt-out, the more they benefit from you not finding it. That alone should motivate you to look harder.

Question every permission request. When an app asks for your contacts, camera, microphone, location, or storage, pause and ask: does this app need this to do the thing I installed it for? A maps app needs your location. A photo editing app needs your camera and storage. A calculator needs nothing. A bill payment app needs your internet connection and your payment details and that is it. If the app demands permissions that make no sense for its stated purpose, deny them. If the app breaks after you deny a permission it should not need, that tells you something about the app that no rating or review ever will.

Set a cancellation reminder the instant you start a free trial. The moment you sign up for any free trial, open your phone calendar and create a reminder for one day before the trial expires. Do it right then. Do not tell yourself you will remember. You will not. The entire business model of free trials relies on the fact that people forget. Better yet: if the subscription is billed through Google Play, you can cancel it immediately after signing up. You still keep the trial period, but the auto-renewal gets disabled. Go to Play Store > Payments & subscriptions > Subscriptions right after signing up and cancel there. You get the free trial and you do not get charged.
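If you want the date arithmetic done for you, a few lines of Python compute the reminder date. The seven-day default is just an example; adjust it to whatever trial length you actually signed up for.

```python
from datetime import date, timedelta

def cancel_reminder(trial_start: date, trial_days: int = 7) -> date:
    # Remind yourself one day before the trial converts into a paid plan
    return trial_start + timedelta(days=trial_days - 1)

# A 7-day trial started on 1 March should be cancelled by 7 March
print(cancel_reminder(date(2024, 3, 1)))  # 2024-03-07
```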

Do a monthly permissions audit. Once a month, go to Settings > Privacy > Permission Manager on your Android phone (the exact menu path varies by manufacturer and Android version). Look at every app that has access to your location. Look at every app that can read your contacts. Check which apps have camera access, microphone access, storage access. You will find things you forgot about. Apps you used once six months ago still have access to your contacts. A game you downloaded for a flight still has access to your microphone. Revoke everything that does not make sense. It takes five minutes and you will be surprised every single time.

Report dark patterns when you see them. File a complaint with the National Consumer Helpline at 1800-11-4000. You can also file online through the NCH app or at consumerhelpline.gov.in. Name the app. Describe the dark pattern. Take a screenshot and attach it. Be specific: “The app pre-checked a box granting consent to marketing messages without my action” or “The cancellation process required six steps while sign-up required one step.” If you want to go public, leave a detailed app review on the Play Store or App Store that describes the deceptive practice. Companies track their review scores religiously. A drop of even 0.1 in average rating gets attention from product teams. Public visibility and consumer complaints together create pressure that internal ethics teams alone cannot.

If you are feeling particularly motivated, you can file a complaint under the Consumer Protection Act, 2019 through the e-Daakhil portal (edaakhil.nic.in), which is the online filing system for consumer commissions. You can file against any company whose dark patterns caused you financial loss or violated your right to informed consent. The fee is minimal and you do not need a lawyer for district-level commissions.

Tell Your Family About This

Everything I have described so far targets everyone who uses a smartphone. But some people are more vulnerable than others, and if you are reading an article about dark patterns on a privacy blog, you are probably not the most vulnerable person in your household.

Think about who else in your family uses a phone. Your parents. Your grandparents. Your aunt who just got her first smartphone last year. Your uncle who figured out WhatsApp but does not really understand what happens when he taps “Allow” on a permissions screen. Your ten-year-old cousin who downloads every free game in the Play Store without reading anything. These are the people dark patterns are built for. People who trust that what a screen shows them is honest. People who assume that if an app is asking for something, it must need it. People who do not know what a pre-checked box is or why it matters.

I watched my father sign up for a food delivery app last month. He hit “Allow” on every single permissions pop-up without reading any of them. When I asked him why, he said, “The app asked, so it must need it.” He granted location access, camera access, contact access, and notification access to an app whose only job is to deliver biryani. He did not know he could say no. He did not know what those permissions meant — and the app was designed for people exactly like him.

My mother got trapped in a subscription for a health tips app. She tapped “Try Free for 7 Days” and did not realise her credit card would be charged 299 rupees a month after the trial ended. She noticed the charge on her bank statement three months later. That is 897 rupees for an app she opened twice. When I tried to cancel it for her, I had to go through the app settings, then account, then subscription management, then answer three “are you sure?” prompts, decline a discount offer, and complete a mandatory feedback form. The whole process took me four minutes and I knew what I was doing. She would have given up after the first “are you sure” screen.

This is not their fault. They did not grow up with screens. They do not have the instinct to distrust an interface. When a screen says “Allow,” they assume it is asking permission out of courtesy, not out of greed. When a screen says “Free Trial,” they take it at face value. They do not know about the fine print, the auto-renewal, the hidden charges. And the companies that build these apps know this. They design for the least suspicious user, the one who will tap through everything quickly, the one who trusts what they see.

So if you have read this far and you understand what dark patterns are, you have a responsibility. Not a legal one. A human one. Sit down with the people in your family who are less familiar with how apps work. Show them, on their own phones, what a pre-checked consent box looks like. Show them the tiny grey “Skip” link at the bottom of a permissions screen. Show them how to find the “No” button that is deliberately hidden. Walk them through the Play Store subscription manager so they can see if anything is charging them without their knowledge. Help them set up their permission settings so apps cannot access their contacts and camera for no reason.

You do not need to turn this into a lecture. Just show them one thing. One dark pattern on one app they actually use. Once they see it once and understand that it was put there on purpose to trick them, they start noticing others on their own. The awareness spreads once it starts.

I showed my father the pre-checked marketing consent box on the food delivery app. I unchecked it in front of him and explained what it did. The next week, he called me and said, “That new banking app I downloaded had the same thing. I unchecked it.” One example was enough. He started looking.

Tell your family. Your parents, your grandparents, anyone in your house who uses a smartphone but did not grow up with one. They are the ones dark patterns target most effectively, and they will not learn about this from anywhere else unless you tell them.
