7 min read
Dark Patterns: How Tech Companies Trick You Into Giving Up Privacy
The manipulative design tricks that make it hard to protect your privacy and easy to surrender your data
Alex Rivera
UX Researcher & Advocate
# Dark Patterns: Manipulation in Design
## What Are Dark Patterns?
Dark patterns are user interface designs deliberately crafted to trick users into doing things they might not otherwise do—like sharing more data, staying on a platform longer, or making purchases.
## Common Dark Patterns
### 1. Privacy Zuckering
Named after Mark Zuckerberg, this involves tricking users into sharing more personal information than they intended.
**Example**: Facebook's privacy settings that require 15+ clicks to fully restrict sharing while defaulting to maximum sharing.
### 2. Forced Continuity
Making it very easy to sign up but very hard to cancel.
**Example**: Free trials that auto-convert to paid subscriptions, hidden cancellation processes requiring phone calls or multiple confirmation steps.
### 3. Confirmshaming
Using guilt or shame to manipulate users.
**Example**: "No thanks, I don't want to protect my privacy" as the option to decline.
### 4. Disguised Ads
Content that looks editorial but is actually advertising.
**Example**: "Sponsored" posts in social feeds designed to look like regular content.
### 5. Forced Action
Requiring users to do something unrelated to complete their desired action.
**Example**: Must enable location tracking to use an app that doesn't need location data.
### 6. Bait and Switch
The user sets out to do one thing, but something different (and usually worse for them) happens instead.
**Example**: Google's "Location History" toggle that doesn't actually stop all location tracking.
## Real-World Examples
### Google's Location Tracking
- "Location History" off didn't stop tracking
- Used "Web & App Activity" to continue surveillance
- Resulted in $391.5 million settlement
### LinkedIn's Dark Patterns
- Tricked users into spamming contacts
- Made it unclear what "Add Connections" would do
- Paid $13 million in 2015 to settle a class-action lawsuit over the practice
### Amazon Prime Cancellation
- Multiple confirmation screens
- Confusing language
- Designed to prevent cancellations
- Under FTC investigation
## Why They Work
### Psychological Exploitation
- Exploit cognitive biases such as default bias and loss aversion
- Take advantage of inattention and habituation to pop-ups
- Use social pressure ("9 other people are viewing this")
- Create artificial urgency with countdown timers and "limited stock" labels
### Information Asymmetry
- Companies know users don't read fine print
- Interface design guides behavior
- Options are framed to favor company interests
## Fighting Dark Patterns
### What You Can Do
- Read all options carefully
- Don't click through quickly
- Screenshot important settings
- Use browser extensions that highlight dark patterns
- Report deceptive practices to the FTC
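The list above mentions browser extensions that highlight dark patterns. As a minimal sketch of how such a tool might detect confirmshaming copy, the heuristic below matches guilt-laden decline text; the phrase patterns and the DOM-highlighting snippet are illustrative assumptions, not the rules of any real extension:

```javascript
// Heuristic check for confirmshaming: decline options phrased to induce
// guilt, e.g. "No thanks, I don't want to protect my privacy".
// These patterns are illustrative examples, not an exhaustive list.
const SHAME_PATTERNS = [
  /no thanks[,.]?\s+i (don't|do not) want/i,
  /i('|\u2019)d rather (pay full price|stay unprotected)/i,
  /i (don't|do not) care about (my )?(privacy|security|savings)/i,
];

// Returns true when the given button/link text looks like confirmshaming.
function looksLikeConfirmshaming(text) {
  return SHAME_PATTERNS.some((re) => re.test(text.trim()));
}

// In a userscript or extension content script, this could be run over
// candidate decline controls to visually flag matches:
// document.querySelectorAll("button, a").forEach((el) => {
//   if (looksLikeConfirmshaming(el.textContent)) {
//     el.style.outline = "3px solid red";
//   }
// });
```

Real detectors are harder than this (dark patterns live in layout, color, and flow as much as in wording), but text heuristics like these are one place such tools can start.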
### What Needs to Change
- Ban manipulative design practices
- Require plain language and clear choices
- Mandate easy opt-out processes
- Create liability for dark patterns
- Empower regulators to enforce rules
## The Bigger Picture
Dark patterns aren't just annoying—they're a symptom of surveillance capitalism. When profit depends on extracting maximum data and engagement, manipulative design becomes a competitive advantage.
Until we address the underlying business model, dark patterns will continue to proliferate.