
It’s 2025. Consumers are more privacy-aware, regulation is tighter, and conversations around ethical data practices are mainstream. By now, most people know they should read the fine print. They know that “enhancing your experience” often means “tracking your behavior.” They know they have rights under laws like the GDPR and CCPA.
And yet, every day, users still fall into the same traps.
These are not design bugs. They’re features: strategic design decisions known as dark patterns.
Despite regulatory pressure and growing public awareness, dark patterns continue to shape consent flows across the internet. Together, let’s break down why they persist in 2025, how they work, and what businesses can do instead.
Dark patterns are design choices that intentionally mislead, pressure, or manipulate users into taking actions they might not otherwise choose—like accepting all cookies, sharing more data than intended, or skipping privacy settings altogether.
Dark patterns often blur the line between clever UX and coercion, prioritizing business goals at the cost of user autonomy. In consent UX (the best-known example being cookie banners), this typically means nudging users toward giving up more data.
These tactics don’t happen by accident. They’re designed with a purpose—to extract consent while avoiding the friction of informed refusal.
Read more: Dark patterns matter, and consumers are the victims
Another common term for dark patterns is deceptive design, which is increasingly used in legal, academic, and UX circles to describe the same manipulative interface tactics. While "dark patterns" is widely recognized, "deceptive design" is often preferred for its clearer, more neutral tone—especially in regulatory and public-facing discussions.
Other related phrases include manipulative UX, coercive design, or anti-patterns, all of which highlight the intention behind the interface: to steer users toward decisions that may benefit the business at the user's expense.
As awareness grows, shifting language toward “deceptive design” helps focus the conversation on intent and impact, not just aesthetics.
Dark patterns show up across many popular services, and a number of well-known brands have been criticized (and in some cases penalized) for creating consent experiences that feel more like traps than choices.
Take Honda’s recent cookie banner controversy: the company was called out for making it nearly impossible to refuse non-essential cookies without navigating multiple layers of confusing language and subtle nudges. This wasn’t an isolated case—it was just the latest example of how widespread and normalized these deceptive UX tactics remain, even among trusted global brands.
Despite growing awareness, public criticism, and tighter regulation, dark patterns remain deeply embedded in digital experiences, especially when it comes to consent. So why are they still here in 2025? The answer lies in a combination of psychology, incentives, loopholes, and internal misalignment.
Dark patterns take advantage of cognitive biases. Loss aversion is a classic example: subtle warnings ("You may lose functionality") scare users into consenting.
These tactics don’t convince users—they wear them down.
Let’s be honest: dark patterns convert.
Many companies still treat consent as a conversion funnel. Teams A/B test different banner layouts to increase opt-ins, optimize cookie settings for data volume, and prioritize “user data collected” as a growth metric.
But here’s the problem:
Short-term data wins often lead to long-term trust losses.
Once users feel manipulated, they don’t just avoid your settings—they avoid your brand.
Consent must be freely given, according to GDPR. But what does “freely” mean when “Reject All” is three clicks deeper?
Regulations are evolving, but UX design is outpacing enforcement. This has led to widespread “compliance theater”—interfaces that technically comply, while ethically failing users.
Dark patterns often emerge when teams work in silos, each optimizing for their own goals rather than the user's experience of consent.
Without collaboration, the default becomes: “Let’s just get the consent we need and move on.” That’s where manipulation creeps in.
Dark patterns take many forms in consent interfaces, but they all share a common goal: nudging users toward surrendering more data, often without fully realizing it.
Let’s break them down—and compare them to ethical design alternatives:
Obstruction: designs that make refusing much harder than accepting. A classic example is a "Reject all" option hidden behind multiple clicks or buried deep in settings, while "Accept all" is instantly available. The friction is deliberate; it's designed to wear down the user's resistance.
Pre-ticked boxes: checkboxes for consent that are already ticked by default. This assumes consent without a clear, affirmative action from the user, violating not only ethical UX principles but often legal standards as well.
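The ethical alternative can be sketched in code. Below is a minimal, hypothetical consent model (the names `ConsentCategory`, `defaultConsent`, and `grant` are illustrative, not any specific consent library's API): every non-essential category starts off unchecked, and a decision is only recorded when the user takes an explicit, affirmative action.

```typescript
// Hypothetical consent categories for illustration only.
type ConsentCategory = "analytics" | "advertising" | "personalization";

interface ConsentState {
  // Strictly necessary cookies don't require opt-in consent.
  essential: true;
  choices: Record<ConsentCategory, boolean>;
  // Absent until the user actually makes a choice — nothing is assumed.
  decidedAt?: Date;
}

// The default state: no box is pre-ticked.
function defaultConsent(): ConsentState {
  return {
    essential: true,
    choices: { analytics: false, advertising: false, personalization: false },
  };
}

// Consent is recorded only on an explicit user action, and the
// original state is left untouched so the decision is auditable.
function grant(state: ConsentState, category: ConsentCategory): ConsentState {
  return {
    ...state,
    choices: { ...state.choices, [category]: true },
    decidedAt: new Date(),
  };
}
```

The key design choice is that `true` can never appear in `choices` except through `grant`, which mirrors the legal requirement for a clear affirmative act.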
Nagging: repeated prompts that reappear every time a user declines consent or ignores the banner. Some interfaces make dismissal only temporary, or impossible until the user gives in. The goal: wear the user down through sheer repetition.
Vague language: phrases like "We use cookies to enhance your experience" without clearly stating that data will be used for tracking or advertising. This language feels friendly and neutral but conceals the true extent of data usage.
Forced consent: access to a website or app is made contingent on accepting non-essential cookies or providing personal data. Users are denied entry or functionality unless they submit, essentially turning "consent" into coercion.
Visual interference: designers use color, size, and placement to guide the user's eye, and their action. "Accept all" might be a bright, bold button, while "Manage settings" is grayed out or styled like a hyperlink buried in fine print. This primes users to act without considering other options.
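Several of the patterns above (obstruction and visual interference in particular) are measurable, so a team can lint its own banner config before shipping. The sketch below assumes a hypothetical `BannerConfig` shape; the field names and the specific checks are illustrative, not a standard or a real tool.

```typescript
// Hypothetical description of a consent banner's two headline actions.
interface BannerButton {
  label: string;
  clicksToReach: number; // 1 = visible on the banner's first layer
  style: "primary" | "secondary" | "link";
}

interface BannerConfig {
  acceptAll: BannerButton;
  rejectAll: BannerButton;
}

// Returns a list of dark-pattern warnings; an empty list means the
// banner treats acceptance and refusal symmetrically.
function auditBanner(cfg: BannerConfig): string[] {
  const issues: string[] = [];
  if (cfg.rejectAll.clicksToReach > cfg.acceptAll.clicksToReach) {
    issues.push("Refusing takes more clicks than accepting (obstruction).");
  }
  if (cfg.acceptAll.style === "primary" && cfg.rejectAll.style !== "primary") {
    issues.push("Accept is visually privileged over reject (visual interference).");
  }
  return issues;
}
```

A banner that shows "Accept all" as a primary button on layer one while "Reject all" sits three clicks deep as a text link would trip both checks; giving both actions the same style and the same layer clears them.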
Ask yourself whether each choice is as easy to refuse as it is to accept. If you're unsure, test your design with real users. If they feel confused, rushed, or coerced, it's not ethical UX.
The digital world is evolving. Users expect transparency. Regulators are sharpening their focus. And the brands that succeed in the future will be those that build trust by design.
Dark patterns may still work—but they’re running out of time.
Go further: How to stop using dark patterns (and build trust instead)