
It's no longer a gray area. In 2025, UX decisions are under legal scrutiny, especially when they manipulate users into giving consent or make it harder to refuse.
For years, companies hid "Reject All" behind multiple clicks and claimed technical compliance. But today, regulators have made it clear: that's not just unethical. It's illegal.
In the EU, under GDPR, consent must be freely given and informed, so dark patterns that obscure opt-out options or pre-select consent can invalidate it. In the U.S., states like California (under the CCPA/CPRA) explicitly prohibit the use of dark patterns to interfere with privacy choices. Globally, regulators are increasingly cracking down on these practices.
Whether it's GDPR in Europe, CCPA in California, or FTC enforcement in the U.S., the message is the same:
"Consent must be clear, informed, and as easy to decline as to give."
So why are dark patterns still in use? And what happens when companies cross the line?
Short-term performance pressure is a major driver. Teams often optimize for opt-in rates and data collection without considering user trust or long-term consequences. There's also a lag in regulatory enforcement and a lack of cross-team alignment, especially between legal, marketing, and UX.
Read more: Dark Patterns: Why They Still Work (and How to Spot Them)
Together, let's unpack what the law actually requires, how regulators are enforcing it, and what it costs companies that cross the line. Because it's no longer enough to look compliant on the surface. Consent UX is now a legal artifact, and intent matters.
Under the EU's General Data Protection Regulation (GDPR), consent must be freely given, specific, informed, and unambiguous.
As clarified by the European Data Protection Board (EDPB):
"If the user is faced with a 'Take it or leave it' choice or if consenting is easier than refusing, consent is not valid."
Under the ePrivacy Directive, cookies require prior consent, not coerced consent. Pre-checked boxes, obscured rejection paths, or "consent walls" that block access violate these principles.
Key legal interpretation:
If a banner buries the opt-out, pre-selects consent, or uses a confusing interface hierarchy, it's likely non-compliant, even if it's branded as user-friendly design.
In theory, that leaves little room for dark patterns. But in practice, enforcement has been inconsistent, especially around user interface design.
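To make this concrete, here is a minimal sketch of a consent model that follows these principles. It's illustrative only, written in TypeScript with hypothetical category names: every optional category starts off, and refusing takes exactly the same single action as accepting.

```typescript
// Hypothetical consent model illustrating the GDPR/ePrivacy principles above:
// no pre-checked boxes, and refusing is exactly as easy as accepting.

type ConsentCategory = "analytics" | "marketing" | "personalization";
type ConsentState = Record<ConsentCategory, boolean>;

// All optional categories start OFF: consent must be an affirmative act.
const defaultConsent: ConsentState = {
  analytics: false,
  marketing: false,
  personalization: false,
};

// Symmetric, one-click choices: "Reject All" costs no more than "Accept All".
function acceptAll(): ConsentState {
  return { analytics: true, marketing: true, personalization: true };
}

function rejectAll(): ConsentState {
  // Identical cost to acceptAll: one click, no extra confirmation screens.
  return { ...defaultConsent };
}
```

The design point is the symmetry: both outcomes are one action away from the banner's first screen, so neither path is privileged.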
California has gone further in directly naming dark patterns as a violation of the law.
Under the California Consumer Privacy Act (CCPA) and its amendment, the California Privacy Rights Act (CPRA), agreement obtained through the use of dark patterns does not constitute consent.
In 2023–2024, the California Privacy Protection Agency (CPPA) issued guidance stating that:
"An interface that subverts or impairs a consumer's choice… is a dark pattern and does not constitute valid consent."
Examples of illegal design patterns under CPRA:
- Requiring more steps to opt out than to opt in
- Confusing language or double negatives around privacy choices
- Interface designs that impair or interfere with a consumer's ability to make a choice
Takeaway: In California, manipulative UX is not just frowned upon; it's explicitly illegal.
The EU's Digital Services Act (DSA) and Digital Markets Act (DMA), which came into effect between 2024 and 2025, expand the focus beyond just cookie banners.
These regulations explicitly require platforms to:
- Avoid interface designs that deceive or manipulate users, or that materially distort their ability to make free and informed decisions
- Refrain from making it harder to cancel or unsubscribe from a service than it was to sign up
- Present choices neutrally rather than visually privileging the option the platform prefers
Implication: This widens enforcement from data collection to the entire UX ecosystem.
In the United States, the Federal Trade Commission (FTC) has ramped up its scrutiny of deceptive design.
The FTC now considers dark patterns to be a form of deceptive or unfair trade practice under Section 5 of the FTC Act.
The FTC's stance:
If your UX nudges people to act against their interests or makes refusal unnecessarily hard, you're deceiving users and breaking the law.
In 2025, American Honda Motor Co. faced scrutiny for employing dark patterns in its consent management processes, leading to significant privacy violations under the California Consumer Privacy Act (CCPA).
Honda's practices included the following:
- Requiring consumers to submit excessive personal information to exercise their privacy rights, even for requests that do not require verification
- A cookie management tool that let users accept all tracking in one click but forced them to toggle categories individually in order to opt out
The California Privacy Protection Agency ordered Honda to pay a $632,500 fine and to remediate its consent flows.
The lesson: If your banner relies on confusion, delay, or intimidation, you're not just losing trust, you're opening the door to enforcement.
For years, companies created consent flows that passed legal review while intentionally nudging users toward giving up data.
This is what regulators now call "compliance theater":
Interfaces that technically follow the letter of the law but blatantly violate its intent.
But in 2025, regulators aren't buying it.
Enforcement bodies are now:
- Auditing consent flows as users actually experience them, counting clicks and comparing the accept and reject paths
- Evaluating interface hierarchy, wording, and defaults, not just the privacy policy behind them
- Treating manipulative design itself as evidence of intent
Bottom line: UX is no longer just a product concern. It's a legal surface.
Some brands are already moving beyond minimum compliance.
These companies build trust through clarity, not coercion. They show that it's entirely possible to respect privacy and maintain high-quality UX. In fact, doing so often builds deeper user loyalty and higher engagement over time.
✅ "Reject All" is presented alongside "Accept All"
✅ All consent options are off by default
✅ No vague or euphemistic language
✅ Users can opt out with equal or fewer clicks than opt-in
✅ Data rights requests do not require excessive personal info
✅ Consent UX has been tested with real users for clarity
If you can't confidently check all of these, your consent UX could invalidate the consent you collect, invite regulatory enforcement, and erode the trust you're trying to build. One way to keep the list honest is to encode it as an automated check, as in the sketch below.
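The following sketch encodes the checklist as an automated audit. It's illustrative only: the TypeScript interface and field names are hypothetical, not a real library, but a team could wire something like it into a CI check for its consent banner configuration.

```typescript
// Hypothetical description of a consent banner, mirroring the checklist above.
interface BannerConfig {
  rejectAllShownWithAcceptAll: boolean; // both buttons on the first screen
  defaultsOff: boolean;                 // no pre-selected categories
  clicksToOptIn: number;
  clicksToOptOut: number;
  plainLanguage: boolean;               // no vague or euphemistic copy
  minimalDataForRightsRequests: boolean;
  userTestedForClarity: boolean;
}

// Returns a list of checklist failures; an empty array means every item passes.
function auditConsentUX(config: BannerConfig): string[] {
  const failures: string[] = [];
  if (!config.rejectAllShownWithAcceptAll)
    failures.push('"Reject All" is not presented alongside "Accept All".');
  if (!config.defaultsOff)
    failures.push("Consent options are pre-selected by default.");
  if (config.clicksToOptOut > config.clicksToOptIn)
    failures.push("Opting out takes more clicks than opting in.");
  if (!config.plainLanguage)
    failures.push("Copy is vague or euphemistic.");
  if (!config.minimalDataForRightsRequests)
    failures.push("Rights requests demand excessive personal information.");
  if (!config.userTestedForClarity)
    failures.push("Consent UX has not been tested with real users.");
  return failures;
}
```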
The legal world is no longer silent about design. If your consent flow tricks users, it doesn't matter how nice it looks or how many opt-ins it gets: it's at risk.
From California to Brussels to Washington, D.C., the future is clear: Consent UX must prioritize clarity, fairness, and autonomy.
So the question for your team isn't "Are we compliant on paper?" It's: "Would a reasonable person feel they had a real choice?"