
It’s no longer a gray area. In 2025, UX decisions are under legal scrutiny, especially when they manipulate users into giving consent or make it harder to refuse.
For years, companies hid “Reject All” behind multiple clicks and claimed technical compliance. But today, regulators have made it clear: that’s not just unethical. It’s illegal.
In the EU, under the GDPR, consent must be freely given and informed, so dark patterns that obscure opt-out options or pre-select consent can invalidate it. In the U.S., states like California (under the CCPA/CPRA) explicitly prohibit the use of dark patterns to interfere with privacy choices. Globally, regulators are increasingly cracking down on these practices.
Whether it’s GDPR in Europe, CCPA in California, or FTC enforcement in the U.S., the message is the same:
“Consent must be clear, informed, and as easy to decline as to give.”
So why are dark patterns still in use? And what happens when companies cross the line?
Short-term performance pressure is a major driver. Teams often optimize for opt-in rates and data collection without considering user trust or long-term consequences. There’s also a lag in regulatory enforcement and a lack of cross-team alignment, especially between legal, marketing, and UX teams.
Read more: Dark Patterns: Why They Still Work (and How to Spot Them)
Together, let’s unpack what the law actually requires, where companies still cross the line, and what compliant consent UX looks like.
Because it’s no longer enough to look compliant on the surface. Consent UX is now a legal artifact—and intent matters.
Read more: How to stop using dark patterns (and build trust instead)
Under the EU’s General Data Protection Regulation (GDPR), consent is valid only if it is freely given, specific, informed, and unambiguous.
As clarified by the European Data Protection Board (EDPB):
“If the user is faced with a ‘Take it or leave it’ choice or if consenting is easier than refusing, consent is not valid.”
Under the ePrivacy Directive, cookies require prior consent, not coerced consent. Pre-checked boxes, obscured rejection paths, or “consent walls” that block access violate these principles.
Key legal interpretation:
If a banner buries the opt-out, pre-selects consent, or uses a confusing interface hierarchy, it’s likely non-compliant, even if it’s branded as a user-friendly design.
In theory, that leaves little room for dark patterns. But in practice, enforcement has been inconsistent, especially around user interface design.
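To make that baseline concrete, here is a minimal sketch of a banner that clears it, assuming a plain browser/DOM setup; the element IDs and the `ConsentState`/`renderBanner` names are illustrative, not any real consent-platform API:

```ts
// A minimal, compliant-by-default consent banner (illustrative names).

type ConsentState = { analytics: boolean; marketing: boolean };

function renderBanner(onDecision: (c: ConsentState) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.setAttribute("aria-label", "Cookie consent");

  // Nothing is pre-checked, and "Reject all" and "Accept all" are the
  // same kind of control, side by side, each one click away.
  banner.innerHTML = `
    <p>We use optional cookies for analytics and marketing.</p>
    <label><input type="checkbox" id="consent-analytics"> Analytics</label>
    <label><input type="checkbox" id="consent-marketing"> Marketing</label>
    <button id="reject-all">Reject all</button>
    <button id="accept-all">Accept all</button>
    <button id="save-choices">Save my choices</button>
  `;

  const checked = (id: string): boolean =>
    (banner.querySelector(`#${id}`) as HTMLInputElement).checked;

  banner.querySelector("#reject-all")!.addEventListener("click", () =>
    onDecision({ analytics: false, marketing: false }));
  banner.querySelector("#accept-all")!.addEventListener("click", () =>
    onDecision({ analytics: true, marketing: true }));
  banner.querySelector("#save-choices")!.addEventListener("click", () =>
    onDecision({
      analytics: checked("consent-analytics"),
      marketing: checked("consent-marketing"),
    }));

  document.body.appendChild(banner);
}
```

In a real flow, the `onDecision` callback would persist the choice and only then load analytics or marketing scripts; loading them before the user decides is exactly the “prior consent” failure the ePrivacy Directive targets.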
California has gone further in directly naming dark patterns as a violation of the law.
Under the California Consumer Privacy Act (CCPA) and its amendment, the California Privacy Rights Act (CPRA), agreement obtained through dark patterns does not constitute valid consent.
In 2023–2024, the California Privacy Protection Agency (CPPA) issued guidance stating:
“An interface that subverts or impairs a consumer’s choice… is a dark pattern and does not constitute valid consent.”
Examples of design patterns the CPPA has flagged as illegal include burying “Reject All” behind extra clicks, pre-selecting consent toggles, and phrasing privacy choices as confusing double negatives.
Takeaway: In California, manipulative UX is not just frowned upon; it’s explicitly illegal.
The EU’s DSA and DMA, which took full effect in 2024, expand the focus beyond just cookie banners.
The DSA prohibits platforms from designing interfaces that deceive or manipulate users or otherwise impair their ability to make free and informed decisions; the DMA bars gatekeepers from offering choices in a non-neutral manner or subverting user autonomy through interface design.
Implication: This widens enforcement from data collection to the entire UX ecosystem.
In the United States, the Federal Trade Commission (FTC) has ramped up its scrutiny of deceptive design.
The FTC now considers dark patterns to be a form of deceptive or unfair trade practice under Section 5 of the FTC Act.
The FTC's stance:
If your UX nudges people to act against their interests or makes refusal unnecessarily hard, you’re deceiving users and breaking the law.
In 2025, American Honda Motor Co. faced enforcement for employing dark patterns in its consent management process, leading to significant privacy violations under the California Consumer Privacy Act (CCPA).
Honda’s practices included requiring consumers to submit more personal information than necessary to process their privacy requests, and a cookie management tool that let users accept all tracking in one step while demanding extra steps to decline it.
The California Privacy Protection Agency found that these practices violated the CCPA, fining Honda $632,500 and requiring it to overhaul its consent flows.
The lesson: if your banner relies on confusion, delay, or intimidation, you’re not just losing trust; you’re opening the door to enforcement.
For years, companies created consent flows that passed legal review while intentionally nudging users toward giving up data.
This is what regulators now call “compliance theater”:
Interfaces that technically follow the letter of the law but blatantly violate its intent.
But in 2025, regulators aren’t buying it.
Enforcement bodies are now examining the interfaces themselves, not just the privacy policies behind them: auditing whether refusing is genuinely as easy as consenting, and treating manipulative design as evidence of intent.
Bottom line: UX is no longer just a product concern. It’s a legal surface.
Some brands are already moving beyond minimum compliance.
These companies build trust through clarity, not coercion. They show that it’s entirely possible to respect privacy and maintain high-quality UX. In fact, doing so often builds deeper user loyalty and higher engagement over time.
✅ “Reject All” is presented alongside “Accept All”
✅ All consent options are off by default
✅ No vague or euphemistic language
✅ Users can opt out with equal or fewer clicks than opt-in
✅ Data rights requests do not require excessive personal info
✅ Consent UX has been tested with real users for clarity
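For teams that want to operationalize this checklist, here is a hypothetical sketch that encodes it as an automated audit; the `ConsentFlowAudit` fields and the `auditConsentFlow` helper are assumptions for illustration, not a standard schema:

```ts
// Hypothetical audit helper that encodes the checklist as assertions.
interface ConsentFlowAudit {
  acceptClicks: number;            // clicks from banner to "all accepted"
  rejectClicks: number;            // clicks from banner to "all rejected"
  rejectShownWithAccept: boolean;  // both buttons visible together
  defaultsAllOff: boolean;         // no pre-selected consent
  usesPlainLanguage: boolean;      // no vague or euphemistic copy
  minimalVerificationData: boolean; // rights requests don't demand excess info
  testedWithRealUsers: boolean;
}

function auditConsentFlow(flow: ConsentFlowAudit): string[] {
  const failures: string[] = [];
  if (!flow.rejectShownWithAccept)
    failures.push('"Reject All" is not presented alongside "Accept All"');
  if (!flow.defaultsAllOff)
    failures.push("Some consent options are on by default");
  if (flow.rejectClicks > flow.acceptClicks)
    failures.push("Opting out takes more clicks than opting in");
  if (!flow.usesPlainLanguage)
    failures.push("Copy relies on vague or euphemistic language");
  if (!flow.minimalVerificationData)
    failures.push("Data rights requests demand excessive personal information");
  if (!flow.testedWithRealUsers)
    failures.push("The flow has not been tested with real users for clarity");
  return failures;
}

// Example: one-click accept but three-click reject fails the audit.
console.log(auditConsentFlow({
  acceptClicks: 1,
  rejectClicks: 3,
  rejectShownWithAccept: true,
  defaultsAllOff: true,
  usesPlainLanguage: true,
  minimalVerificationData: true,
  testedWithRealUsers: false,
}));
```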
If you can’t confidently check all of these, your consent UX could invalidate the consent you collect, invite regulatory enforcement, and erode the very trust it’s meant to build.
The legal world is no longer silent about design. If your consent flow tricks users, it doesn’t matter how nice it looks or how many opt-ins it gets: it’s at risk.
From California to Brussels to Washington D.C., the future is clear: Consent UX must prioritize clarity, fairness, and autonomy.
So the question for your team isn’t “Are we compliant on paper?” It’s: “Would a reasonable person feel they had a real choice?”