In February 2026, the California Attorney General settled a CCPA enforcement action against Disney DTC, LLC and ABC Enterprises for $2.75 million — the largest in the law's history. The case exposed four technical compliance failures common across enterprise privacy programs: (1) identity parity gaps for logged-out users, (2) disconnected opt-out architecture, (3) cross-brand opt-out propagation failures, and (4) non-web surface compliance gaps.
Since then, plenty of articles have covered what happened. Most do a thorough job on the legal and interpretive dimensions, but stop short of the technical compliance questions that privacy leaders actually need answered.
The technical compliance challenges at the center of this settlement are present in a wide range of enterprise privacy programs: not just streaming, not just media, not just large companies. And while the problems are real and genuinely complex, they are not mysterious or unsolvable. Privacy leaders who understand what enforcers expect — at the level of systems, data flows, and identity infrastructure — are in a much better position to assess their own programs and take concrete action.
We know this isn't an easy task: tackling hard technical problems with lean teams while regulations never seem to slow down. Modern data infrastructure is genuinely complex, and operationalizing compliance is a work in progress. That's what this article is for: a technical companion for navigating what the settlement actually means in practice.
The four critical, technical learnings from the Disney settlement are:
1. Identity parity gaps for logged-out users
2. Disconnected opt-out architecture
3. Cross-brand opt-out propagation failures
4. Non-web surface compliance gaps
Together, we’ll walk through these four technical learnings, map them to the specific regulatory language, and give you a clear framework for auditing your own program against each one.
The most consequential principle in the Disney settlement has nothing to do with banners or toggles. It's this, stated directly in the complaint:
"If a business can associate a consumer's devices with the consumer for advertising purposes, it can and must associate those devices with the consumer for purposes of honoring the consumer's opt-out rights."
This is the parity principle. It means the scope of your opt-out obligation is defined by the scope of your data monetization — not by the scope of your consent management platform.
If your advertising infrastructure links a user's phone, laptop, and connected TV into a single identity for targeting purposes, a single opt-out request from that user must propagate across all three. The settlement's injunctive provisions make this explicit for logged-in users:
"When a consumer is logged-in to their Disney account…defendants shall effectuate the consumer's opt-out choice across all Disney streaming services that defendants associate with the consumer's Disney account."
That part most privacy leaders understand. What's less understood, and where this settlement breaks new ground, is what's required when a user is not logged in.
The Disney settlement requires that even for consumers who are not logged in, opt-outs must apply to "any consumer profile that defendants associate with that browser, application, or device, including pseudonymous profiles that defendants maintain in connection with selling/sharing or cross-context behavioral advertising."
Notably, this same language also appears in the California AG's October 2025 settlement with SlingTV. Regulators appear to be applying this standard across the industry, not selectively.
That phrase, "pseudonymous profiles," is doing a lot of work. It means that if your systems can link an anonymous device signal to a known user profile for advertising purposes, that linkage creates an obligation to honor opt-outs for that profile, whether or not the consumer is currently authenticated.
This is where many privacy leaders push back: "We only apply consent to logged-in users — we don't know who the logged-out user is."
But the enforcement logic here is precisely the opposite. The question isn't whether the user is logged in. The question is whether you have the ability to identify them — and that ability doesn't need to be built in-house to create an obligation.
If you run advertising through platforms like Google or Meta, you are already benefiting from their cross-device identity resolution. The regulatory standard is clear: the obligation follows the use of identity capabilities, not who built them. If identity is being used to target, it must also be used to suppress.
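The parity principle can be made concrete with a small sketch. This is a hypothetical model, not any vendor's actual implementation: the names and data structures are illustrative. The point is that the same device-to-profile linkage used for targeting is the linkage consulted when honoring an opt-out, so a logged-out device inherits the opt-out status of the profile it resolves to.

```python
# Hypothetical sketch: the identity graph used for ad targeting is the
# same one consulted when honoring opt-outs, so a logged-out device
# inherits the opt-out status of any profile it is linked to.

from dataclasses import dataclass, field

@dataclass
class IdentityGraph:
    # device_id -> profile_id, as maintained for advertising purposes
    device_to_profile: dict = field(default_factory=dict)
    # profile_ids that have opted out of sale/sharing
    opted_out_profiles: set = field(default_factory=set)

    def link(self, device_id: str, profile_id: str) -> None:
        self.device_to_profile[device_id] = profile_id

    def record_opt_out(self, profile_id: str) -> None:
        self.opted_out_profiles.add(profile_id)

    def may_sell_or_share(self, device_id: str) -> bool:
        """Parity check: if this device resolves to a profile for targeting,
        the profile's opt-out must also apply here, logged in or not."""
        profile_id = self.device_to_profile.get(device_id)
        if profile_id is None:
            return True  # no linkage, so no known opt-out to honor
        return profile_id not in self.opted_out_profiles

graph = IdentityGraph()
graph.link("phone-abc", "user-42")
graph.link("ctv-xyz", "user-42")     # same pseudonymous profile
graph.record_opt_out("user-42")      # opt-out submitted on one device

print(graph.may_sell_or_share("ctv-xyz"))   # False: linked device is suppressed
print(graph.may_sell_or_share("unknown"))   # True: no linkage exists
```

Note the asymmetry the settlement targets: most real systems build the `link` path (it drives ad revenue) but never wire `may_sell_or_share` into the collection pipeline.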
Disney had opt-out infrastructure in place. There was a Do Not Sell or Share webform, in-app toggles, and GPC support. The enforcement action wasn't about the absence of compliance tools — it was about those tools operating in isolation.
The settlement's requirement is unambiguous:
"Defendants shall implement a consumer-friendly, easy to execute opt-out process that allows consumers to opt-out with minimal steps…for a consumer who opts-out, defendants shall stop selling and sharing the consumer's personal information and shall stop conducting cross-context behavioral advertising for that consumer."
"Minimal steps" and "stop selling and sharing" are not qualified by system, brand, or device. One opt-out. Full stop.
This is not a Disney-specific design flaw. This is a widespread privacy vendor architecture problem. Most enterprise privacy stacks separate two critical functions into two disconnected products: a consent management platform (CMP) — a core component of any CCPA compliance program — governing tag and cookie behavior on the website, and a data subject rights (DSR) tool handling opt-out webforms.
When these products are disconnected, a consumer submitting a Do Not Sell or Share request through the webform has no effect on the CMP's tag-firing behavior, meaning third-party data sharing continues uninterrupted on the site. The webform captured the request. The CMP and collection infrastructure never got the memo.
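The fix is architectural, not legal: both intake paths must write to one consent state that the collection layer reads. A minimal sketch, with purely illustrative function names, assuming a shared store behind both the DSR webform and the CMP:

```python
# Hypothetical sketch: one shared consent store behind both intake paths.
# A DSR webform submission and a CMP toggle write the same record, so
# tag-firing decisions always see the latest opt-out regardless of source.

consent_store: dict[str, dict] = {}

def record_opt_out(user_id: str, source: str) -> None:
    consent_store[user_id] = {"opted_out": True, "source": source}

def handle_dsr_webform(user_id: str) -> None:
    # The webform is just another intake path into the same store.
    record_opt_out(user_id, source="dsr_webform")

def cmp_should_fire_ad_tags(user_id: str) -> bool:
    # The CMP consults the shared store before firing sale/share tags.
    return not consent_store.get(user_id, {}).get("opted_out", False)

handle_dsr_webform("user-42")
print(cmp_should_fire_ad_tags("user-42"))  # False: the webform reached the CMP
```

When the webform and CMP are separate products with separate databases, `handle_dsr_webform` and `cmp_should_fire_ad_tags` never touch the same state — which is exactly the failure mode the enforcement action describes.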
A particularly common version of this problem occurs when the DSR webform is hosted by the privacy vendor on their own domain rather than on the company's own web property.
OneTrust, for example, has widely used this approach — hosting the Do Not Sell form on a separate domain entirely. Because the form lives outside the company's web environment, it has no technical access to the CMP controlling that environment, making integration between the two effectively impossible by design.
Read more: Configuration, not code: Why OneTrust breaks at scale
The same architectural gap applies to opt-out preference signals like the Global Privacy Control (GPC). GPC is an intake mechanism — it tells your website that a consumer wants to opt out.
What happens next is entirely dependent on your architecture, and the signal itself does none of that downstream work. In this case, GPC opt-outs were applied only to the specific service and device where the signal was received.
The complaint is clear that this doesn't satisfy the law. Whether the opt-out is submitted via webform, in-app toggle, or GPC, the downstream enforcement requirement is identical. The intake path doesn't change what must happen next.
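In server-side terms, a GPC signal arrives as the `Sec-GPC: 1` request header defined by the GPC specification. The sketch below shows the intake-versus-enforcement distinction; the three downstream enforcement actions are illustrative placeholders for whatever fan-out your architecture requires, the same fan-out any other intake path (webform, in-app toggle) should trigger:

```python
# Sketch: a GPC signal is intake only. The Sec-GPC request header comes
# from the GPC specification; the enforcement steps recorded here are
# illustrative placeholders for the required downstream fan-out.

def handle_request(headers: dict[str, str], device_id: str,
                   enforcement_log: list[str]) -> None:
    if headers.get("Sec-GPC") == "1":
        # Same downstream enforcement as a webform or in-app toggle:
        enforcement_log.append(f"suppress_tags:{device_id}")
        enforcement_log.append(f"propagate_to_linked_profile:{device_id}")
        enforcement_log.append(f"notify_third_parties:{device_id}")

log: list[str] = []
handle_request({"Sec-GPC": "1"}, "browser-123", log)
print(log)
```

A program that stops after the first step — suppressing tags on the device that sent the signal — is the narrow, single-surface interpretation the complaint rejects.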
Read more: GPC signal compliance and privacy frameworks
Even for consumers who successfully opted out on one service, the enforcement action found that the opt-out did not automatically apply across other services tied to the same account. A consumer who opted out on Hulu remained subject to data sharing on Disney+ and ESPN+, despite the fact that Disney's advertising infrastructure treated those services as a unified identity system.
The settlement addresses this directly:
"When a consumer is logged-in to their Disney account…defendants shall effectuate the consumer's opt-out choice across all Disney streaming services that defendants associate with the consumer's Disney account."
The principle here is straightforward: propagation scope must match monetization scope. If your advertising advantage spans brands, properties, or business units, your opt-out enforcement must as well.
The method of opt-out invocation — whether a toggle, a webform, or a GPC signal — is irrelevant to this requirement. What matters is that a single consumer choice reaches every context in which that consumer's data is being sold or shared.
This has significant implications beyond streaming. Any organization operating multiple brands, sub-brands, or properties under a shared data infrastructure needs to examine whether an opt-out submitted on one property propagates across the others. In most cases, it doesn't — not because the technical capability doesn't exist, but because the compliance logic was never built to follow the monetization architecture.
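A minimal sketch of propagation scope matching monetization scope, with hypothetical names: if several brands share one account-level identity, an opt-out received on any one of them fans out to all of them.

```python
# Hypothetical sketch: one consumer choice fans out to every service
# tied to the same account-level identity. Names are illustrative.

ACCOUNT_SERVICES = {
    "account-42": ["hulu", "disney_plus", "espn_plus"],  # shared identity
}

opt_out_state: dict[tuple[str, str], bool] = {}

def propagate_opt_out(account_id: str) -> list[str]:
    """Apply one opt-out to every service associated with the account."""
    services = ACCOUNT_SERVICES.get(account_id, [])
    for service in services:
        opt_out_state[(account_id, service)] = True
    return services

affected = propagate_opt_out("account-42")  # opt-out received on any one brand
print(affected)  # ['hulu', 'disney_plus', 'espn_plus']
```

The hard part in practice is not this loop; it is maintaining an accurate `ACCOUNT_SERVICES` mapping that mirrors how your advertising infrastructure actually unifies identities across brands.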
The settlement also makes clear that the obligation extends downstream to third parties:
"Defendants shall comply with a consumer's opt-out choice as required by CCPA, including notifying all third parties to whom defendants have sold or shared a consumer's personal information."
Blocking tags on your own site is not sufficient. Third-party ad-tech partners who already hold that consumer's data must be notified and directed to comply. That requires direct integration and active signaling — not just tag suppression.
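Operationally, that means maintaining a record of which partners already received a consumer's data and dispatching an explicit opt-out directive to each. A hypothetical sketch (the partner names, message shape, and `send` callback are all assumptions):

```python
# Hypothetical sketch: downstream notification is an active signal, not
# tag suppression. Every partner of record gets an explicit directive.

shared_with: dict[str, set[str]] = {
    "user-42": {"adtech_a", "adtech_b"},  # partners holding this user's data
}

def notify_third_parties(user_id: str, send) -> int:
    """Dispatch an opt-out directive to each partner of record."""
    partners = shared_with.get(user_id, set())
    for partner in sorted(partners):
        send(partner, {"user_id": user_id, "action": "opt_out"})
    return len(partners)

sent = []
count = notify_third_parties("user-42", lambda p, msg: sent.append((p, msg)))
print(count)  # 2: both partners of record were signaled
```

In production this `send` would be a real integration per partner (an API call, a deletion/suppression endpoint, a GPP string update), with retries and an audit trail; the sketch only shows the obligation's shape.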
Connected TV CCPA compliance is the gap most enterprise privacy programs haven't closed. Cookie-based consent tools are web-native. They were built for browsers. They do not, by default, extend to mobile apps, connected TV environments, or other non-browser surfaces. The settlement makes clear that regulators expect opt-out functionality to work natively on every surface a consumer uses.
In this case, consumers on connected TV apps were directed to complete an opt-out on a different device entirely — a webform on a phone or computer. The problem: that webform had no effect on the code transmitting data from the CTV app to ad-tech partners.
There was, in practice, no way for those consumers to stop their data from being sold or shared via that surface.
The settlement addresses this directly:
"Defendants shall provide a clear and conspicuous opt-out link within all Disney streaming services that shall either immediately effectuate the consumer's choice to opt-out of sale and sharing, or in the alternative, direct the consumer to the notice of right to opt-out of sale/sharing."
Every surface. Not just web. And it must be formatted and functional for that environment — not a redirect to a different device.
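What "native" means in code: the check must live in the CTV app's own transmission path, not on a website somewhere else. A hypothetical sketch, assuming an in-app opt-out that sets a device-level flag consulted before any payload leaves for ad-tech endpoints:

```python
# Hypothetical sketch: enforcement inside the CTV app's own transmission
# path. The in-app opt-out sets a flag the app checks before any payload
# leaves the device, with no browser or second device involved.

ctv_consent: dict[str, bool] = {}  # device_id -> opted_out

def in_app_opt_out(device_id: str) -> None:
    # Triggered by the opt-out link rendered natively in the CTV app.
    ctv_consent[device_id] = True

def transmit_to_adtech(device_id: str, payload: dict) -> bool:
    """Return True only if the payload may be sent from this surface."""
    if ctv_consent.get(device_id, False):
        return False  # suppressed natively on the CTV surface
    return True

in_app_opt_out("ctv-xyz")
print(transmit_to_adtech("ctv-xyz", {"event": "view"}))  # False
```

Contrast this with the failure mode in the settlement: an opt-out completed via a webform on another device that never reaches `transmit_to_adtech`, so the CTV payloads keep flowing.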
The common thread across all four themes is this: compliance infrastructure that was built for a simpler, more web-centric privacy environment is no longer sufficient. Regulators understand how modern advertising and data monetization works — identity graphs, cross-device targeting, pseudonymous profiles, downstream data sharing — and they are now holding opt-out enforcement to the same standard.
These are genuinely hard technical problems. But "hard" and "impossible" are not the same thing, and complexity is not an acceptable defense in an enforcement action.
If your current privacy vendor can't clearly explain how they solve for cross-device identity propagation, unified opt-out workflows, and multi-surface enforcement, that's worth a direct conversation. The tooling to solve these problems exists. The question is whether the vendor you're paying to handle this actually has it.
Attorney General Bonta's summary of the case is worth keeping close:
"A consumer's opt-out right applies wherever and however a business sells data — businesses can't force people to go device-by-device or service-by-service."
That's the standard. Make sure your privacy program, especially the technology powering it, is built to meet it.