It’s going to be a long, hot summer for privacy leaders. As of July 1st, privacy frameworks in California, Colorado, and Connecticut will each be fully enforceable, and the CPRA’s amendments to CCPA will also kick in. Of course, teams have already been working hard to get into compliance with the new privacy rules taking effect in 2023, but questions remain about exactly how regulators and enforcers will apply those rules, and what they’ll seek to prioritize as their new powers kick in.
July 1st enforcement priority predictions
With a primary focus on California, Colorado, and the FTC, here are four key areas that I believe U.S. regulators will be focusing on over the next 18 months:
1. Health and Medical Data
To say that health data will be an enforcement priority is probably an understatement, given the FTC’s high-profile settlements with GoodRx, Premom, and BetterHelp, as well as recent HHS guidance indicating that pixel data collected from covered entities is likely Protected Health Information (PHI) under HIPAA.
The FTC and HHS are telling us two things. First, the bar for what constitutes “sensitive” health data is likely much lower than most of us had previously thought. For instance, the FTC now says that even data indicating that a person suffers from acne is sensitive. While I don’t think the FTC can necessarily enforce to that level, I’d be very careful about making assumptions about what is and isn’t considered sensitive–and extremely cautious when dealing with more clearly sensitive health information coming from apps.
Second, the FTC and HHS are making it clear that they’ll take a dim view of anyone collecting data based on visits to health sites and apps (including those run by HIPAA-covered entities) without consent. In many cases, that’s tantamount to an absolute prohibition on such behavior.
At the state level, meanwhile, Colorado and California are adopting broad definitions of health data, particularly when it comes to inferences drawn from such data. And then there’s the new Washington State privacy law, the My Health My Data Act, which arguably adopts the broadest definition of all. When you combine broad, unclear definitions with a private right of action, you’ve got the makings of a regulatory nightmare for health advertisers.
My recommendations for health and medical data compliance:
- Rethink sensitivity. It might be that data labeling someone a “cold sufferer” is non-sensitive under FTC rules, but I would be very careful as you get into more intrusive segments. Don’t assume that a condition is non-sensitive simply because it’s common or non-embarrassing.
- Take a state-by-state view. It may make sense to bifurcate your strategy, and even avoid processing sensitive data from places like California, Colorado, Connecticut, and Virginia, in order to sidestep some of the new rules.
Ultimately, I predict the health advertising marketplace is going to look very different in a year or two, because operating under the status quo means taking on an unacceptable amount of risk for most companies in that space.
2. Data processing agreements (DPAs)
This might be chiefly a Californian enforcement priority, but it’s a significant one. Remember that DPAs will be required in California for just about any data sharing or sale (i.e., not just transfer to “service providers”). Remember, too, that the CPPA has the power to ask for just about any information it wants from companies. (Yes, that power will be tested–but consider whether your company has the budget to push back on one of these requests.)
It’s easy to imagine the CPPA sending out dozens (hundreds?) of requests to data-driven advertisers asking for DPAs as part of a wider enforcement sweep. Depending on what they get back in response, regulators could then decide whether to make additional inquiries or proceed directly to enforcement actions.
Does your organization have DPAs in place to address all its data sharing arrangements, or is it limited to your company’s EU/UK data relationships? Do your DPAs cover all the potential data use-cases contemplated by the parties? Do your DPAs address all the criteria required by the CPRA and other state laws? And do you have DPAs in place for pixels your company places on third-party websites, even if the business relationship is currently inactive? If you don’t feel confident in your answers, now is a good time to revisit your DPAs.
My recommendations for DPA compliance:
- Take a fresh look at your DPAs. Make sure you have a DPA in place that covers all relevant U.S. privacy laws, outlines all potential use cases and data flows contemplated by the parties, and addresses all the key requirements laid out in the applicable regulations.
- Rethink your data practices. Consider turning off old pixels, or setting aggressive retention policies (such as a 24-hour retention period) for data collected via pixels, especially if you no longer need the data or don’t have a DPA in place. This is particularly important if those pixels are placed on sites run by HIPAA Covered Entities, or if the sites involved serve sensitive constituencies (e.g., ChristianMingle.com or Health.com/Epilepsy).
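The 24-hour retention idea above can be sketched in code. This is a minimal, hypothetical example–the field name `collected_at`, the in-memory event list, and the `purge_expired` helper are all illustrative assumptions, and a real deployment would apply the same logic to your actual event store or warehouse on a schedule.

```python
from datetime import datetime, timedelta

# Hypothetical retention window for pixel-collected data (24 hours).
RETENTION = timedelta(hours=24)

def purge_expired(events, now=None):
    """Keep only events collected within the retention window.

    Each event is assumed to be a dict with a 'collected_at' datetime;
    anything older than RETENTION is dropped.
    """
    now = now or datetime.utcnow()
    return [e for e in events if now - e["collected_at"] <= RETENTION]

# Illustrative data: one stale event, one fresh event.
events = [
    {"id": 1, "collected_at": datetime(2023, 7, 1, 0, 0)},   # > 24h old
    {"id": 2, "collected_at": datetime(2023, 7, 2, 10, 0)},  # 2h old
]
kept = purge_expired(events, now=datetime(2023, 7, 2, 12, 0))
```

The point of the sketch is that retention enforcement should be automatic and time-based, not a manual cleanup task–once the window passes, the data is gone regardless of whether anyone remembered to delete it.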
3. Secondary use of data
State privacy laws in California, Colorado, and Connecticut all focus on secondary uses of data, and federal and international regulators are paying attention too. Twitter was recently dinged by the FTC for collecting data to enhance security and then using that data for marketing purposes. Even more recently, Canada’s Privacy Commissioner penalized a retailer for taking email addresses collected to provide receipts and using them for marketing.
Too many companies have historically offered relatively vague privacy notices whose scope might not encompass all of their actual use cases. (Is simply saying “We collect data for third-party offers” sufficient to cover all the bases? I think not.) The data-driven advertising industry needs to do better, and quickly. This applies to advertisers and publishers, too, not just adtech/martech companies.
My recommendations for secondary data use compliance:
- Review your privacy notices. Make sure your upstream data and adtech partners, as well as those of your advertiser or publisher customers, are providing appropriate notices.
- Pay attention to sensitive data. Conducting privacy notice reviews is especially important insofar as you’re receiving sensitive data, since these will be the areas that regulators are most likely to scrutinize and target for enforcement. For many companies, simply removing sensitive data from your data set might be the most prudent approach.
4. “Other” Sensitive Personal Information
We’ve discussed health data, but there are a number of other forms of sensitive data that regulators are also watching closely.
I’d be particularly cautious when collecting or using data that indicates racial or ethnic origin, religious beliefs, mental or physical health diagnoses (or inferences about them), sexual orientation, citizenship or immigration status, biometrics, and data collected from children. Religious beliefs, race, and ethnicity are of particular concern given that it’s not uncommon for such factors to be used as targeting criteria by U.S. data-driven advertisers. Don’t forget that many jurisdictions now consider precise location data sensitive, or soon will. And a few (such as Connecticut and Washington State) have some particularly strict rules around geofencing sensitive locations (such as health facilities)–many of which go into effect over the coming weeks. Don’t get caught off guard!
Note, too, that collecting sensitive personal information now requires opt-in consent in Virginia, Connecticut, and Colorado. While California doesn’t require opt-in consent, it does require a number of additional disclosures.
My recommendations for sensitive personal information compliance:
- Don’t get cute. Noting that a browser is “Spanish-enabled” so you can deliver a product that is of interest to Spanish speakers is probably safe. But if you’re creating profiles based on keywords such as “Baptism” and “Sunday School,” then you’re almost certainly creating profiles based on religious beliefs, even if you don’t label them that way in your taxonomy. As with health data, a profile can be sensitive even if it isn’t based on uncommon, discriminatory, or embarrassing information.
- Be mindful of data sourcing. Be wary of vendors who claim to sell “public” data that’s exempted from privacy rules. While in some circumstances public data isn’t considered personal information, many brokers enrich their datasets with data from non-public sources. And that may render the entire dataset subject to state privacy laws.
- Look out for secondary-use rules. As noted above, secondary-use rules will apply to sensitive data types, especially if consent collection and privacy disclosures aren’t firmly aligned with the actual use cases.
- Take a state-by-state view. Here, as elsewhere, organizations will need to look closely at the data they collect in each state, and consider declining to collect sensitive data types in states with strict privacy rules.
The bottom line
The new data privacy laws are a big deal, and regulators are eager to put them to use. You need to make sure you’re compliant with the letter of the law, but you also need to recognize that enforcement actions don’t happen in a vacuum–they’re a reflection of the key priorities of the agencies and regulators involved.
As you prepare for the July 1st deadline, make sure you’re considering not just the new statutory requirements that impact your business, but also the degree to which you’re operating in areas that regulators view as key enforcement priorities. Inevitably, we’re going to see some companies made examples of as regulators flex their muscles–so play it smart, and make sure you don’t wind up on their list of enforcement targets.
And one more thing: don’t shy away from outside help. The privacy rules are changing quickly, and privacy law is inextricably linked to competition, AI, international trade, and a host of other important areas. There’s so much going on these days that simply having a privacy resource–or even a whole team–is likely going to be insufficient for most data-driven advertisers.
You’re going to need compliance solutions like Ketch, and a way to synthesize all of these changes in the privacy landscape so that the rules can be integrated into your business teams and incorporated into your larger strategies. Simply signing up for a few daily privacy email digests is unlikely to be much help. You should consider a service–there are a few out there that I really like and am happy to share upon request. Connect with me on LinkedIn to get in touch.