
10 data privacy trends to watch in 2023

Odia Kagan of Fox Rothschild shares her tips on charting a path through a moving, twisting, and turning data privacy landscape.

When people ask what will happen in the world of data privacy over the next 12 months, I tell them to imagine looking through a kaleidoscope. Everything’s moving, twisting, turning: established rules are evolving, regulatory policies are shifting, and new legislation is coming in countless different jurisdictions. If you don’t focus, all the sparkly moving parts can make you dizzy. 

Fortunately, it’s possible to spot patterns amidst the chaos — and figure out a strategy for keeping both consumers and regulators happy. 

Top 10 in data privacy right now

As Chair of GDPR Compliance and International Privacy at Fox Rothschild, I am constantly examining legal developments in the US, the EU, and internationally, using my insights to help organizations navigate these choppy waters. I was thrilled when Ketch asked me to share my thoughts.

There are 10 key data privacy trends we can expect to see in 2023:

  • Sensitive and personal data
  • Children's privacy
  • Consent and dark patterns
  • Data minimization
  • Transparency and purpose specification
  • US federal and state rulemaking
  • Data residency and cross-border transfers
  • Employee privacy
  • AI governance
  • Risk assessments


1. Sensitive & personal data

A key battleground in 2023 will be what counts as “sensitive” data. We’ll see a particular focus on biometrics, with state regulators taking inspiration from the Illinois Biometric Information Privacy Act (BIPA) and clamping down on everything from voice recognition to facial biometrics. Location privacy will also face scrutiny: in a post-Roe world, geolocation potentially reveals people’s reproductive health visits, so regulators are actively exploring how to determine location data sensitivity. 

Ad-targeting data is another sleeper issue, with seemingly innocuous data allowing sensitive inferences to be made. Certain companies are now reining in these practices to comply with voluntary industry guidelines, but regulators are still seeking new ways to clamp down on the misuse of ad data, especially insofar as targeting intersects with sectoral protections such as HIPAA. 

Key takeaway: Companies should think carefully about whether the data they’re collecting could be construed as sensitive. If it could, put infrastructure in place to assess the risks that data poses and the measures that can mitigate them.


2. Children’s privacy

Consumers care about their kids, so regulators aren’t pulling their punches when it comes to children’s privacy. Increasingly, data-privacy rules (including a proposed COPPA amendment) now seek to protect under-18s — a significant strengthening from previous rules that focused on under-13s. 

Another game-changer: regulators are looking not just at products targeted at children, but at products that children are likely to access. As anyone who’s tried to keep tabs on a teenager’s online behavior knows, that’s a wide net to cast, and many digital platforms will need to upgrade their privacy infrastructure. 

Key takeaway: Opt-out and consent tools won’t be enough: some state laws prohibit any use of children’s information for marketing or sales, putting the onus on organizations to create separate tracks for handling children’s data. 


3. Consent & dark patterns

In 2023, regulators will continue their crackdown on “dark patterns” and other sneaky tricks that manipulate consent collection or allow data-gathering to fly under the radar. Of course, sneakiness is in the eye of the beholder, but if your UX nudges consumers to make a decision that favors your company, or makes it hard to opt out or request data deletion, don’t expect regulators to look on it kindly.  

Ongoing Meta Pixel litigation will also ensure that pixel-tracking technologies get continuing attention in coming months, especially insofar as they allow sensitive data such as healthcare information to be shared with third-party providers. We’re likely to see more enforcement actions as increased consumer awareness leads regulators to clamp down on these kinds of practices. 

Key takeaway: The key here is to align your practices with consumer expectations: tracking could be okay if users know about it, understand it, and consent to it. Cut corners or act furtively, though, and you’re potentially leaving yourself liable to suits under privacy rules or UDAP (unfair or deceptive acts and practices) regulations.
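To make that concrete, here is a minimal sketch (in TypeScript) of what consent-gated tracking can look like in practice. The consent log and the pixel call are illustrative placeholders, not any particular vendor’s API: the point is simply that the absence of a recorded choice is treated as “no,” and the pixel never fires without an affirmative opt-in.

```typescript
type ConsentPurpose = "analytics" | "advertising" | "functional";

interface ConsentRecord {
  purpose: ConsentPurpose;
  granted: boolean;
  timestamp: string; // when the user made the choice
}

// Hypothetical in-memory consent log; a real implementation would persist
// choices and keep them available for audit.
const consentLog: ConsentRecord[] = [];

function recordConsent(purpose: ConsentPurpose, granted: boolean): void {
  consentLog.push({ purpose, granted, timestamp: new Date().toISOString() });
}

function hasConsent(purpose: ConsentPurpose): boolean {
  // Absence of a record is treated as "no": silence is not consent.
  const latest = [...consentLog].reverse().find((c) => c.purpose === purpose);
  return latest?.granted ?? false;
}

function fireAdPixel(eventName: string): void {
  if (!hasConsent("advertising")) {
    return; // no recorded opt-in, so no tracking and no data shared downstream
  }
  // Placeholder for the actual pixel call (image beacon, SDK call, etc.).
  console.log(`pixel fired: ${eventName}`);
}

recordConsent("advertising", false);
fireAdPixel("page_view"); // silently skipped: the user opted out
```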


4. Data minimization

If you’re shopping for a jacket, it’s okay for your tailor to measure your arms or chest — but if they also log your health conditions and sexual orientation, well, that’s different. The same principle applies to data collected online: companies need to collect only data they truly need, and keep it no longer than necessary.

Some regulators effectively see this as a fiduciary issue: New York’s proposed Digital Fairness Act, for instance, frames data minimization in terms of making decisions that are in the user’s interest rather than your own. With state AGs and the FTC sending clear signals, data minimization will be a key priority for all organizations in coming months.

Key takeaway: You need a convincing story for every bit of data you hold. If the regulators come knocking, you should be able to explain exactly why you collected a consumer’s data, and why you’re still holding onto it. 
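As an illustration, here is a minimal sketch of a data inventory that ties every stored field to a documented purpose and retention period. The field names and retention windows are assumptions made for the example, not legal guidance; the point is that data without a documented story should be flagged for deletion.

```typescript
interface FieldPolicy {
  field: string;
  purpose: string;       // why the data point was collected
  retentionDays: number; // how long it may be kept
}

// Illustrative inventory: every field your systems store should appear here
// with a purpose you could explain to a regulator.
const dataInventory: FieldPolicy[] = [
  { field: "email", purpose: "order confirmations and receipts", retentionDays: 730 },
  { field: "shippingAddress", purpose: "order fulfillment", retentionDays: 365 },
];

function isRetainable(field: string, collectedAt: Date, now: Date = new Date()): boolean {
  const policy = dataInventory.find((p) => p.field === field);
  if (!policy) return false; // undocumented data: flag it for deletion
  const ageDays = (now.getTime() - collectedAt.getTime()) / (1000 * 60 * 60 * 24);
  return ageDays <= policy.retentionDays;
}

// A shipping address collected two years ago has outlived its stated purpose.
console.log(isRetainable("shippingAddress", new Date("2021-03-01")));
```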


5. Transparency & purpose specification

Being clear about what you’re collecting is the only way to give consumers meaningful choice about how their data is used. Transparency may not be enough, but it is definitely a required component and is shaping up to be a top priority for regulators this year. 

We’ll see a particular focus on clearly articulating the purpose for which data is collected and stating secondary uses, especially ones that are not immediately apparent to the user. This is already inherent in much of the regulatory language, but pending litigation and new applications of existing rules (such as the FTC’s rules against unfair and deceptive business practices) are likely to raise the stakes for businesses that fail to clearly communicate how data is being used.

Key takeaway: Don’t give your customers homework or make them dig through complex privacy policies. Instead, offer a simple explanation in plain English of what’s at stake, so consumers can decide for themselves how their data is used. 


6. US federal and state rulemaking

There are close to 20 different state data-privacy bills in the pipeline in the United States, which presents obvious problems — it’s that kaleidoscope again!

Even the existing state laws in California, Virginia, and Colorado differ on some points, and other states are introducing their own standards.

While we are seeing some optimistic winds from Congress after the Committee on Energy and Commerce meeting on March 1, we’re unlikely to see a single national framework anytime soon. Even a federal law wouldn’t preempt all state-level regulations, so organizations will have to live with a regulatory patchwork for the foreseeable future.

Key takeaway: Managing data privacy manually is difficult to sustain. Instead, set your priorities and find a system or platform that’s flexible enough to manage complexity in a simple and repeatable way. For more complex processing, an automated system may be what you need.


7. Data residency & cross-border transfers

Cross-border data transfers and data residency issues have been in the spotlight in recent months, with the OECD hammering out a kind of “gentleman’s agreement” on how intelligence agencies can access data, and Joe Biden issuing an executive order creating new safeguards for U.S. intelligence activities.

What does all that add up to? Well, the Biden order and the broader U.S.-EU Data Privacy Framework are still in draft status as we await an adequacy decision. We have now heard from the European Parliament and the EDPB, and we know they are expecting some improvements to the framework, especially aligning the commercial principles with the GDPR. I’m with the Stoic philosophers on this one: we can’t worry about US surveillance reform, which is out of most of our hands, but we can take steps to upgrade our privacy compliance so we are ready for the final EU-U.S. Data Privacy Framework certification criteria.

Key takeaway: Regulators on both sides of the pond will lean into enforcement as new rules are introduced. Take a look at your privacy compliance and see what you need to do to meet the current (and potentially upgraded) DPF principles.


8. Employee privacy

Newsflash: employees are people too, and, at least in California, they now have the same data rights as your customers. Under the CPRA, California employees and job applicants have clear privacy expectations that need to be respected, and state employee protections are unlikely to be preempted by any federal rulemaking.

That’s especially important given employers’ increasing reliance on automated employment decision technologies. A law in NYC, companion bills in New Jersey, DC, and elsewhere, and recent EEOC guidance make it clear that the use of AI is no defense against workplace discrimination claims, so employers will need to be very careful about the kinds of sensitive data — including disability status, gender, and race — that are captured or inferred by their machine learning tools.

Key takeaway: Put systems in place to govern how you use sensitive employee data, and adopt clear data deletion policies to ensure you retain only the data required to run your business. If you are using AI, audit it for fitness for purpose, the risks to individuals’ rights, and the risk of bias and discrimination.


9. AI governance

Artificial intelligence and machine learning are seeping into new areas of our organizations, and that creates big challenges when it comes to privacy and data management. What happens if a consumer wants you to delete data that was used to train an algorithm, for instance — do you need to “untrain” that model, and make it forget it ever saw the consumer’s data?

Regulators in the U.S. and Europe are considering questions like these, and asking probing questions about the interplay between AI, transparency, and consumers’ data rights. One important landmine to watch out for: wiretapping statutes are now being used to regulate capturing client conversations to train chatbots. 

Key takeaway: With existing rules being repurposed to enforce data-privacy priorities, waiting for new regulatory guidance could backfire. Be proactive: put transparency, risk assessment, and mitigation at the core of your AI strategy, alongside data minimization and clear notification and choice processes.
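For teams wondering what being proactive looks like in practice, here is a minimal sketch of one possible approach: recording which consumers’ records went into each training run, so a deletion request can flag models for retraining. The data structures and the retraining trigger are illustrative assumptions, not a description of any specific platform or of full machine unlearning.

```typescript
interface TrainingRun {
  modelVersion: string;
  trainedAt: string;
  contributorIds: Set<string>; // consumers whose records were in the training set
}

// Illustrative record of past training runs.
const runs: TrainingRun[] = [
  { modelVersion: "v1", trainedAt: "2023-01-15", contributorIds: new Set(["u1", "u2", "u3"]) },
];

// Deletion requests received after training.
const deletionRequests = new Set<string>();

function requestDeletion(consumerId: string): void {
  deletionRequests.add(consumerId);
}

// Which deployed models still reflect data from consumers who asked for deletion?
function modelsNeedingRetraining(): string[] {
  return runs
    .filter((run) => [...deletionRequests].some((id) => run.contributorIds.has(id)))
    .map((run) => run.modelVersion);
}

requestDeletion("u2");
console.log(modelsNeedingRetraining()); // ["v1"]: retrain without u2's records
```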


10. Risk assessments

The regulatory requirements for risk assessments are still in flux, with California regulations only at the public inquiry stage and Colorado rules not yet final. Does this mean you should put off adopting risk assessment processes now because they might need to be tweaked in the future? In short: no.

It’s a better idea to start now, using Colorado’s very detailed, almost-final rules and what we already know from the GDPR as the baselines. There will be plenty of overlap between the GDPR, California, and Colorado regulations, and it will be much easier to make the final changes you need once the rules are finalized than to redo entire data processing operations because you skipped a DPIA at the beginning and developed an app that unduly risks people’s rights.

Key takeaway: Don’t let the perfect become the enemy of the good. It will be far easier to keep pace if you’ve put a solid baseline risk assessment process in place. 
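As a starting point, here is a minimal sketch of what a baseline risk assessment record might capture, loosely modeled on the familiar GDPR DPIA elements (a description of the processing and its purposes, a necessity assessment, the risks to individuals, and the mitigating measures). The field names and example values are illustrative, not drawn from any final state regulation.

```typescript
interface RiskAssessment {
  processingActivity: string;      // what is being done with the data
  purposes: string[];              // why it is being done
  necessityJustification: string;  // why this data and this processing are needed
  risksToIndividuals: string[];    // e.g., sensitive inferences, discrimination
  mitigations: string[];           // e.g., minimization, access controls, retention limits
  residualRisk: "low" | "medium" | "high";
  reviewedAt: string;
}

// Illustrative example for one processing activity.
const targetedAdsAssessment: RiskAssessment = {
  processingActivity: "Targeted advertising based on on-site behavior",
  purposes: ["ad personalization"],
  necessityJustification: "Limited to first-party on-site events; no third-party enrichment",
  risksToIndividuals: ["sensitive inferences from browsing patterns"],
  mitigations: ["opt-out honored at collection", "90-day retention", "no sale of raw events"],
  residualRisk: "medium",
  reviewedAt: "2023-03-01",
};

console.log(`${targetedAdsAssessment.processingActivity}: residual risk ${targetedAdsAssessment.residualRisk}`);
```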


Prioritize and optimize for success in 2023

Data privacy has never been more important to consumers — or faced greater regulatory scrutiny. “Kaleidoscope 2023,” like an inbox with 650 unread emails or a sink full of crusted-over dirty dishes, seems very daunting.

The solution: start, prioritize, iterate, repeat. 

Organizations should prioritize the lowest common denominator: data minimization, consumer expectations, transparency, choice, and a buttoned-down supply chain.

In doing so, they should also optimize for flexibility, putting solutions in place that allow them to proactively adapt to new rules and evolving consumer expectations.

With so much at stake in 2023, it’s worth investing now in the solutions your company needs to succeed in the new era of modern data privacy. 


Want to see more from Odia Kagan and Ketch? Watch our on-demand IAPP webinar on 2023 data privacy trends.

