From the rise of AI to new state and federal legislation, big changes are coming for the privacy industry — but amidst all the think pieces and regulatory updates, it can be hard to find actionable advice and practical guidance.
Recently, Ketch joined forces with Kelley Drye for a new kind of privacy event: a hands-on workshop for privacy practitioners to learn from tech and legal experts, discuss the transformations impacting their businesses, and work together to find pragmatic solutions.
Practical tips for 3 key data privacy challenges
Here are the three biggest challenges that attendees wrestled with during the workshop:
1. How can we collaborate effectively to complete risk assessments, specifically DPIAs?
Data Protection Impact Assessments (DPIAs) can seem daunting, say Kelley Drye partner Aaron Burstein and Network Advertising Initiative VP for Public Policy David LeDuc, but they’re really a way of telling a story about your organization’s data use.
DPIAs can be more than just a regulatory requirement — they’re an opportunity to develop a comprehensive overview of your risk exposure, spanning your organization’s data practices, privacy governance, cross-team and cross-disciplinary engagement, and diligence processes with partners and vendors.
Here’s what you need to know to manage DPIA obligations effectively:
- The where: As of this year, DPIAs are required in Virginia, Colorado, and Connecticut — plus California, although the state hasn’t yet finalized its regulations. With privacy laws in Montana, Tennessee, Texas, and Indiana also requiring DPIAs, it’s a fair bet that most other states will follow the trend, imposing DPIA obligations on most U.S. companies.
- The when: Common DPIA triggers include targeted advertising and the sale of personal data, and also profiling that involves sensitive data or carries a risk of injury or disparate treatment. Rules on DPIA submission and review vary from state to state, however: California requires proactive submission of DPIAs on a “regular basis,” while other states take an on-demand approach with attorneys general empowered to demand DPIAs from businesses.
- The what: Requirements vary by jurisdiction, but generally include a good-faith effort to articulate the risks and benefits of data processing. Colorado, for instance, requires a “genuine, thoughtful” analysis. Organizations should clearly state their processing activity, list specific risks and mitigation efforts (including technical measures and training), and communicate the benefits to individuals.
- The how: As DPIAs grow more common, there’s a risk of fatigue setting in unless you put efficient systems in place to reduce the burden. Fortunately, a single DPIA can cover a number of materially similar data-processing activities, as long as it’s updated when processing activities change or the risk profile evolves over time (a lightweight register like the sketch at the end of this section can help keep track).
The key will be to build out cross-disciplinary DPIA programs that enable collaboration between legal, privacy, and tech teams. That might require appointing a Data Protection Officer, or empowering existing privacy leaders to coordinate the assessment process. With the right processes and leadership, a DPIA is more than just checking boxes — it’s a real opportunity to assess and improve privacy and data-handling operations across your entire organization.
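To make the point about covering materially similar activities concrete, here’s a minimal Python sketch of a hypothetical DPIA register that maps each assessment to the processing activities it covers and flags anything whose risk profile no longer matches. The class names, fields, and risk-profile labels are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ProcessingActivity:
    name: str
    purpose: str
    data_categories: list[str]   # e.g. ["email", "purchase history"]
    risk_profile: str            # e.g. "targeted advertising"

@dataclass
class DPIA:
    title: str
    completed: date
    risk_profile: str
    covered_activities: list[ProcessingActivity] = field(default_factory=list)

    def covers(self, activity: ProcessingActivity) -> bool:
        """A DPIA can cover materially similar activities; a different risk
        profile should trigger an update or a fresh assessment."""
        return activity.risk_profile == self.risk_profile

# One assessment covering a family of similar ad-personalization activities.
dpia = DPIA(
    title="Targeted advertising on owned properties",
    completed=date(2024, 1, 15),
    risk_profile="targeted advertising",
)
dpia.covered_activities.append(
    ProcessingActivity("Email retargeting", "ad personalization",
                       ["email", "purchase history"], "targeted advertising")
)

# A change in risk profile (here, profiling with sensitive data) flags the
# need to revisit the assessment rather than reuse it as-is.
new_activity = ProcessingActivity("Health-interest segments", "ad personalization",
                                  ["browsing history", "health interests"],
                                  "profiling with sensitive data")
print(dpia.covers(new_activity))  # False -> update the DPIA or run a new one
```

Even a lightweight register like this gives legal, privacy, and tech teams a shared view of which assessments exist and when they need to be refreshed.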
2. What do privacy leaders need to understand about clean rooms?
Clean rooms are a key part of the new privacy landscape, but they’re also poorly understood, warn Kelley Drye’s privacy and InfoSec chair Alysa Hutnik and Shullman Advisory founder Julia Shullman. That’s a dangerous combination because it fuels marketing spin and mythmaking, potentially leading organizations to make costly mistakes or poor investment choices.
To be clear on clean rooms, here’s what you need to know:
- What clean rooms are: A clean room is a framework for collaborative data analysis in a controlled environment, with restrictions on how data can be viewed and exported, and a range of tools to support privacy and data protection. Using a clean room, it’s possible to collaborate with partners while still ensuring the privacy and security of first-party data (the sketch at the end of this section shows one of these controls in miniature).
- What clean rooms aren’t: Clean rooms are powerful, but they aren’t magical places where ordinary privacy laws don’t apply. The legal requirements differ based on the use-cases and processing involved, but they never go away entirely. We should think of clean rooms as a way of getting things done while still complying with relevant laws and regulations — not a way of sidestepping the statute book altogether.
- What’s still TBD: Third-party cookie deprecation and the rise of new privacy laws are clarifying the need for solutions that enable continuing collaboration around permissioned first-party data. But while the need is clear, questions remain: we’re still waiting for an industry consensus to emerge around the way that clean rooms should operate. Data-processing standards, canonical use-cases, and regulatory perspectives are all still evolving.
- What regulators are watching for: The specific regulations impacting clean-room use will vary by jurisdiction and use-case. When reviewing your privacy and compliance programs, it’s important to consider which direction the data is flowing (are you providing or receiving data?), what types of data are being used, and what limitations are being imposed on how data can be acted on and extracted from the clean room.
Given all of the above, it’s important to be clear about both your own responsibilities and those of the clean room provider. Your service provider contract should explicitly prohibit the sale or sharing of any personal information that passes through the clean room, and should also spell out all relevant privacy compliance requirements. Make sure these obligations also flow down to any subcontractors, and that first-party and third-party data are only commingled in ways that are expressly permitted by regulators.
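To show what “restrictions on how data can be viewed and exported” can look like in practice, here’s a minimal sketch of one common clean-room control: matched audiences are only reported as aggregates above a minimum size. The threshold, function name, and hashed identifiers are hypothetical; real clean-room products enforce controls like this inside their own environments, through their own query interfaces.

```python
# Minimal illustration of a clean-room-style export control: only aggregate
# results above a minimum audience size leave the controlled environment.
MIN_AUDIENCE_SIZE = 50  # hypothetical reporting threshold

def matched_audience_report(advertiser_ids: set[str], publisher_ids: set[str]) -> dict:
    """Match two parties' hashed first-party identifiers and return only an
    aggregate count; row-level matches never leave the environment."""
    matched = len(advertiser_ids & publisher_ids)
    if matched < MIN_AUDIENCE_SIZE:
        # Suppress small segments that could be re-identifiable.
        return {"matched_users": None, "note": "below reporting threshold"}
    return {"matched_users": matched}

# Toy example with placeholder hashed IDs; the small overlap is suppressed.
print(matched_audience_report({"h1", "h2", "h3"}, {"h2", "h3", "h4"}))
# -> {'matched_users': None, 'note': 'below reporting threshold'}
```

For any clean-room arrangement, it’s worth asking which controls of this kind actually exist, who configures them, and what your contract says about results that fall outside them.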
3. How do we build privacy guardrails for generative AI?
Artificial intelligence is becoming a top concern for many privacy leaders. In a discussion with Alysa Hutnik of Kelley Drye and Ketch CTO Vivek Vaidya, we explored how privacy concerns and ethical considerations intersect as organizations deploy new generative AI technologies.
One key insight: when it comes to AI, running an effective privacy strategy means staying focused on real-world consequences. There aren’t yet many specific regulations covering data privacy and AI, so regulators will use consumer protection laws banning unfair and deceptive practices to police AI data privacy.
To stay safe, consider implementing these guardrails:
- Track inputs and outputs. If personal information is being fed into your AI algorithms, your privacy team needs to know about it and ensure the inputs are properly permissioned for the specific use-case in question. It’s also important to monitor the sensitivity of the data that’s being used, and the degree to which such data persists in algorithms or their outputs. As a practical matter, it may be easiest for privacy teams to pre-approve some specific use-cases while developing policies and monitoring systems to manage other applications of AI technologies (see the sketch after this list).
- For internal use-cases: Using AI internally is generally less challenging, but you’ll need to make sure that “internal” really means internal. You need clear frameworks for sourcing training data and ensuring that no personal or confidential information is used in inappropriate ways, even for internal purposes. Be imaginative when considering worst-case scenarios: it’s easier than you’d think for internal tools and data to seep into external applications, creating serious privacy vulnerabilities.
- For external use-cases: The stakes are undoubtedly higher when working with commercial or public-facing AI technologies, so make sure you’re conducting risk assessments and putting mitigation strategies in place. If you’re using third-party data as part of your AI solutions, make sure you’re doing enough to vet and monitor the incoming data streams. Remember, the goal isn’t just compliance with regulations — it’s creating systems that your customers can trust, both to protect your brand reputation and to ensure a continuing stream of permissioned data for your algorithms.
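As a starting point for the “track inputs and outputs” guardrail, here’s a minimal sketch of a pre-submission check a privacy team might place in front of a generative AI integration. The use-case allow-list and the regular expressions are hypothetical placeholders for illustration, not a complete PII detector.

```python
import re

# Hypothetical allow-list of use-cases the privacy team has pre-approved.
APPROVED_USE_CASES = {"internal_code_review", "marketing_copy_draft"}

# Rough patterns for illustration only; a real deployment would rely on a
# proper classification or redaction service rather than a few regexes.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def check_prompt(prompt: str, use_case: str) -> tuple[bool, list[str]]:
    """Return (allowed, findings): block unapproved use-cases and surface any
    apparent personal information for review before the prompt is sent."""
    findings = [name for name, pattern in PII_PATTERNS.items()
                if pattern.search(prompt)]
    allowed = use_case in APPROVED_USE_CASES and not findings
    return allowed, findings

allowed, findings = check_prompt(
    "Summarize the feedback from jane.doe@example.com", "marketing_copy_draft")
print(allowed, findings)  # False ['email'] -> route to privacy review first
```

The same pattern can be applied on the output side, logging what the model returns so the privacy team can monitor whether personal data persists in responses.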
Maintaining a fully permissioned, privacy-safe data set is critical for GenAI initiatives — not least because if organizations find their access to data limited by privacy and compliance gaps, their AI initiatives will be dead on arrival.
Building out transparent data processes and enforcing clear ethical boundaries doesn’t constrain innovation — it makes it possible to drive innovation in enduring and sustainable ways.
Find a path to privacy maturity
With new DPIA requirements, the rise of clean rooms, and the emergence of AI, privacy professionals have their work cut out for them in the coming months. To succeed, they need to build mature privacy programs capable of adapting to emerging challenges.
To achieve this, privacy professionals need to take a pragmatic approach to meeting their organization’s evolving needs. Privacy leaders can’t simply focus on data privacy, risk mitigation, or even securing customer trust. They need to do all three of those things, while also leveraging new technologies and building cross-functional operational efficiencies, in order to deliver the pragmatic and effective privacy programs their organizations need.