
5 ways to unify your legal and tech teams in the AI era

Privacy is more than just a team sport. Here’s how to drive real alignment across functional areas.
6 min read · Last updated July 18, 2024

Privacy has always been a team sport — but in the AI era, it’s more important than ever for legal and tech leaders to understand and support one another’s efforts. Regulators are paying close attention to emerging AI technologies, and new research shows that consumers are wary of the new ways businesses are using their data. To realize the enormous potential of AI without damaging their brand, organizations need legal and technical teams to work in lockstep to orchestrate responsible data practices across an increasingly complex and fast-moving data ecosystem.

AI governance is a team sport: 5 tips for collaboration

Nobody understands that challenge better than Kelley Drye partner Alysa Hutnik and Ketch co-founder and CTO Vivek Vaidya — two of the privacy industry’s leading legal and technical experts. We brought them together for an IAPP webinar to dig into the challenges of this new era — and explore how legal and tech teams can find common ground and work together effectively to manage the risks and seize the growth opportunities of the AI revolution.

Here are five key takeaways from their discussion:

1. Privacy isn’t just a legal challenge 

According to Gartner, 40% of organizations now have thousands of AI models deployed, creating major governance challenges for both legal and technical teams. Lawyers already spend much of their time trying to see around corners and anticipate how data that’s collected for one purpose or use case could seep into new AI models and be used in new, potentially more problematic ways. “You have to use your imagination: how will that information be used?” Alysa warns.

As more and more AI models are added to the mix, it gets harder for legal teams to navigate the resultant complexity. To cope, organizations need technological solutions to prevent sensitive data from being fed into AI models, and to manage the countless different AI models they’re deploying. Which datasets are drawn on by which AI models for what purpose, and how can that data be tracked and permissioned as it flows through your business? “This is Governance 101,” Vivek explains. “It starts with creating a system of record where you can track all these things, so you can confidently answer these kinds of questions.” 
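To make that idea concrete, here’s a minimal sketch of what such a system of record could look like: a simple registry that records which datasets feed which models and for what purpose, so questions like the ones above can be answered on demand. The class and field names are illustrative assumptions, not Ketch’s product or any particular vendor’s API.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Dataset:
    name: str
    contains_sensitive_data: bool = False

@dataclass
class ModelRecord:
    model_name: str
    purpose: str
    training_datasets: list[Dataset] = field(default_factory=list)

class AIModelRegistry:
    """Central system of record: which datasets feed which models, and why."""

    def __init__(self) -> None:
        self._models: dict[str, ModelRecord] = {}

    def register_model(self, record: ModelRecord) -> None:
        self._models[record.model_name] = record

    def models_using_dataset(self, dataset_name: str) -> list[str]:
        """Answer: which models were trained on this dataset?"""
        return [
            m.model_name
            for m in self._models.values()
            if any(d.name == dataset_name for d in m.training_datasets)
        ]

    def models_with_sensitive_inputs(self) -> list[str]:
        """Flag models whose training data includes sensitive datasets."""
        return [
            m.model_name
            for m in self._models.values()
            if any(d.contains_sensitive_data for d in m.training_datasets)
        ]

# Example: register a model and trace its data lineage.
registry = AIModelRegistry()
registry.register_model(ModelRecord(
    model_name="churn-predictor-v2",
    purpose="customer retention analytics",
    training_datasets=[Dataset("crm_contacts", contains_sensitive_data=True)],
))
print(registry.models_using_dataset("crm_contacts"))   # ['churn-predictor-v2']
print(registry.models_with_sensitive_inputs())         # ['churn-predictor-v2']
```

In practice this kind of lineage tracking lives in a governance platform or data catalog; the point is simply that the answers become queryable rather than tribal knowledge.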

2. Get clear about who owns what

In the AI era, there’s no room for turf wars. That means privacy initiatives need clear CEO-level sponsorship to ensure buy-in, Vivek says, and also clear investment in and ownership of privacy processes by the teams that are actually building and implementing AI solutions. “Privacy requires collaboration across a large number of stakeholders,” Vivek says. “But my preference is for this to ultimately be owned by the CTO, because they're the ones responsible for putting these things in production.”

That approach works well for legal teams, Alysa agrees, because it ensures that AI privacy initiatives aren’t seen merely as one more compliance box to be checked. Legal should play a supporting and guiding role from early on in any new project, but privacy needs to be seen as a core business priority for development teams and for the organization as a whole. “This is a business plan, not a legal plan,” Alysa says. “The legal team is going to support and enable it, but these are really business goals that have to be attacked head on.”

3. To win consumer trust, start with common sense 

Companies are running a deficit when it comes to consumer trust, with more than three-quarters of consumers saying they’d prefer to go back to a simpler time when businesses didn’t know so much about them. That’s fueled by confusion about how AI operates, with some consumers even believing companies are using AI to tap into their phones or smart devices and eavesdrop on their conversations. That underscores the importance of clear communication around new AI projects, Vivek says. “Companies aren’t transparent about how they use data,” he warns. “That’s what leads to all these conspiracy theories.”

Part of the problem is that companies are leaning too hard on privacy policies to alleviate such concerns, Alysa says. Privacy policies are important legal documents, but they’re written for lawyers and regulators, and focus on ensuring compliance rather than empowering consumers. That means it’s up to marketing and communication teams to effectively explain how AI is being used, and what that means for consumers’ data. “Nobody really expects consumers to take the time to read your privacy policy,” Alysa says. “The trust factor, the relationship factor, comes when you have a real conversation with the consumer.”

4. Build mutual understanding

To get lawyers and technical teams on the same page, it’s important to bring legal and privacy experts into the process of planning and developing AI projects, Alysa says. The old-school approach, where lawyers and privacy professionals parachuted in late in the day to create notices and disclosures for a new project, doesn’t work for AI initiatives that absorb vast amounts of data early in the training and development process. “We need input in the planning, so organizations can bake in privacy and security and spot issues at the outset,” Alysa explains.

In much the same way, Vivek agrees, tech teams need to think seriously about privacy from the earliest days of project design. Instead of building a solution and then trying to layer on point solutions to comply with specific regulatory requirements, teams need to think early and often about how consumers experience privacy while using their tools. “Put yourself in the consumer’s shoes — how would you want to be treated?” Vivek advises. “Ask yourself those questions, then design a data architecture that builds trust with your consumers.”
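As a rough illustration of what “designing privacy in” can mean for engineers, the sketch below gates each data use on recorded, purpose-specific consent and defaults to deny, rather than bolting a check on after the system is built. The purposes, store, and function names are hypothetical placeholders, not a prescribed implementation.

```python
from enum import Enum

class Purpose(Enum):
    ANALYTICS = "analytics"
    PERSONALIZATION = "personalization"
    MODEL_TRAINING = "model_training"

# Hypothetical consent store: in a real system this would be backed by a
# consent management platform, not an in-memory dict.
consent_store: dict[str, set[Purpose]] = {
    "user-123": {Purpose.ANALYTICS},  # this user opted in to analytics only
}

def has_consent(user_id: str, purpose: Purpose) -> bool:
    """Check recorded consent before any processing; unknown users get no consent."""
    return purpose in consent_store.get(user_id, set())

def add_to_training_set(user_id: str, record: dict) -> None:
    """Only use a record for model training if the user agreed to that purpose."""
    if not has_consent(user_id, Purpose.MODEL_TRAINING):
        return  # default deny: the user never agreed to this purpose
    # ... append the record to the training pipeline here ...

add_to_training_set("user-123", {"clicks": 42})  # no-op: no training consent recorded
```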

5. Focus on the opportunities

Privacy planning often focuses on the potential for things to go wrong, and there are certainly risks involved. But there’s also a tasty carrot to go along with the stick. According to Ketch’s research on consumer perspectives, purchase intent jumps 15.4% when companies use ethical data practices. That means companies that build responsibly and communicate effectively could see big revenue gains in coming years. “Often lawyers aren’t comfortable talking about that angle — we just talk about compliance,” Alysa says. “But that has to change as we work together to solve these problems.”

A new focus on opportunities instead of (or as well as) risks is needed to keep legal and technical teams on the same page as organizations deploy AI systems. Engineers need to make consent a top priority in everything they build, Vivek says, while legal teams must support that effort by working with data engineers to surface opportunities to implement and promote responsible data practices. “We need to think offense, and not just defense,” Vivek adds. “Don’t think of it as compliance. Think of it as an opportunity to elevate your business.”

AI and privacy are inseparable

For today’s organizations, AI and privacy are now joined at the hip — and that means legal and technical teams will need to stick together, too. Engineers will need to put privacy first as they develop new technologies, and legal teams will need to leverage technological solutions and new kinds of data infrastructure to help them navigate the new reality of ubiquitous and data-hungry AI technologies.

At Ketch, we’re building the privacy-safe infrastructure that legal and technical teams need to ensure end-to-end responsible data handling in the AI era. From programmatic privacy tools designed to rapidly enforce legal interpretations across complex data ecosystems, to robust protections to ensure that sensitive business or consumer data isn’t improperly fed into GenAI tools, Ketch is here to establish and scale the infrastructure your organization needs. 
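By way of illustration only (this is not Ketch’s implementation), here is one simple shape such a protection can take: a redaction pass that strips likely sensitive values from text before it is ever sent to a GenAI tool. The patterns and function names are placeholders; a production guardrail would rely on proper data classification and policy enforcement rather than a handful of regexes.

```python
import re

# Illustrative-only patterns; real guardrails use a classification service,
# not a short list of regexes.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact_sensitive(text: str) -> str:
    """Replace likely sensitive values before text leaves the organization."""
    for label, pattern in SENSITIVE_PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text

def safe_prompt(user_text: str) -> str:
    """Gate every outbound GenAI prompt through the redaction pass."""
    return redact_sensitive(user_text)

print(safe_prompt("Contact jane.doe@example.com, SSN 123-45-6789"))
# Contact [REDACTED EMAIL], SSN [REDACTED SSN]
```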

Get in touch to find out more — and learn how Ketch can keep your technical and legal teams pulling in the same direction to deliver privacy-safe solutions for the AI era.

Go further:

Need to assemble your privacy dream team?

Learn more about how to build consensus and support in your business with our Practical Guide to Privacy.

