When I first started working on Microsoft's privacy-preserving strategy, I found that most people in advertising treated "privacy-enhancing technologies" as a single abstract concept: something that would magically solve the post-cookie problem without anyone having to change how they actually build systems. That's not how it works.
PETs are a family of distinct mathematical and engineering approaches, each with specific strengths, real limitations, and trade-offs that matter enormously in practice. Here's how I think about them after several years of working to integrate them into real advertising infrastructure.
Differential privacy: adding noise on purpose
Differential privacy is mathematically elegant. You add carefully calibrated noise to query results so that no individual record can be reverse-engineered from the output. Apple uses it for keyboard telemetry. Google uses it in Chrome reports. At Microsoft, we've applied it to audience analytics where advertisers need aggregate insights without accessing individual-level data.
The catch is the privacy-utility trade-off. More noise means more privacy but less precision. For large datasets this works well — the noise washes out. For small publishers with limited traffic, the noise can overwhelm the signal entirely. This is a real constraint that I've seen glossed over in too many vendor pitches.
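The mechanics are simple enough to sketch. Below is a minimal, illustrative implementation of the Laplace mechanism for a counting query (not Microsoft's actual system, and the epsilon value and audience sizes are made up for the example). A count has sensitivity 1, so noise drawn from Laplace(0, 1/ε) gives ε-differential privacy; because the noise scale is fixed by ε, the relative error collapses for large audiences and dominates for small ones:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the result by at most 1, so Laplace noise with scale
    # 1/epsilon yields epsilon-differential privacy.
    return true_count + laplace_noise(1.0 / epsilon)

# The noise scale depends only on epsilon, not on the data size, so the
# relative error shrinks as the true count grows: a large publisher barely
# notices it, a small one can lose the signal entirely.
random.seed(7)
for audience in (1_000_000, 50):
    noisy = private_count(audience, epsilon=0.5)
    print(f"true={audience:>9}  noisy={noisy:>12.1f}  "
          f"rel_error={abs(noisy - audience) / audience:.4%}")
```

Running this a few times makes the trade-off tangible: at ε = 0.5 the noise has a standard deviation of roughly 2.8 users, negligible against a million, painful against fifty.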
Federated learning: keeping data where it lives
Instead of centralising user data to train a model, federated learning brings the model to the data. Devices train locally, then share only gradient updates with a central server. Google's Gboard was the canonical example; now the approach is spreading to ad tech for things like bid optimisation and content recommendation.
I'm genuinely excited about this one. It aligns the technical architecture with user expectations — your browsing data never leaves your device. But it requires careful implementation to prevent gradient attacks that can sometimes reconstruct individual data points from the updates. The devil, as always, is in the details.
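To make the architecture concrete, here is a toy FedAvg-style round for a one-parameter linear model. Everything about it is illustrative (the data, the learning rate, the three "devices"); the point is the data flow: each client computes an update on data that never leaves it, and the server sees only the averaged updates:

```python
import random

def local_update(weight: float, data, lr: float = 0.1) -> float:
    # One gradient-descent step on a client's private data for a simple
    # model y = w * x. Only this update leaves the device, never the data.
    grad = sum(2 * x * (weight * x - y) for x, y in data) / len(data)
    return -lr * grad

def federated_round(weight: float, clients) -> float:
    # The server averages the clients' updates (FedAvg-style) without
    # ever seeing any client's raw examples.
    updates = [local_update(weight, data) for data in clients]
    return weight + sum(updates) / len(updates)

# Three hypothetical devices, each holding private samples of y = 3x.
random.seed(1)
clients = [[(x, 3 * x) for x in (random.random() for _ in range(20))]
           for _ in range(3)]

w = 0.0
for _ in range(200):
    w = federated_round(w, clients)
print(f"learned weight: {w:.3f}")  # converges toward 3.0
```

The gradient-attack risk mentioned above lives precisely in those `local_update` return values: with small batches, raw gradients can leak enough structure to reconstruct inputs, which is why production systems layer on secure aggregation or add noise to the updates themselves.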
Clean rooms: collaboration without exposure
Data clean rooms have become the practical workhorse of privacy-preserving measurement. An advertiser and publisher can combine their datasets in a controlled environment, run aggregate queries, and get measurement results without either party seeing the other's raw data.
I've been involved in several clean room deployments and the technology genuinely works. But the governance is as important as the cryptography. Who defines the queries? What aggregation thresholds prevent re-identification? How do you audit compliance? These operational questions are where most clean room implementations succeed or fail.
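Those governance questions map directly onto code. The sketch below shows the shape of a clean-room aggregate query: hashed join keys, campaign-level counts, and an aggregation threshold that suppresses small cells. The threshold value, field names, and data are all invented for illustration; real deployments negotiate these as policy, which is exactly the point:

```python
import hashlib

K_ANON_THRESHOLD = 10  # assumed policy: suppress any cell smaller than this

def hashed(user_id: str) -> str:
    # Both parties pseudonymise identifiers the same way before
    # contributing rows to the clean room.
    return hashlib.sha256(user_id.encode()).hexdigest()

def clean_room_report(impressions, conversions):
    """Return campaign-level conversion counts. No row-level data is
    returned, and cells below the aggregation threshold are suppressed."""
    converted = {hashed(u) for u in conversions}
    counts = {}
    for campaign, user in impressions:
        if hashed(user) in converted:
            counts[campaign] = counts.get(campaign, 0) + 1
    return {c: n for c, n in counts.items() if n >= K_ANON_THRESHOLD}

# Publisher-side impressions (campaign, user) and advertiser-side conversions.
impressions = ([("spring_sale", f"user{i}") for i in range(40)]
               + [("niche_test", "user1"), ("niche_test", "user2")])
conversions = [f"user{i}" for i in range(25)]

print(clean_room_report(impressions, conversions))
# "niche_test" matched only 2 users, below the threshold, so it is suppressed.
```

Notice that the interesting decisions sit outside the function: who sets `K_ANON_THRESHOLD`, who is allowed to define new queries, and who audits the logs. The cryptography is the easy part.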
Homomorphic encryption and MPC
Homomorphic encryption — computing on encrypted data without decrypting it — sounds like science fiction, and frankly it still mostly is for real-time advertising use cases. The computational overhead remains prohibitive for anything that needs to happen in milliseconds. But for offline reporting and attribution, it's becoming practical.
Secure multi-party computation (MPC) has a better performance profile for advertising. Multiple parties can jointly compute a result without any single party seeing the others' inputs. Meta's been using MPC for conversion measurement. It's real, it works, and it's going to become standard infrastructure within a few years.
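The core trick behind additive-secret-sharing MPC fits in a few lines. In this illustrative sketch (toy parties and counts, not any production protocol), each party splits its private conversion count into random-looking shares, distributes them, and only the sums of shares are ever combined, so no single party sees another's input:

```python
import random

PRIME = 2**61 - 1  # all share arithmetic is done modulo a large prime

def share(secret: int, n_parties: int):
    # Split a value into n shares that are individually uniform random
    # but sum to the secret modulo PRIME.
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares) -> int:
    return sum(shares) % PRIME

# Three parties each hold a private conversion count. Each splits its
# count and sends one share to every party; each party adds up the
# shares it received, and only those partial sums are pooled.
private_counts = [120, 45, 310]
all_shares = [share(v, 3) for v in private_counts]
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

print(reconstruct(partial_sums))  # 475: the total, no individual input revealed
```

Each individual share is statistically independent of the secret it came from, which is why a party holding one share of your count learns nothing about it. Real deployments add malicious-security checks on top, but the data-flow intuition is exactly this.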
Zero-knowledge proofs
ZK proofs let you prove a claim without revealing underlying data. Prove you're over 18 without sharing your birthdate. Prove a user saw an ad without revealing which user. The applications for age-gating, frequency capping, and privacy-preserving verification are significant.
The technology is maturing fast, driven largely by blockchain applications, but the advertising use cases are where I think ZK proofs could have the most meaningful societal impact.
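For intuition, here is a toy Schnorr-style proof of knowledge made non-interactive with the Fiat-Shamir heuristic. The prover convinces a verifier it knows a secret exponent x with y = g^x mod p without ever revealing x. The parameters are deliberately small illustrative values, not a secure group, and this is a textbook construction rather than any production ZK system:

```python
import hashlib
import random

P = 2**127 - 1  # a Mersenne prime; illustrative only, not a safe group
G = 5

def fiat_shamir(*values) -> int:
    # Derive the challenge by hashing the transcript, making the
    # interactive protocol non-interactive.
    data = ":".join(str(v) for v in values).encode()
    return int(hashlib.sha256(data).hexdigest(), 16) % P

def prove(x: int):
    y = pow(G, x, P)                 # public value; x stays secret
    r = random.randrange(P - 1)
    t = pow(G, r, P)                 # commitment
    c = fiat_shamir(G, y, t)         # challenge
    s = (r + c * x) % (P - 1)        # response; reveals nothing about x alone
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    c = fiat_shamir(G, y, t)
    # g^s = g^(r + c*x) = t * y^c holds only if the prover knew x.
    return pow(G, s, P) == (t * pow(y, c, P)) % P

y, t, s = prove(x=123456789)
print(verify(y, t, s))  # True
```

The same pattern, prove a property of hidden data by checking an algebraic relation on commitments, is what underlies the age-gating and frequency-capping applications: the verifier learns that the claim holds and nothing else.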
What actually matters
No single PET solves everything. The real skill is combining them appropriately — differential privacy for aggregate reporting, clean rooms for cross-platform measurement, federated learning for model training. The companies that treat PETs as an integrated toolkit rather than a checkbox will be the ones that build sustainable advertising businesses in a privacy-regulated world.
The companies that wait for a single magic solution will be waiting a long time.