Joining Tumult Labs

— updated

I'm joining Tumult Labs, a startup focused on making differential privacy ubiquitous. I'm incredibly excited about taking DP to the next stage, and I think Tumult Labs is a fantastic place to do that. Let me tell you why!

Why differential privacy?

Differential privacy (DP) makes it possible to safely publish statistics about sensitive data. Today, using DP in practice isn't very easy, but potential applications are everywhere. I'm hopeful that by making this tech more usable and more useful, we can make progress on three fronts.
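To make "publishing statistics safely" concrete, here is a minimal sketch of the classic Laplace mechanism applied to a counting query. This is illustrative code written for this post, not Tumult's tooling: a count has sensitivity 1 (adding or removing one person changes it by at most 1), so adding Laplace noise with scale 1/ε yields ε-differential privacy.

```python
import math
import random

def dp_count(true_count, epsilon):
    """Return an epsilon-differentially-private version of a count.

    A counting query has sensitivity 1, so Laplace noise with
    scale 1/epsilon suffices for epsilon-DP.
    """
    # Inverse-CDF sampling from a centered Laplace distribution:
    # u is uniform on (-0.5, 0.5).
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Example: publish a noisy count with a privacy budget of epsilon = 1.
noisy = dp_count(true_count=10_000, epsilon=1.0)
```

Smaller values of ε mean more noise and stronger privacy; real deployments also have to track the total budget spent across many queries, which is one of the things good tooling handles for you.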

Making existing practices safer

Today, many organizations compute statistics about people and share them with third parties. Government agencies publish information about their citizens. Online platforms give content creators data about their audience. Hospitals share patient data with researchers. And of course, a myriad of commercial applications exist too.

All this data is often more revealing than people expect: anonymization is difficult to get right, and easy to get wrong. Problems happen all the time. Worse, these problems are especially harmful to folks in marginalized populations. Only releasing aggregate data, or using clever-looking mitigations, isn't enough. The US Census reconstruction attack made that very clear.

Worse, these are only the problems we hear about. Many organizations generate and archive statistical data, without ever sharing it with outsiders. What if they misclassify this data as "anonymized", even though reidentification is possible? This can have painful compliance implications, even though it might not make headlines.

Differential privacy is the only tool we know that prevents these problems. The math works: DP provides provable guarantees. The main obstacle to adoption is usability. So let's work on that, and make all these practices much safer!

Enabling new use cases

These existing practices are only the tip of the iceberg. How many applications never see the light of day, for privacy and compliance reasons? Governments want to do more open data, but don't want to breach their constituents' privacy. Organizations want to share data with researchers, without risking privacy incidents. Valuable data gets deleted, even though it could have been shared with others, or at least partially archived.

Robust and accessible anonymization tooling will enable many such use cases. It will lower the cost of both compliance and respectful privacy practices. Today, DP is still a shiny new concept that requires considerable investment to use: only large organizations can afford it. In a few years, we can turn DP into a standard industry best practice, like encryption at rest or audit trails. I can't wait for us to get there!

Improving incentive structures

What happens once it gets dead simple to use provable anonymization techniques? Enabling the use cases I mentioned before is one thing. But there might also be beneficial ripple effects, much more impactful in the long run.

Here's an example. Today, large companies can see value in sharing data with researchers. But the choice is theirs. What if they don't see the point? Or worse, what if they're afraid of what the research might uncover? Nothing stops them from keeping the data secret. Pushing these companies to be more transparent about their impact is difficult: privacy can be a convenient excuse to keep damaging information secret.

Once differential privacy is the new standard, and using it is easy, this excuse goes away. It will be easier to demand openness from companies holding massive amounts of data. This could happen, for example, through legislation. Today, regulators who want to mandate such transparency face a difficult task: how to do so without creating requirements that conflict with data protection regulations? Differential privacy could change the equation.

And legislation isn't the only path: changing cultural norms can have a massive impact as well. The easier it is to share data, the more organizations will. Over time, the perception of what is and isn't legitimate to keep secret will shift. And such a global shift towards openness can create a lot of pressure!

So that's why I'm excited about differential privacy. Now, why do I think Tumult Labs is a great place for making it successful?

Why Tumult Labs?

Tumult Labs was founded by three professors, who have been working on DP for more than a decade. They worked on foundational theory and on DP's first-ever real-world deployment. Now, they're advising the US Census Bureau on some of their most complex challenges.

Their work on real-world deployments led them to a realization I'm quite familiar with: if we want to reach wide adoption of DP, we have to invest in building excellent tooling. So they've been assembling a talented team to do exactly that: build what their clients need into a single general-purpose platform. And I was very impressed by what they published about this work! It shows both deep technical insight, and a focus on building things in a modular and reusable way. I can't wait to dig into this tech some more, and learn from the folks behind it.

Finally, I'm also eager to tackle the personal challenges that await. Being a part of a small startup should be a much-needed change of pace. Helping external clients will be new and pretty exciting. I'll also be Tumult's first employee in Europe, but not the last: I'm thrilled about the idea of growing the team there. Speaking of which, if the mission resonates with you, check out our careers page, or drop me a line directly!

All opinions here are my own, not my employers'.
I'm always glad to get feedback! If you'd like to contact me, please do so via e-mail (se.niatnofsed@neimad) or Twitter (@TedOnPrivacy).