Since the original introduction of differential privacy, many variants and extensions have been proposed to adapt it to different contexts or assumptions. These new definitions give practitioners privacy guarantees even in cases that the original DP definition does not cover well. This happens in a variety of scenarios: the noise mandated by DP can be too large, forcing the data custodian to consider a weaker alternative; the risk model might be inappropriate for certain use cases; or the context might require the data owner to make stronger statements about what information the privacy mechanism can reveal.
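To make the first scenario concrete, the following is a minimal sketch (not taken from this work) of the classic Laplace mechanism for a query of known sensitivity; it illustrates how the noise scale grows as the privacy parameter ε shrinks, which is what can force a custodian toward a weaker definition. The function names and the sampling method are illustrative choices, not part of the taxonomy described here.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale): the difference of two
    exponential draws with mean `scale` is Laplace-distributed."""
    e1 = -math.log(1.0 - random.random())  # Exp(1) draw
    e2 = -math.log(1.0 - random.random())  # Exp(1) draw
    return scale * (e1 - e2)

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with noise of scale sensitivity/epsilon,
    which satisfies epsilon-DP for a query of the given sensitivity."""
    return true_value + laplace_noise(sensitivity / epsilon)

# The noise scale is sensitivity/epsilon: for a counting query
# (sensitivity 1), epsilon = 0.01 means noise of scale 100,
# which can dwarf the true answer for small counts.
for eps in (1.0, 0.1, 0.01):
    print(eps, laplace_mechanism(42.0, 1.0, eps))
```

For strong privacy parameters the noise can thus exceed the signal by orders of magnitude, which motivates several of the weakened variants surveyed below.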
Figure 2.7 shows the prevalence of this phenomenon: approximately 225 different notions², inspired by DP, were defined in the last 15 years. As Figure 2.7 shows, this phenomenon does not seem to be slowing down over time. These definitions can be extensions or variants of DP: an extension encompasses the original DP notion as a special case, while a variant changes some aspect of it, typically to weaken or strengthen the original definition.
With so many definitions, it is difficult for new practitioners to get an overview of this research area. Many definitions have similar goals, so it is also challenging to understand which are appropriate to use in which context. These difficulties also affect experts: a number of notions have been introduced independently multiple times, often with identical meanings but different names, or identical names but different meanings. Finally, variants are often introduced without a comparison to related notions.
In this section, we attempt to solve these problems by systematizing the scientific literature on variants and extensions of differential privacy. We propose a unified and comprehensive taxonomy of these variants and extensions, providing short explanations of the intuition, use cases and basic properties of each. By categorizing these definitions, we attempt to simplify the understanding of existing variants and extensions, and of the relations between them. We hope to make it easier for new practitioners to understand whether their use case needs an alternative definition, and if so, which existing notions are the most appropriate, and what their basic properties are.
We define seven dimensions: these are ways in which the original definition of DP can be modified or extended. We list the variants and extensions that belong to each dimension, and highlight representative definitions for each. Whenever possible, we compare these definitions and establish a partial ordering of the strengths of different notions. Furthermore, for each definition, we specify whether it satisfies Kifer et al.'s privacy axioms [225, 226] (post-processing and convexity), and whether it is composable.
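For reference, the two axioms can be stated informally as follows (the notation here is ours, chosen for illustration):

```latex
Let $\mathsf{Def}$ be a privacy definition, and let $\mathcal{M}_1$,
$\mathcal{M}_2$ be mechanisms satisfying $\mathsf{Def}$.
\begin{itemize}
  \item \emph{Post-processing}: for any (possibly randomized) function $f$
  that does not depend on the private data, $f \circ \mathcal{M}_1$ also
  satisfies $\mathsf{Def}$.
  \item \emph{Convexity}: for any $p \in [0,1]$, the mechanism that runs
  $\mathcal{M}_1$ with probability $p$ and $\mathcal{M}_2$ with probability
  $1-p$ also satisfies $\mathsf{Def}$.
\end{itemize}
```

Composability, by contrast, concerns the combined guarantee obtained when several mechanisms satisfying the definition are run on the same data; not all of the notions surveyed below compose gracefully.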
This section is organized as follows.
- In Section 2.2.1, we introduce the dimensions along which DP can be modified, present basic properties of privacy definitions, define how definitions can relate to each other, and explain our methodology for this literature review.
- In the following seven sections (Sections 2.2.2 to 2.2.8), we introduce each dimension in turn, and list and compare the corresponding definitions.
- In Section 2.2.9, we summarize the results from the previous sections in a table, showing the corresponding properties with proofs, and listing the known relations.
- In Section 2.2.10, we mention related concepts and definitions which were considered out of scope for this work, and review the related literature.
² We count all the definitions which are presented as “new” in the papers introducing them.