Rényi DP

Rényi Divergence

Before we get into Rényi DP, we need to understand what Rényi divergence is. Rényi divergence is a way of measuring how close two probability distributions are to each other. For discrete distributions \(P = (p_1, \dots, p_n)\) and \(Q = (q_1, \dots, q_n)\), it is formally defined as follows:

\[D_\alpha(P||Q) = \frac{1}{\alpha - 1}\log\left(\sum_{i=1}^{n}\frac{p_{i}^{\alpha}}{q_{i}^{\alpha - 1}}\right)\]

for \(0 < \alpha < \infty\) and \(\alpha \neq 1\). The Rényi divergence for the special values \(\alpha = 0, 1, \infty\) is defined by taking the corresponding limits.
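To make the formula concrete, here is a minimal sketch of the discrete case in Python. This is purely illustrative: the function name `renyi_divergence` and the NumPy dependency are our own choices, not part of any particular library.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) between two discrete distributions.

    p, q: probability vectors over the same support (each sums to 1).
    alpha: order of the divergence, with alpha > 0 and alpha != 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha / q**(alpha - 1))) / (alpha - 1)

# Two slightly different coin-flip distributions.
print(renyi_divergence([0.5, 0.5], [0.6, 0.4], alpha=2.0))   # ≈ 0.04
```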

Formal definition of Rényi DP

Rényi differential privacy (RDP)1 is a generalization of ε-differential privacy. A randomized mechanism \(M\) is said to satisfy Rényi DP of order \(\alpha\), or \((\alpha,\epsilon)\)-RDP, if for all neighbouring databases \(x\) and \(x'\) it holds that

\[D_\alpha(M(x)||M(x')) \le \epsilon\]
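As a quick illustration of the definition (our own toy example, not one from the paper), consider randomized response on a single bit: the mechanism reports the true value with probability 0.75 and flips it otherwise. The sketch below computes, for a few orders \(\alpha\), the smallest \(\epsilon\) for which this mechanism is \((\alpha, \epsilon)\)-RDP; by symmetry, the two neighbouring databases below are the worst case.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    # Same discrete-case formula as in the previous sketch.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha / q**(alpha - 1))) / (alpha - 1)

# Randomized response on one bit: output the true bit with probability 0.75,
# the flipped bit otherwise. Neighbouring databases x and x' differing in
# that bit induce these two output distributions:
m_x      = [0.75, 0.25]   # distribution of M(x)
m_xprime = [0.25, 0.75]   # distribution of M(x')

for alpha in (1.5, 2, 10, 100):
    eps = renyi_divergence(m_x, m_xprime, alpha)
    print(f"alpha={alpha:>5}: smallest epsilon for ({alpha}, eps)-RDP ≈ {eps:.3f}")

# The values grow with alpha and approach log(0.75 / 0.25) ≈ 1.099,
# which is the epsilon of this mechanism under pure epsilon-DP.
```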

A mechanism satisfies ε-DP if and only if it satisfies RDP of order \(\infty\) with the same \(\epsilon\). RDP gives privacy guarantees whose strength lies between pure ε-DP and \((\epsilon, \delta)\)-DP.
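One way to see where RDP sits: the paper cited below shows that an RDP guarantee can always be converted into an approximate-DP guarantee. Roughly, for any \(0 < \delta < 1\),

\[(\alpha, \epsilon)\text{-RDP} \implies \Big(\epsilon + \frac{\log(1/\delta)}{\alpha - 1},\; \delta\Big)\text{-DP}\]

so a single RDP bound yields a whole family of \((\epsilon, \delta)\)-DP bounds, one for each choice of \(\delta\).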

  1. Ilya Mironov. Rényi Differential Privacy. IEEE Computer Security Foundations Symposium (CSF), 2017.