TY - JOUR
T1 - Elliptical perturbations for differential privacy
AU - Reimherr, Matthew
AU - Awan, Jordan
N1 - Funding Information:
Research supported in part by NSF DMS 1712826, NSF SES 1853209, and the Simons Institute for the Theory of Computing at UC Berkeley. Research supported in part by NSF SES-153443 and NSF SES-1853209.
Publisher Copyright:
© 2019 Neural information processing systems foundation. All rights reserved.
PY - 2019
Y1 - 2019
N2 - We study elliptical distributions in locally convex vector spaces, and determine conditions when they can or cannot be used to satisfy differential privacy (DP). A requisite condition for a sanitized statistical summary to satisfy DP is that the corresponding privacy mechanism must induce equivalent probability measures for all possible input databases. We show that elliptical distributions with the same dispersion operator, C, are equivalent if the difference of their means lies in the Cameron-Martin space of C. In the case of releasing finite-dimensional summaries using elliptical perturbations, we show that the privacy parameter ε can be computed in terms of a one-dimensional maximization problem. We apply this result to consider multivariate Laplace, t, Gaussian, and K-norm noise. Surprisingly, we show that the multivariate Laplace noise does not achieve ε-DP in any dimension greater than one. Finally, we show that when the dimension of the space is infinite, no elliptical distribution can be used to give ε-DP; only (ε, δ)-DP is possible.
AB - We study elliptical distributions in locally convex vector spaces, and determine conditions when they can or cannot be used to satisfy differential privacy (DP). A requisite condition for a sanitized statistical summary to satisfy DP is that the corresponding privacy mechanism must induce equivalent probability measures for all possible input databases. We show that elliptical distributions with the same dispersion operator, C, are equivalent if the difference of their means lies in the Cameron-Martin space of C. In the case of releasing finite-dimensional summaries using elliptical perturbations, we show that the privacy parameter ε can be computed in terms of a one-dimensional maximization problem. We apply this result to consider multivariate Laplace, t, Gaussian, and K-norm noise. Surprisingly, we show that the multivariate Laplace noise does not achieve ε-DP in any dimension greater than one. Finally, we show that when the dimension of the space is infinite, no elliptical distribution can be used to give ε-DP; only (ε, δ)-DP is possible.
UR - http://www.scopus.com/inward/record.url?scp=85090171079&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090171079&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85090171079
SN - 1049-5258
VL - 32
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Y2 - 8 December 2019 through 14 December 2019
ER -