Free gap information from the differentially private sparse vector and noisy max mechanisms

Zeyu Ding, Yuxin Wang, Danfeng Zhang, Daniel Kifer

Research output: Contribution to journal › Conference article › peer-review


Abstract

Noisy Max and Sparse Vector are selection algorithms for differential privacy and serve as building blocks for more complex algorithms. In this paper we show that both algorithms can release additional information for free (i.e., at no additional privacy cost). Noisy Max is used to return the approximate maximizer among a set of queries. We show that it can also release, for free, the noisy gap between the approximate maximizer and the runner-up. This free information can improve the accuracy of certain subsequent counting queries by up to 50%. Sparse Vector is used to return a set of queries that are approximately larger than a fixed threshold. We show that it can adaptively control its privacy budget (using less budget for queries that are likely to be much larger than the threshold) in order to increase the number of queries it can process. These results follow from a careful privacy analysis.
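The free-gap idea for Noisy Max can be illustrated with a short sketch. The function below is an assumption-laden illustration (not the authors' reference implementation): it adds Laplace noise to each query answer, reports the index of the largest noisy answer, and additionally returns the gap between the top two noisy answers, which the paper shows can be released at no extra privacy cost.

```python
import numpy as np

def noisy_max_with_gap(query_answers, epsilon, rng=None):
    """Illustrative sketch of Report Noisy Max with free gap information.

    Assumes sensitivity-1 counting queries and uses Laplace(2/epsilon)
    noise per query; the exact noise calibration in the paper may differ.
    Returns (index of approximate maximizer, noisy gap to the runner-up).
    """
    rng = np.random.default_rng() if rng is None else rng
    answers = np.asarray(query_answers, dtype=float)
    # Perturb every query answer independently with Laplace noise.
    noisy = answers + rng.laplace(scale=2.0 / epsilon, size=answers.size)
    order = np.argsort(noisy)
    best, runner_up = order[-1], order[-2]
    # The gap between the top two noisy answers is the "free" extra output.
    gap = noisy[best] - noisy[runner_up]
    return int(best), float(gap)
```

A caller can use the returned gap as a noisy lower bound on how far the winning query exceeds its competitors, which is what enables the improved accuracy for follow-up counting queries mentioned in the abstract.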

Original language: English (US)
Pages (from-to): 293-306
Number of pages: 14
Journal: Proceedings of the VLDB Endowment
Volume: 13
Issue number: 3
DOIs
State: Published - 2020
Event: 46th International Conference on Very Large Data Bases, VLDB 2020 - Virtual, Japan
Duration: Aug 31, 2020 to Sep 4, 2020

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Computer Science (all)

