Abstract
Noisy Max and Sparse Vector are selection algorithms for differential privacy and serve as building blocks for more complex algorithms. In this paper we show that both algorithms can release additional information for free (i.e., at no additional privacy cost). Noisy Max is used to return the approximate maximizer among a set of queries. We show that it can also release for free the noisy gap between the approximate maximizer and the runner-up. This free information can improve the accuracy of certain subsequent counting queries by up to 50%. Sparse Vector is used to return a set of queries that are approximately larger than a fixed threshold. We show that it can adaptively control its privacy budget (using less budget for queries that are likely to be much larger than the threshold) in order to increase the number of queries it can process. These results follow from a careful privacy analysis.
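As a rough illustration of the Noisy Max result described above, the following is a minimal sketch (not the paper's exact mechanism): it assumes sensitivity-1 counting queries, uses Laplace noise of scale 2/ε, and returns both the index of the largest noisy answer and the gap between the top two noisy answers, which the paper shows can be released at no extra privacy cost. The function name and parameterization are illustrative.

```python
import numpy as np

def noisy_max_with_gap(query_answers, epsilon, rng=None):
    """Illustrative Noisy Max that also reports the 'free' noisy gap.

    Assumes each query has sensitivity 1 (e.g., counting queries),
    so Laplace noise of scale 2/epsilon is added to every answer.
    Returns the index of the largest noisy answer together with the
    gap between the top two noisy answers.
    """
    rng = rng or np.random.default_rng()
    answers = np.asarray(query_answers, dtype=float)
    noisy = answers + rng.laplace(scale=2.0 / epsilon, size=len(answers))
    # Indices sorted by noisy value; last two are winner and runner-up.
    order = np.argsort(noisy)
    winner, runner_up = order[-1], order[-2]
    gap = noisy[winner] - noisy[runner_up]  # nonnegative by construction
    return int(winner), float(gap)
```

A large reported gap suggests the winner is well separated from the runner-up, which is the kind of side information the paper exploits to sharpen subsequent counting queries.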
Original language | English (US) |
---|---|
Pages (from-to) | 293-306 |
Number of pages | 14 |
Journal | Proceedings of the VLDB Endowment |
Volume | 13 |
Issue number | 3 |
DOIs | |
State | Published - 2020 |
Event | 46th International Conference on Very Large Data Bases, VLDB 2020 - Virtual, Japan. Duration: Aug 31, 2020 → Sep 4, 2020 |
All Science Journal Classification (ASJC) codes
- Computer Science (miscellaneous)
- Computer Science (all)