Differential privacy is an elegant formulation of the conditions an algorithm must meet to avoid leaking personal information contained in its inputs. It has become mainstream in many research communities and has been deployed in practice by the private sector and some government agencies. Accompanying this growth is an unfortunate but expected difficulty: many algorithms claimed to be differentially private are in fact incorrect. This phenomenon affects both newcomers and seasoned veterans of the field because developing new differentially private algorithms is difficult and subtle.
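To make the idea concrete: a minimal sketch of the classic Laplace mechanism, the simplest building block of differentially private algorithms. This example is illustrative and not part of the proposal; the function name and parameters are chosen here for exposition. It releases a numeric query result with noise calibrated to the query's sensitivity and the privacy budget epsilon.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    Satisfies epsilon-differential privacy for a query whose output
    changes by at most `sensitivity` when one individual's record is
    added to or removed from the dataset.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse transform sampling.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Example: privately release a count query (counts have sensitivity 1).
rng = random.Random(0)  # seeded only to make the example reproducible
private_count = laplace_mechanism(true_value=42, sensitivity=1.0,
                                  epsilon=0.5, rng=rng)
```

Even this small mechanism illustrates why correctness is subtle: miscalculating the sensitivity, reusing the privacy budget, or sampling the noise from the wrong distribution each silently breaks the privacy guarantee without affecting the code's apparent behavior.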
This proposal outlines a research plan for the development of a system called DevDP, whose purpose is to enable both novice and expert users to prototype and explore differentially private algorithms. In particular, the project will develop program analysis tools and theory that leverage both programming language and machine learning technology to aid the development of correct differentially private programs by automating much of the verification and the reasoning about errors. As part of its broader impacts, DevDP also has the potential to help educate students and less technical members of the scientific community by providing interactive software tools. A solid understanding of differential privacy will become crucial as it makes its way into public policy.
Effective start/end date: 7/1/17 → 6/30/22
- National Science Foundation: $1,200,000.00