Research Scientist Mónica Ribero Díaz discusses the importance of differential privacy (DP) in protecting user data during data processing and analysis. DP guarantees that the output of a computation reveals almost nothing about any single individual's record, enabling data to be used across industry and government applications. However, researchers have repeatedly found errors in both the mathematical proofs and the implementations of private mechanisms, highlighting the need for thorough testing and auditing.
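To make the DP guarantee concrete, here is a minimal sketch of the classic Laplace mechanism, the standard way to release a bounded sum privately. This is purely illustrative and not part of DP-Auditorium's API; the function names and the assumption that each value lies in [0, 1] (so the sum has sensitivity 1) are choices made for this example.

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale): the difference of two i.i.d. exponentials."""
    rate = 1.0 / scale
    return random.expovariate(rate) - random.expovariate(rate)

def private_sum(data, epsilon: float) -> float:
    """epsilon-DP sum of values clipped to [0, 1] (so sensitivity is 1).

    Adding Laplace noise with scale sensitivity/epsilon makes the output
    distribution change by at most a factor of e**epsilon when any single
    record is added or removed.
    """
    clipped = [min(max(x, 0.0), 1.0) for x in data]
    return sum(clipped) + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means more noise and a stronger guarantee; the clipping step is what makes the stated sensitivity actually hold.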
To address this issue, an open-source library called DP-Auditorium has been introduced. The library automates the testing of DP mechanisms, combining property testers, which check whether a mechanism's output distributions violate a claimed privacy guarantee, with dataset finders, which search for datasets on which a violation is most likely to surface. Three novel testers have been implemented: HockeyStickPropertyTester, RényiPropertyTester, and MMDPropertyTester. None of them relies on an explicit histogram approximation of the tested distributions; instead, they estimate divergences by optimizing over function spaces, which yields more accurate test results.
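For intuition about what a hockey-stick property test measures, the sketch below estimates the hockey-stick divergence between two sampled output distributions: a mechanism is (epsilon, delta)-DP only if this divergence stays below delta on all adjacent inputs. Note this toy version uses exactly the histogram approximation that DP-Auditorium's testers are designed to avoid; it is a naive illustration of the quantity being tested, not of the library's method, and all names here are invented for the example.

```python
import math

def hockey_stick(samples_p, samples_q, epsilon: float, bins: int = 50) -> float:
    """Naive histogram estimate of sup_S [ P(S) - e**epsilon * Q(S) ].

    If the mechanism were (epsilon, delta)-DP, this value (on adjacent
    datasets) should not exceed delta.
    """
    lo = min(min(samples_p), min(samples_q))
    hi = max(max(samples_p), max(samples_q))
    width = (hi - lo) / bins or 1.0  # guard against zero-width range

    def hist(samples):
        counts = [0] * bins
        for x in samples:
            counts[min(int((x - lo) / width), bins - 1)] += 1
        return [c / len(samples) for c in counts]

    p, q = hist(samples_p), hist(samples_q)
    # The optimal event S collects every bin where p_i exceeds e**eps * q_i.
    return sum(max(pi - math.exp(epsilon) * qi, 0.0) for pi, qi in zip(p, q))
```

Histogram estimators like this scale poorly with dimension and sample size, which is one reason the library's testers instead optimize divergence lower bounds over function spaces.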
DP-Auditorium has been effective in identifying privacy violations in both private and non-private mechanisms, including faulty implementations of the Laplace and Gaussian mechanisms for computing a mean, and a DP gradient descent algorithm in TensorFlow. Its property testers have shown superior performance in detecting bugs, including bugs that reproduce common errors found in the literature.
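The source mentions faulty mean implementations as a target; the following hypothetical example shows the kind of bug such auditing can catch. The specific bug shown, calibrating noise to a claimed sensitivity without clipping the data, is a common error pattern chosen for illustration, not a bug taken from the library's test suite.

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale): the difference of two i.i.d. exponentials."""
    rate = 1.0 / scale
    return random.expovariate(rate) - random.expovariate(rate)

def buggy_private_mean(data, epsilon: float) -> float:
    # BUG: the noise scale assumes each record moves the mean by at most
    # 1/len(data), but the data are never clipped, so a single outlier can
    # shift the mean arbitrarily and the epsilon-DP guarantee is violated.
    scale = 1.0 / (len(data) * epsilon)
    return sum(data) / len(data) + laplace_noise(scale)

def fixed_private_mean(data, epsilon: float, lo=0.0, hi=1.0) -> float:
    # Clipping bounds each record's influence on the mean by (hi - lo) / n,
    # so the noise scale below genuinely yields epsilon-DP.
    clipped = [min(max(x, lo), hi) for x in data]
    scale = (hi - lo) / (len(clipped) * epsilon)
    return sum(clipped) / len(clipped) + laplace_noise(scale)
```

A divergence-based tester flags the buggy version because, on adjacent datasets differing in one large outlier, its output distributions separate far more than e**epsilon allows.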
Overall, DP-Auditorium provides a comprehensive tool for auditing DP mechanisms, ensuring the protection of user data in various applications.