Differential privacy
Differential privacy is a mathematical framework for quantifying and protecting the privacy of individuals in datasets from statistical analyses and data releases. It provides guarantees that the inclusion or exclusion of a single individual's data has only a limited effect on the outputs of a data-analysis mechanism.
The core idea is formalized via a randomized mechanism M. In its standard form, M gives epsilon-differential privacy if, for every pair of datasets D and D' that differ in the data of a single individual, and for every set S of possible outputs, Pr[M(D) in S] <= exp(epsilon) * Pr[M(D') in S]. The parameter epsilon, often called the privacy budget, controls the strength of the guarantee: smaller values of epsilon mean the two output distributions are harder to distinguish, and hence privacy is stronger.
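To make the definition concrete, the following is a minimal sketch of randomized response, a classic mechanism that satisfies epsilon-differential privacy for a single binary attribute. The code is illustrative only; the function name, parameters, and the empirical ratio check are assumptions for this sketch, not part of any particular library.

```python
import numpy as np

def randomized_response(true_bit: int, epsilon: float, rng=None) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps), else flip it.

    Changing one person's bit changes the probability of any output by at
    most a factor of e^epsilon, which is exactly the epsilon-DP condition.
    """
    rng = rng or np.random.default_rng()
    p_truth = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return true_bit if rng.random() < p_truth else 1 - true_bit

# Empirical check: Pr[M(1) = 1] / Pr[M(0) = 1] should stay near e^epsilon.
eps = 1.0
rng = np.random.default_rng(0)
n = 100_000
p1 = np.mean([randomized_response(1, eps, rng) for _ in range(n)])
p0 = np.mean([randomized_response(0, eps, rng) for _ in range(n)])
print(f"observed ratio {p1 / p0:.3f} vs bound e^eps = {np.exp(eps):.3f}")
```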
Common DP mechanisms add calibrated random noise. The Laplace mechanism adds Laplace noise to numeric query answers, with the noise scale set to the query's sensitivity (the maximum change one individual's data can cause in the answer) divided by epsilon.
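As an illustration, here is a hedged sketch of the Laplace mechanism applied to a counting query, again in Python with NumPy. The dataset, function name, and parameter values are hypothetical; the sensitivity of 1 assumes a count query, where adding or removing one individual changes the answer by at most 1.

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float,
                      epsilon: float, rng=None) -> float:
    """Return the query answer plus Laplace(0, sensitivity / epsilon) noise.

    Calibrating the noise scale to sensitivity / epsilon yields
    epsilon-differential privacy for the released value.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Example: privately release the number of people over 40 in a toy dataset.
# A counting query has sensitivity 1.
ages = np.array([23, 35, 41, 29, 52, 61, 38])
true_count = int(np.sum(ages > 40))          # non-private answer: 3
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"true count: {true_count}, private release: {private_count:.2f}")
```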
Applications include private data analysis, statistics, and machine learning. DP has been adopted in national statistics programs, notably by the U.S. Census Bureau for the 2020 Census, and by large technology companies for privacy-preserving telemetry collection.
Differential privacy remains an active area of research, balancing rigorous privacy guarantees with practical utility across domains such as official statistics, data publishing, and machine learning.