The notion of differential privacy has recently emerged as a gold standard in the field of database privacy. While this notion has the benefit of providing concrete theoretical privacy guarantees (compared to various previous ad-hoc approaches), its major drawback is that the mechanism needs to inject noise into the output, limiting its applicability in many settings. In this work, we initiate the study of a new notion of privacy called noiseless privacy.
The (very natural) idea we explore is to exploit the entropy already present in the database and use it in place of external noise added to the output. The privacy guarantee we provide is very similar to that of differential privacy, but where that guarantee "comes from" is very different in the two cases. While differential privacy aims for generality, we make assumptions about the database distribution, the auxiliary information available to the adversary, and the types of queries.
In this work, we first formalize the notion of noiseless privacy, introduce two definitions and show that they are equivalent. We then study certain types of Boolean and real queries and show natural (and well understood) conditions under which noiseless privacy can be obtained with good parameters. We also study the issue of composability and introduce models under which it can be achieved in the noiseless privacy framework.
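To make the intuition concrete, here is a minimal sketch (not the paper's exact analysis) of how a plain SUM query over i.i.d. Boolean data can be private without added noise. For neighboring databases differing in one bit, the worst-case log probability ratio over "typical" outputs shrinks as the database grows; the names `worst_case_eps` and `width` are illustrative choices, not from the paper.

```python
import math

def worst_case_eps(n, p, width=3.0):
    """Illustrative sketch: max |log probability ratio| for the SUM of n
    i.i.d. Bernoulli(p) bits when one record flips, restricted to
    'typical' outputs within `width` standard deviations of the mean.
    Atypical outputs (excluded here) occur with vanishing probability."""
    m = n - 1                          # records other than the one that changes
    mean = m * p
    sd = math.sqrt(m * p * (1 - p))
    lo = max(1, int(mean - width * sd))
    hi = min(m, int(mean + width * sd))
    eps = 0.0
    for s in range(lo, hi + 1):
        # Closed form for Binomial(m, p): pmf(s)/pmf(s-1) = ((m-s+1)/s) * p/(1-p)
        ratio = (m - s + 1) / s * p / (1 - p)
        eps = max(eps, abs(math.log(ratio)))
    return eps

# The effective epsilon decays as the database grows: the data's own
# entropy supplies the privacy, with no external noise injected.
for n in (100, 10_000, 1_000_000):
    print(n, round(worst_case_eps(n, 0.5), 4))
```

The ratio here plays the same role as the e^ε bound in differential privacy, except that it holds only with high probability over the data distribution rather than in the worst case, which is exactly the trade-off noiseless privacy makes.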
Paper:
Noiseless Database Privacy, Raghav Bhaskar, Abhishek Bhowmick, Vipul Goyal, Srivatsan Laxman, and Abhradeep Guha Thakurta, ASIACRYPT 2011.
Privacy in Data Mining (http://research.microsoft.com/en-us/projects/ppdm/default.aspx)