Reconstruction attack

A reconstruction attack is any method for partially reconstructing a private dataset from public aggregate information. Typically, the dataset contains sensitive information about individuals, whose privacy needs to be protected. The attacker has no access, or only partial access, to the dataset, but has access to public aggregate statistics about it, which may be exact or distorted, for example by added noise. If the public statistics are not sufficiently distorted, the attacker can accurately reconstruct a large portion of the original private data. Reconstruction attacks are relevant to the analysis of private data because they show that, to preserve even a very weak notion of individual privacy, any published statistics must be sufficiently distorted. Dwork and Roth called this phenomenon the Fundamental Law of Information Recovery, formulating it as "overly accurate answers to too many questions will destroy privacy in a spectacular way."[1]

The Dinur-Nissim Attack

In 2003, Irit Dinur and Kobbi Nissim proposed a reconstruction attack based on noisy answers to multiple statistical queries.[2] Their work was recognized by the 2013 ACM PODS Alberto O. Mendelzon Test-of-Time Award in part for being the seed for the development of differential privacy.[3]

Dinur and Nissim model a private database as a sequence of bits D = (d₁, …, dₙ), where each bit dᵢ is the private information of a single individual. A database query is specified by a subset S ⊆ {1, …, n}, and is defined to equal the subset sum Σ_{i ∈ S} dᵢ. They show that, given approximate answers a₁, …, a_m to the queries specified by sets S₁, …, S_m, such that |a_j − Σ_{i ∈ S_j} dᵢ| ≤ E for all j, if the error bound E is sufficiently small and the number of queries m is sufficiently large, then an attacker can reconstruct most of the private bits in D. Here the error bound E can be a function of n and m. Dinur and Nissim's attack works in two regimes: in one regime, m is exponential in n, and E can be linear in n; in the other regime, m is polynomial in n, and E is on the order of √n.
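The exponential regime can be illustrated concretely: the attacker simply outputs any candidate database whose subset sums are consistent with every noisy answer to within E. A standard argument shows such a candidate can disagree with the true database on at most 4E bits. The following is a minimal sketch of this idea (the function and variable names are illustrative, not from the paper); for small n it enumerates all 2ⁿ candidates by brute force.

```python
import itertools
import random

def reconstruct(n, queries, answers, E):
    """Exponential-regime reconstruction: return any candidate database
    whose subset sums are within E of every noisy answer."""
    for cand in itertools.product([0, 1], repeat=n):
        if all(abs(sum(cand[i] for i in S) - a) <= E
               for S, a in zip(queries, answers)):
            return cand
    return None

random.seed(0)
n = 8
secret = [random.randint(0, 1) for _ in range(n)]

# Queries: every non-empty subset of {0, ..., n-1}; answers are the
# true subset sums perturbed by noise of magnitude at most E.
E = 1
queries = [S for r in range(1, n + 1)
           for S in itertools.combinations(range(n), r)]
answers = [sum(secret[i] for i in S) + random.choice([-E, 0, E])
           for S in queries]

guess = reconstruct(n, queries, answers, E)
# Any consistent candidate differs from the secret in at most 4E
# positions: if it differed in more, some disagreement set would have
# size > 2E, and the query on that set could not be within E of the
# same answer for both databases.
errors = sum(g != s for g, s in zip(guess, secret))
print(errors)
```

The secret itself is always consistent with the answers, so the search never fails; the point of the theorem is that *every* consistent candidate is close to the secret, so when E is small the attacker recovers almost all bits.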


References

  1. Cynthia Dwork and Aaron Roth. The Algorithmic Foundations of Differential Privacy. Foundations and Trends in Theoretical Computer Science, vol. 9, no. 3–4, pp. 211–407, Aug. 2014. DOI:10.1561/0400000042
  2. Irit Dinur and Kobbi Nissim. 2003. Revealing information while preserving privacy. In Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems (PODS '03). ACM, New York, NY, USA, 202–210. DOI:10.1145/773153.773173
  3. "ACM PODS Alberto O. Mendelzon Test-of-Time Award".