Contextual Integrity
Contextual integrity is a theory of privacy developed by Helen Nissenbaum and presented in her book Privacy in Context: Technology, Policy, and the Integrity of Social Life.[1]
Contextual Integrity comprises four essential descriptive claims:
- Privacy is provided by appropriate flows of information.
- Appropriate information flows are those that conform with contextual informational norms.
- Contextual informational norms refer to five independent parameters: data subject, sender, recipient, information type, and transmission principle.
- Conceptions of privacy are based on ethical concerns that evolve over time.
Contextual Integrity can be seen as a reaction to theories that define privacy as control over information about oneself, as secrecy, or as regulation of personal information that is private, or sensitive.
This places contextual integrity at odds with privacy regulation based on Fair Information Practice Principles. It also diverges from the 1990s cypherpunk view that newly discovered cryptographic techniques would assure privacy in the digital age: on the contextual integrity account, preserving privacy is not a matter of stopping all data collection, blocking or minimizing flows of information, or preventing information leakage.[2]
The fourth essential claim comprising Contextual Integrity gives privacy its ethical standing and allows for the evolution and alteration of informational norms, often due to novel sociotechnical systems. It holds that practices and norms can be evaluated in terms of:
- Effects on the interests and preferences of affected parties
- How well they sustain ethical and political (societal) principles and values
- How well they promote contextual functions, purposes, and values.
The most distinctive of these considerations is the third. As such, Contextual Integrity highlights the importance of privacy not only for individuals, but for society and respective social domains.
Contextual Integrity’s Parameters
The “contexts” of contextual integrity are social domains: intuitively, health, finance, the marketplace, family, civil and political life, and so on. The five parameters singled out to describe a data transfer operation are:
- The data subject
- The sender of the data
- The recipient of the data
- The information type
- The transmission principle.
Some illustrations of contextual informational norms in Western societies include:
- In a job interview, an interviewer is forbidden from asking about a candidate's religious affiliation
- A priest may not share congregants’ confessions with anyone
- A U.S. citizen is obliged to reveal gross income to the IRS, under conditions of confidentiality except as required by law
- One may not share a friend's confidences with others, except, perhaps, with one's spouse
- Parents should monitor their children's academic performance
Examples of data subjects include a patient, shopper, investor, or reader. Examples of information senders include a bank, the police, an advertising network, or a friend. Examples of data recipients include a bank, the police, or a friend. Examples of information types include the contents of an email message, the data subject's demographic information, biographical information, medical information, and financial information. Examples of transmission principles include consent, coercion, theft, buying, selling, confidentiality, stewardship, acting under the authority of a court with a warrant, and national security.
A key thesis is that assessing the privacy impact of information flows requires the values of all five parameters to be specified. Nissenbaum has found that access control rules that do not specify all five parameters are incomplete and can lead to problematic ambiguities.[3]
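The five-parameter description lends itself to a simple data model. The sketch below is purely illustrative (the names `Flow`, `Norm`, `is_complete`, and `permits` are not part of Nissenbaum's framework): a flow is a tuple of the five parameters, a norm constrains the same five parameters, and a norm that leaves a parameter unspecified is incomplete in the sense described above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Flow:
    """An information flow described by all five parameters."""
    subject: str
    sender: str
    recipient: str
    info_type: str
    transmission_principle: str

@dataclass(frozen=True)
class Norm:
    """A contextual informational norm; None marks a parameter
    the rule fails to specify (an incomplete rule)."""
    subject: Optional[str]
    sender: Optional[str]
    recipient: Optional[str]
    info_type: Optional[str]
    transmission_principle: Optional[str]

    def is_complete(self) -> bool:
        # A rule is complete only when all five parameters are given.
        return None not in (self.subject, self.sender, self.recipient,
                            self.info_type, self.transmission_principle)

    def permits(self, flow: Flow) -> bool:
        # A flow matches a norm when every specified parameter agrees.
        checks = [
            (self.subject, flow.subject),
            (self.sender, flow.sender),
            (self.recipient, flow.recipient),
            (self.info_type, flow.info_type),
            (self.transmission_principle, flow.transmission_principle),
        ]
        return all(expected is None or expected == actual
                   for expected, actual in checks)
```

For instance, a norm covering the confession example above would specify congregant as subject and sender, priest as recipient, confession as information type, and confidentiality as transmission principle; a flow in which the priest forwards the confession elsewhere fails the `permits` check.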
Nissenbaum notes that some kinds of language can lead one's analysis astray. In particular, describing the movement of data in the passive voice lets the speaker gloss over the fact that an active agent performed the transfer. The sentence “Alice had her identity stolen” obscures the fact that someone or something did the actual stealing of Alice's identity. Likewise, saying “Carol was able to find Bob’s bankruptcy records because they had been placed online” implicitly ignores the fact that some person or organization collected the bankruptcy records from a court and placed them online.
Example
Consider the norm: “US residents are required by law to file tax returns with the US Internal Revenue Service containing information, such as, name, address, SSN, gross earnings, etc. under conditions of strict confidentiality.”
Data Subject: A US resident
Sender: The same US resident
Recipient: The US Internal Revenue Service
Information type: tax information
Transmission principle: the recipient will hold the information in strict confidentiality.
Given this norm, we can evaluate a hypothetical scenario and see if it violates the contextual integrity norm:
Hypothetical: “The US Internal Revenue Service agrees to supply Alice’s tax returns to the city newspaper as requested by a journalist at the paper.”
This hypothetical clearly violates contextual integrity because providing the tax information to the local newspaper would violate the transmission principle under which the information was obtained.
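This evaluation can be done mechanically by comparing the flow's parameters against the norm's. In the toy sketch below (the 5-tuple encoding and the helper name `violates_norm` are illustrative, not part of the theory), the prescribed filing conforms, while the newspaper hypothetical departs from the norm in sender, recipient, and transmission principle:

```python
# Each flow/norm is a 5-tuple:
# (subject, sender, recipient, information type, transmission principle).
NORM = ("US resident", "US resident", "IRS",
        "tax information", "strict confidentiality")

def violates_norm(flow, norm=NORM):
    """A flow breaches contextual integrity if any of the five
    parameters departs from the governing norm."""
    return any(f != n for f, n in zip(flow, norm))

# The prescribed flow: Alice files her own return with the IRS.
filing = ("US resident", "US resident", "IRS",
          "tax information", "strict confidentiality")

# The hypothetical: the IRS passes Alice's return to a newspaper.
leak = ("US resident", "IRS", "city newspaper",
        "tax information", "journalist's request")

print(violates_norm(filing))  # False: conforms to the norm
print(violates_norm(leak))    # True: breaches the transmission principle
```

Note that an all-parameters comparison also catches the changed sender and recipient, even though the transmission principle alone already suffices to flag the violation.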
Applications
As a conceptual framework, contextual integrity has been used to analyze and understand the privacy implications of socio-technical systems on a wide array of platforms (e.g. Web, smartphone, IoT systems), and has led to many tools, frameworks, and system designs that help study and address these privacy issues.
Social media: Privacy in public
In her book Privacy in Context: Technology, Policy, and the Integrity of Social Life, Nissenbaum discussed the privacy issues surrounding public data, using examples such as the privacy concerns raised by Google Street View and the problems caused by converting previously paper-based public records into digital form and making them available online. In recent years, similar issues arising in the context of social media have revived the discussion.
Shi et al. examined how people manage their interpersonal information boundaries with the help of the contextual integrity framework. They found that information access norms were related to who was expected to view the information.[4] Researchers have also applied contextual integrity to more controversial social events, e.g. the Facebook–Cambridge Analytica data scandal.[5]
The concept of contextual integrity has also influenced the norms of ethics for research using social media data. Fiesler et al. studied Twitter users' awareness and perception of research that analyzed Twitter data, reported results in a paper, or even quoted actual tweets. Users' concerns turned out to depend largely on contextual factors, e.g. who is conducting the research and what the study is for, which is in line with contextual integrity theory.[6]
Mobile privacy: Using contextual integrity to judge the appropriateness of the information flow
The privacy concerns raised by the collection, dissemination, and use of personal data via smartphones have received a great deal of attention from different stakeholders. A large body of computer science research aims to efficiently and accurately analyze how sensitive personal data (e.g. geolocation, user accounts) flows within an app and when it flows off the phone.[7]
Contextual integrity has been widely invoked when trying to understand the privacy implications of observed data flows. For example, Wijesekera et al. argued that smartphone permission systems would be more effective if they prompted the user only "when an application’s access to sensitive data is likely to defy expectations", and they examined how applications access personal data and the gap between current practice and users' expectations.[8] Lin et al. demonstrated multiple problematic uses of personal data stemming from violations of users' expectations. Among them, the use of personal data for mobile advertising proved the most problematic: most users were unaware of this implicit data collection and found it unpleasantly surprising when researchers informed them of it.[9]
The idea of contextual integrity has also influenced system design. Both iOS and Android use a permission system to help developers manage their access to sensitive resources (e.g. geolocation, contact list, user data) and to give users control over which app can access what data. In their official guidelines for developers,[10][11] both platforms recommend that developers limit the use of permission-protected data to situations where it is necessary and provide a short description of why each permission is requested. Since Android 6.0, users are prompted at runtime, in the context of the app, which the documentation refers to as "increased situational context".
Other Applications of Contextual Integrity
In 2006, Barth, Datta, Mitchell, and Nissenbaum presented a formal language that can be used to reason about the privacy rules in privacy law. They analyzed the privacy provisions of the Gramm-Leach-Bliley Act and showed how to translate some of its principles into the formal language.[12]
References
- Helen Nissenbaum, Privacy in Context, 2010
- Benthall, Sebastian; Gürses, Seda; Nissenbaum, Helen (2017-12-22). Contextual Integrity Through the Lens of Computer Science. Now Publishers. ISBN 978-1-68083-384-3.
- Martin, K., and Helen Nissenbaum. "What is private about ‘public’ records data?" Targeted Submission: Fall Law Reviews.
- Shi, Pan, Heng Xu, and Yunan Chen. "Using contextual integrity to examine interpersonal information boundary on social network sites." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2013.
- https://www.ntia.doc.gov/files/ntia/publications/rfc_comment_shvartzshnaider.pdf
- Fiesler, Casey, and Nicholas Proferes. "“Participant” Perceptions of Twitter Research Ethics." Social Media + Society 4.1 (2018): 2056305118763366.
- Enck, William, et al. "TaintDroid: an information-flow tracking system for realtime privacy monitoring on smartphones." ACM Transactions on Computer Systems (TOCS) 32.2 (2014): 5.
- Lange, Patricia G. "Publicly private and privately public: Social networking on YouTube." Journal of computer-mediated communication 13.1 (2007): 361-380.
- Lin, Jialiu, et al. "Expectation and purpose: understanding users' mental models of mobile app privacy through crowdsourcing." Proceedings of the 2012 ACM conference on ubiquitous computing. ACM, 2012.
- https://developer.apple.com/design/human-interface-guidelines/ios/app-architecture/requesting-permission/
- https://developer.android.com/training/permissions/usage-notes
- Barth, Adam; Datta, Anupam; Mitchell, John; Nissenbaum, Helen (2006). Privacy and Contextual Integrity: Framework and Applications. Proceedings of the 2006 IEEE Symposium on Security and Privacy. pp. 184–198. CiteSeerX 10.1.1.76.1610. doi:10.1109/SP.2006.32. ISBN 978-0-7695-2574-7.
Further reading
- H. Nissenbaum, Privacy in Context: Technology, Policy and the Integrity of Social Life (Palo Alto: Stanford University Press, 2010), Spanish Translation Privacidad Amenazada: Tecnología, Política y la Integridad de la Vida Social (Mexico City: Océano, 2011)
- K. Martin and H. Nissenbaum (2017) “Measuring Privacy: An Empirical Examination of Common Privacy Measures in Context,” Columbia Science and Technology Law Review (forthcoming).
- H. Nissenbaum (2015) "Respecting Context to Protect Privacy: Why Meaning Matters," Science and Engineering Ethics, published online on July 12.
- A. Conley, A. Datta, H. Nissenbaum, D. Sharma (Summer 2012) “Sustaining both Privacy and Open Justice in the Transition from Local to Online Access to Court Records: A Multidisciplinary Inquiry,” Maryland Law Review, 71:3, 772-847.
- H. Nissenbaum (Fall 2011) "A Contextual Approach to Privacy Online," Daedalus 140:4, 32-48.
- A. Barth, A. Datta, J. Mitchell, and H. Nissenbaum (May 2006) “Privacy and Contextual Integrity: Framework and Applications,” In Proceedings of the IEEE Symposium on Security and Privacy, n.p. (Showcased in “The Logic of Privacy,” The Economist, January 4, 2007)