‘Contextual Integrity: Privacy as Data Governance’ | Talk by Prof. Helen Nissenbaum, Cornell Tech
New Academic Block, Room 203
Thursday, January 9, 2025, 5:00 pm
Open to the public

The National Law School of India University, Bengaluru, along with the Infosys Science Foundation, is organising a talk on January 9, 2025 by Prof. Helen Nissenbaum from Cornell Tech, New York, on the topic “Contextual Integrity: Privacy as Data Governance.” The talk will be delivered on the NLSIU campus at 5 pm.
About the Speaker
Helen Nissenbaum is the Andrew H. and Ann R. Tisch Professor of Information Science and the founding director of the Digital Life Initiative at Cornell Tech. Her research spans issues of bias, trust, security, autonomy, and accountability in digital systems, most notably, privacy as contextual integrity. Professor Nissenbaum’s publications include the books Obfuscation: A User’s Guide for Privacy and Protest, with Finn Brunton (MIT Press, 2015), Values at Play in Digital Games, with Mary Flanagan (MIT Press, 2014), and Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford, 2010). These, along with numerous research articles, have been translated into seven languages, including Polish, Chinese, and Portuguese. She received the 2014 Barwise Prize from the American Philosophical Association and the IACAP Covey Award for computing, ethics, and philosophy. Professor Nissenbaum has also contributed to privacy-enhancing free software, such as TrackMeNot (designed to prevent the profiling of web search histories) and AdNauseam (designed to counter profiling based on ad clicks). She holds a Ph.D. in philosophy from Stanford University and a B.A. (Hons) in Philosophy and Mathematics from the University of the Witwatersrand, South Africa.
An overview of her publications is available here: https://nissenbaum.tech.cornell.edu/
Abstract of the talk
Contextual Integrity (CI) is a different way of defining privacy – not as secrecy and not as control over personal information but as appropriate flow. It answers an urgent societal need for a definition that is meaningful, explains why privacy is ethically compelling, and points to how we may protect it through law, regulation, and technology. My talk will review key theoretical ideas behind contextual integrity, provide evidence of its empirical robustness, and explain why successful regulation of privacy needs to be accompanied by effective data governance, aimed at protecting legitimate societal institutions (“contexts”) and their associated ends and values.
About the Infosys Science Foundation
The Infosys Science Foundation, a not-for-profit trust, was set up in 2009 by Infosys and members of its Board, with the objective of encouraging, recognizing, and fostering world-class scientific research connected to India. The Foundation furthers its objectives primarily through the Infosys Prize, an annual award that honors outstanding achievements of researchers and scientists in six categories – Economics, Engineering & Computer Science, Humanities & Social Sciences, Life Sciences, Mathematical Sciences, and Physical Sciences. The Foundation also partners with educational institutions around the world to host lectures featuring Infosys Prize laureates and jurors, aiming to spark curiosity and inspire the next generation of scholars. The Foundation creates conversations around science and society, engaging with various sections of the community through talks, initiatives, workshops, and training.
Excerpts from the talk
On what should be the definition of privacy
“What kind of understanding do we have to have of privacy to concede that a definition is good enough? Here are some of the basic benchmarks. 1) The conception has to be faithful to common use, and by that I mean that whatever definition of privacy we have, it has to more or less track the way people think about the meaning of privacy. And in my view, there are various definitions of privacy in the technical arena that are very rigorous, but they don’t respond to this concern.
They are rigorous because, with complex concepts like justice or fairness, a natural definition invites a lot of argument and inconsistency, so you have to shave off some of the meanings to achieve that rigour. And importantly, it has to explain privacy’s ethical significance.
So if I say, you have intruded on my privacy, that’s an ethical statement. It means you’ve done something wrong and you need to stop. A definition of privacy that doesn’t give that falls short. And I offer privacy as contextual integrity.”
On ‘privacy’
“I wanted to introduce the first premise of contextual integrity so that if everybody had to vacate the room for some reason and there’s only one thing you remember, this is it: The right to privacy is a right to appropriate flow of information. So that’s the most important difference, which is to say – flow. Concentrate on that word for a second, because what I want to emphasise right from the beginning is that some definitions of privacy would define privacy as a form of secrecy: the more information somebody has about you, the less privacy you have. And I didn’t think that actually tracked what people cared about. It wasn’t that they objected to any flow of information at all, because the flow of information is absolutely fundamental to just about everything we do in society.
So a theory of privacy that calls for secrecy is not a theory that can meet the requirement of an ethical definition. So we need flow, but what privacy demands is appropriate flow. Then you might ask, of course: what is appropriate flow? That is really what the theory tries to answer. So what are the basic building blocks of contextual integrity? 1) social contexts, 2) contextual informational norms, which is another big part of the theory, and 3) the concept of contextual ends, values, and purposes.”
On ‘contextual norms’
“The idea is that we live in societies. Our social lives are not in some undifferentiated social space, but rather we are in and out of different social spheres: health, education, family, politics, and so forth. These contexts are defined by goals and purposes and values. So if we’re sitting in this room, here’s another one of the aspects of context. It’s associated with certain functions or practices.
It’s governed by certain norms and rules, so I’m fulfilling people’s expectations. So far, you’re also fulfilling people’s expectations. There are many such things. If I stood up on this desk and started singing, you might be surprised, and that’s because there are certain norms.
So that’s really important and I want it to stick in our heads. Among the contextual norms are norms and rules that govern information flow. Now, that too is a fundamental part of the theory. The contextual informational norms have five parameters. Three of them have to do with the actors: the subject, the sender, and the recipient. Then there are the attributes, which is the information type, and the transmission principle, which is the condition under which the information flows from party to party.”
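The five-parameter structure described above can be sketched as a small data model. This is purely an illustrative assumption – the class, field names, and wildcard convention are not from the talk – but it shows how a flow can be checked against a contextual informational norm:

```python
from dataclasses import dataclass

# Illustrative sketch only: the five parameters of a contextual
# informational norm as listed in the talk. The class and the "*"
# wildcard convention are hypothetical modelling choices.
@dataclass(frozen=True)
class InformationFlow:
    subject: str                 # whom the information is about
    sender: str                  # who transmits it
    recipient: str               # who receives it
    attribute: str               # the information type
    transmission_principle: str  # condition governing the flow

def conforms(flow: InformationFlow, norm: InformationFlow) -> bool:
    """A flow conforms to a norm when all five parameters match;
    "*" in the norm acts as a wildcard."""
    pairs = zip(
        (flow.subject, flow.sender, flow.recipient,
         flow.attribute, flow.transmission_principle),
        (norm.subject, norm.sender, norm.recipient,
         norm.attribute, norm.transmission_principle),
    )
    return all(n == "*" or n == f for f, n in pairs)

# A hypothetical healthcare-context norm: any patient's medical
# condition may flow from patient to physician with consent.
norm = InformationFlow("*", "patient", "physician",
                       "medical condition", "consent")
flow = InformationFlow("Alice", "patient", "physician",
                       "medical condition", "consent")
print(conforms(flow, norm))  # True
```

The same check would reject a flow in which, say, the recipient became an advertiser or the transmission principle became sale – the parameters no longer match the context's norm.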
On ‘transmission principles’
“A transmission principle is the constraint on the flow as it passes from one actor to the other. So the most common one, which most of us are familiar with, is consent. If some information passes from one party to another with the consent of the data subject, that’s one transmission principle. But that’s not always the case.
Sometimes in a court of law, the judge requires the person giving evidence to provide information and in that case, that information is coerced. Or information can flow because someone buys it and sells it. So there are lots of different transmission principles, but the idea is that it governs the flow from one party to the other.
… Now, this was one of the most inspiring cases to me. It was in 2007, when I was already working on privacy. There was a huge outcry about Google Maps Street View. People complained about different things. So in the US, people would say, oh, you showed me sunbathing in a bikini. And in Japan, they were really upset because it showed the outside front of their homes. And men were saying, oh, I’m going to get in trouble for coming out of the strip club and so forth.
And the Google engineers defended themselves. They said, public is public. We’re not doing anything different. Whatever you did was in public, and we captured that. And what contextual integrity shows is that actually there is a radical difference, one you can pinpoint by showing that once you post it online, you’ve changed the recipient and you’ve changed the transmission principle. Because when people see you in public, for the most part, you see them. So there’s a reciprocity in public that does not exist when you move this onto the web (online media).”
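The Street View argument above can be restated in terms of the five norm parameters: the scene is the same, but publication online alters two parameters, which is why "public is public" fails on the contextual-integrity account. The sketch below is a hypothetical rendering – the parameter values are invented labels, not anything stated in the talk:

```python
# Hypothetical sketch: represent the norm of co-present public life and
# the Street View flow as dicts over the five parameters from the talk,
# then list which parameters differ. Values are illustrative labels.
street_norm = {
    "subject": "passer-by",
    "sender": "passer-by",
    "recipient": "co-present passers-by",          # people who can also see you
    "attribute": "appearance in public",
    "transmission_principle": "reciprocal observation",
}

street_view_flow = {
    "subject": "passer-by",
    "sender": "passer-by",
    "recipient": "anyone online",                  # changed recipient
    "attribute": "appearance in public",
    "transmission_principle": "one-way publication",  # changed principle
}

changed = [k for k in street_norm if street_norm[k] != street_view_flow[k]]
print(changed)  # ['recipient', 'transmission_principle']
```

Exactly the two parameters the speaker names – recipient and transmission principle – come back as changed, so the flow no longer conforms to the norm of being seen in public.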