There is no question that the technology revolution and the advent of communication technologies like the Internet have made human life better. These technologies have improved economic productivity, efficiency, and growth not only in the US and the developed world, but also in many developing countries like India, Brazil, and China, where they have been a great social equalizer, reducing poverty and creating employment opportunities for the masses. On a more personal level, information digitization has created opportunities to grow, to relate to others, and to connect and collaborate beyond the geographical and political boundaries of the past. In fact, the digital revolution has helped people realize not only their own creative potential, but also the collective creative potential of the many communities and nations that voluntarily embraced it. Within this context, it must be remembered that the moment a person is connected to the Internet, he or she can create, post, and retrieve vast amounts of data, some of which is private in nature, and almost all of which is in the hands of third parties.
These benefits come at a price. To offer consumers ever better services and performance, the newer generation of applications requires a good amount of personal information about the people who use them. Nowadays, people hand their personal information to service providers without thinking twice. They voluntarily give out data about the services they use (Facebook, Twitter, etc.), location information on their whereabouts, data about the software they use (such as SaaS or PaaS offerings), and even grant applications access to their photo albums, emails, and contact lists.
Now, the Internet itself is stateless: each web request and response stands on its own, and the protocol does not maintain the state of the user or device making the request. To maintain that state, applications must store information on their servers (which is exactly the kind of thing certain three-letter agencies may be after, for various reasons). What is interesting here is that the people of the world have only now woken up to the fact that they have been giving away their personal data voluntarily for years. Why is the debate over the tradeoff between security and privacy beginning only now, when in reality that debate was over long ago and privacy lost?
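To make that statelessness concrete, here is a minimal sketch (in Python, with hypothetical names and an in-memory store; real applications would use a database or cache) of how an application keeps user state on its server: the server issues a random session ID, and every later request must carry that ID back so the server can look the user up again.

```python
import secrets

# In-memory session store: this is the state HTTP itself does not keep.
sessions = {}

def start_session(user_id):
    """Issue a random session ID and remember which user it belongs to."""
    session_id = secrets.token_hex(16)
    sessions[session_id] = {"user_id": user_id}
    return session_id

def lookup_session(session_id):
    """Each request carries the ID back so the server can restore state."""
    return sessions.get(session_id)

sid = start_session("alice")
print(lookup_session(sid)["user_id"])  # prints "alice"
```

Everything the server accumulates in that store is exactly the per-user data that ends up in the hands of third parties.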
Now let’s look at the paradox of this debate on “Security vs. Privacy.” There are three characteristics of Information Security: Confidentiality, Integrity, and Availability.
- Confidentiality means preserving authorized restrictions on access and disclosure, including means for protecting personal privacy and proprietary or sensitive information.
- Integrity means guarding against improper information modification or destruction, and it also includes ensuring information non-repudiation and authenticity.
- Availability means ensuring timely and reliable access to, and use of, information for its users.
Now, if we need these security qualities in the information services we use every day, we require a fundamental concept known as "attribution". Attribution refers to the ability to associate actions and requests with the individual or device that made them. The visibility attribution provides can also improve detection of attack attempts by the “bad guys” trying to steal sensitive information: locating the points of entry for successful attacks, identifying already-compromised machines, interrupting the infiltrating attackers’ activities, and gaining information about the sources of an attack. In other words, attribution can increase an organization’s, and more importantly a country's, situational awareness of its environment in the cyber world.
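As a toy illustration of attribution, the sketch below (Python, with made-up log records and field names) associates failed login attempts with the source that made them, which is the kind of visibility that helps locate points of entry and attack sources:

```python
from collections import Counter

# Hypothetical access-log records: (source_ip, user, action, success)
log = [
    ("10.0.0.5",     "alice", "login", True),
    ("198.51.100.7", "admin", "login", False),
    ("198.51.100.7", "root",  "login", False),
    ("198.51.100.7", "admin", "login", False),
]

def failed_logins_by_source(records):
    """Attribute failed login attempts to the source address that made them."""
    return Counter(ip for ip, _, action, ok in records
                   if action == "login" and not ok)

print(failed_logins_by_source(log).most_common(1))
# [('198.51.100.7', 3)]
```

Note that the log only supports attribution because each record is linked to a source identifier; that linkage is precisely what the privacy properties below work against.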
Now let’s examine the three characteristics of Privacy: Anonymity, Unlinkability, and Unobservability.
- Anonymity of a subject means that the subject is not identifiable within a set of subjects, which is called the anonymity set. A real life example would be the inability to identify one person out of a set of people, or distinguish one certain device from a corresponding set of devices.
- Unlinkability assures that two or more related events in an information processing system about a person or device cannot be related or linked to each other.
- Unobservability assures that an observer (which can be a system or person) is unable to identify or otherwise infer the identities of any and all parties involved in a digital transaction.
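The first of these properties can be made concrete with a toy example. The sketch below (Python, with hypothetical attribute names) measures a subject's anonymity set: the number of people indistinguishable from the subject given the attributes an observer can see. The larger the set, the stronger the anonymity.

```python
# Toy population described only by observable attributes.
population = [
    {"zip": "10001", "browser": "Firefox"},
    {"zip": "10001", "browser": "Firefox"},
    {"zip": "10001", "browser": "Chrome"},
    {"zip": "94105", "browser": "Firefox"},
]

def anonymity_set_size(subject, population):
    """Count how many members are indistinguishable from the subject."""
    return sum(1 for p in population if p == subject)

print(anonymity_set_size({"zip": "10001", "browser": "Firefox"}, population))
# 2
```

An anonymity set of size 1 means the subject is fully identifiable, which is exactly what attribution wants and anonymity works to prevent.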
Anonymity, Unlinkability, and Unobservability together lead to “Undiscoverability,” which is also called the attribution problem. By now, the nature of the emerging paradox should be clear: to have security, attribution is required; for attribution, at least in our current Internet model, some privacy must be sacrificed.
There have been discussions in various academic circles suggesting that these very attributes of privacy can be used to enhance security; an example is Pseudonymity: using a pseudonym as an identifier of a subject in place of the subject’s real name and identity.
Pseudonymity can mitigate the loss of anonymity and unlinkability from a security perspective by using a Trusted Broker identity and access architecture. However, the loss of unobservability is still very much a concern, both from an enterprise perspective (in terms of its employees) and a governmental perspective (in terms of its citizens). There is also ongoing discussion of using the "Panopticism" model to mitigate the unobservability problem, and therefore as a feasible solution for privacy.
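One way a trusted broker might implement pseudonymity is sketched below in Python, under the assumption of an HMAC-based derivation (the key, names, and scheme are illustrative, not a description of any actual broker product): the broker derives a stable pseudonym per service from the real identity and a secret key, so each service can attribute actions to the pseudonym, while only the broker can link the pseudonym back to the person or link pseudonyms across services.

```python
import hmac
import hashlib

# Hypothetical secret held only by the trusted broker.
BROKER_KEY = b"hypothetical-broker-secret"

def pseudonym(real_identity, service):
    """Derive a stable, per-service pseudonym from a real identity.

    Stable within a service (so attribution works there), but
    unlinkable across services without the broker's key.
    """
    msg = f"{real_identity}|{service}".encode()
    return hmac.new(BROKER_KEY, msg, hashlib.sha256).hexdigest()[:16]

# Same person, two services: two different, stable pseudonyms.
print(pseudonym("alice@example.com", "service-A"))
print(pseudonym("alice@example.com", "service-B"))
```

The design choice here is that security's attribution requirement is satisfied per service, while the linkage that destroys privacy is concentrated in, and guarded by, the broker.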
Jeremy Bentham, the British philosopher, jurist, and social reformer, developed the Panopticon in the late 18th century: a circular prison building with an observation tower in the centre of an open space, surrounded in turn by an outer wall containing cells for the prison occupants. The design would increase security by facilitating more effective surveillance. Residing within cells flooded with light, occupants would be readily visible to an official invisibly positioned in the central tower, while remaining invisible to each other behind the concrete walls dividing their cells. This is a form of mass “surveillance,” a word derived from the French “sur,” meaning “above,” and “veiller,” “to watch.”
Now, this solution raises the question of who is going to wield the "Anonymous Power" of being the Information Panopticon in a centralized, privacy-defined identity and access broker model. Panopticism is already in use in Information Technology: the CCTV systems in countless offices, shopping malls, and the subways of New York or London are examples, as is the use of drones. In the digital marketing context, web beacons that make private data increasingly visible have led to the proliferation of “dataveillance,” which may be described as a mode of mass surveillance that singles out particular transactions through routine algorithmic methods for individualized marketing purposes.
It is somewhat plausible that this conceptual model of Pseudonymity (from a Digital Identity perspective), when added to Panopticism, can be developed into an alternative solution to the Security vs. Privacy paradox in a mass surveillance context. If to this new model we add “Sousveillance,” coined by Steve Mann from the French “sous” (below) and “veiller” (to watch), we may achieve “equiveillance”: a state of equilibrium between surveillance and sousveillance.
David Brin in his book “The Transparent Society” argues that it will be good for society if the powers of surveillance are shared with the citizenry, allowing "sousveillance" or "viewing from below," enabling the public to watch the watchers. According to Brin, this only continues the same trend promoted by Adam Smith, John Locke, the US Constitutionalists and the western enlightenment movement, who held that any elite (whether commercial, governmental, or aristocratic) should experience constraints upon its power, and there is no power-equalizer greater than KNOWLEDGE.
Inverse surveillance, a subset of sousveillance, can be used to preserve the contextual integrity of mass surveillance data. For example, a lifelong capture of personal experience could provide "best evidence" over external surveillance data, preventing mass surveillance data (for example, the metadata connections drawn from it) from being taken out of context.
To have a society within which personal freedoms and justice are equally distributed, I strongly believe equiveillance is needed. Equiveillance represents a harmonious balance that maximizes human freedom and individual rights while serving the needs of communal democracy and communal security. It will address the privacy issues of ubiquitous computing, wearable computing, augmented reality, and the increasing artificial intelligence of machines by maintaining a balance between surveillance and sousveillance, maximizing personal privacy “wants” while optimizing individual, organizational, and national cyber security “needs”.
I hope we can raise the awareness of sousveillance on World Sousveillance Day (WSD) which occurs on December 24th - the busiest shopping day of the year. WSD aims to raise awareness of the imbalance today between surveillance and sousveillance, as particularly exemplified in shopping malls where surveillance is ubiquitous whereas sousveillance is prohibited.
I will be talking about the need to build resilience in Information Security at HP Discover 2014 in Barcelona on December 4, 2014. The topic is “Moving Beyond Vulnerability Testing”: https://h30550.www3.hp.com/connect/sessionDetail.ww?SESSION_ID=5265