Approaching the Anniversary of Anonymous Dog
The most famous Internet cartoon of the 1990s is probably the one captioned "On the Internet, nobody knows you're a dog", originally published in The New Yorker in 1993. One of the best Internet cartoons I have seen in the last few years depicts a meeting between two strangers. They use mobile-phone 3D face recognition to identify each other, and then connect to some social network that contains rather explicit comments about them, such as "emotionally inaccessible" or "lying cheater". Meeting over!
Anonymity on the net was a fact (and rather funny) some 20 years ago. Now the total loss of anonymity, in the cyber world as well as in the physical one, seems to be a fact (and maybe not so funny). So, what has happened to Internet users' privacy in the past 20 years? And what will happen in the next 20?
One of the things that has happened is that service providers, among others, realized the business potential of private information. Profiling, personalization, customization, targeting, context-based services – all of that sounds and smells like the future of business. It is also heavily dependent on data such as location, preferences, or opinions – not to mention the more obvious PII (personally identifiable information).
PII is, however, a legal term, not a technical or economic one (PII is used mainly in the United States; in Europe, the similar term "personal data" is defined in EU Directive 95/46/EC). While there are some obvious PII categories and candidates, such as health or criminal records, many others are tagged as "potentially sensitive". The US Commercial Privacy Bill of Rights Act of 2011 talks about PII "which, if lost, compromised, or disclosed without authorization either alone or with other information, carries a significant risk of economic or physical harm". So here we have the link between economics and privacy – if there is an economic risk for me, shouldn't I put a price tag on my own PII?
At the beginning of the panel dedicated to the Economics of Privacy, held during the Future Internet Assembly in Budapest in May 2011, I asked people in the audience to raise their hands if they would reveal their religion in exchange for 10 Euros. Surprisingly (or not), the majority did raise their hands, implying that the value (or "risk of economic or personal harm") attached to this piece of information is not very high. I wondered how many of them would have done the same in another context, e.g. if the conference had been held in my home country, Bosnia, during the nineties. Maybe the idea of letting people put a price tag on their own private data is not very good after all. Maybe lawyers or governments should do it instead?
As part of a process to revise the EU legal framework, in 2012 the EC proposed several changes, including the intention to fine organizations that breach EU data protection and privacy laws up to 2% of their global annual turnover. As a matter of fact, penalties have already started to take on serious proportions.
For example, in the summer of 2012, Brighton and Sussex University Hospitals NHS Trust was fined £325,000 following a serious breach of the Data Protection Act (DPA). How much this represents per piece of personal data disclosed is still difficult to calculate. And in the United States, in November 2012 the government of California sent a letter to several companies, including Delta Airlines and United Airlines, stating that they could "face fines of up to $2,500 each time a user downloads its app". This is probably the closest example yet of putting a price tag on one person's PII.
Atos' Research & Innovation department (ARI) is actively involved in many research projects that deal with privacy, although mainly from the technological angle. One project that does contemplate economics is PACT (Public Perception of Security and Privacy: Assessing Knowledge, Collecting Evidence, Translating Research Into Action), a 36-month collaborative project that started in February 2012 and is co-funded by the European Commission.
While PACT's overarching goal is to carry out a root-and-branch review of public perceptions of privacy and security, it is also expected to deliver a Privacy Reference Framework and a Decision Support System that will try to quantify all relevant parameters, including the perceived risks and concerns of different stakeholders. The model is expected to be context-aware, meaning it should adapt cost or investment parameters to the specific time, location, application, or perception. I am especially curious to see how it will treat cultural and national aspects of privacy – the word does not even exist in some languages!
In 2013 we will celebrate a strange anniversary: 20 years of the famous "anonymous dog" Internet cartoon. In those 20 years, personal information has come into ever wider use, e.g. in the marketplace for personalized services, behavior-based pricing, and targeted marketing. The question of what a fair price for personal information would be remains open.
European legislators are currently moving forward with a complete data protection reform package, with the ultimate goal of passing the reform by summer 2013. Anniversary or not, 2013 promises to be the year that sheds more light on privacy controversies and opens the debate to more people. Whatever the future brings, there will certainly be a need to enable informed decisions by Internet users, data controllers/processors, and data protection authorities. A Decision Support System may not be a magic wand, but it does point in the right direction.