
Interview with I. FALQUE-PIERROTIN, President CNIL

Published in COMMUNICATIONS & STRATEGIES No. 88, 4th Quarter 2012

Summary of the issue: Information and Communication Technologies (ICT), and the Internet in particular, offer companies the ability to collect large amounts of data about their users, and to use this information as a key input for value creation. New business models based on gathering and aggregating personal data and leveraging big data technologies lead to innovative market offerings. To become successful, they depend on disclosure (openness) and trust on the users' side. Though the disclosure of personal information might benefit consumers via, for example, better tailored services, openness also creates risks of abuse of personal data, ranging from increasing market power (e.g. due to price discrimination) to privacy breaches by the data holder, or even cybercrime initiated by rogue third parties.

Exclusive:

Interview with Isabelle FALQUE-PIERROTIN
President, CNIL, France
Conducted by Vincent BONNEAU (IDATE, Montpellier)


C&S: What are the current key topics for CNIL when addressing privacy regulation (mobile, social, cloud, etc.)? Is big data specifically on the radar, or is it analysed like any other development?

Isabelle FALQUE-PIERROTIN:
CNIL, like all data protection authorities, is addressing a wide array of key topics at the moment. As you have probably seen, we are currently having substantial discussions with some major internet platforms, such as Google, about their new privacy policies and the questions they raise regarding European data protection legislation. Biometrics is a major area of concern, too. Also, cloud computing services have developed considerably over the last few years, and their use by companies raises new questions in terms of legal compliance and risk management. Reacting to this trend, CNIL launched a public consultation on cloud computing at the end of 2011, which resulted in the publication of a number of practical recommendations for companies using these new services [1] in order to ensure compliance with the applicable legal framework.
As for big data, it is a convenient way of describing the emerging paradigm shift in data science and the data economy. CNIL therefore approaches it not so much as a new topic of regulation per se, but rather as a concept needed to grasp the new economic and sociological landscape that is currently taking shape. This is why it is one of the major trends we are following within our innovation and foresight program, for example.

Can we consider that users are fully aware of the usage of their personal data by third parties? Do they measure the real risks but still arbitrate in favour of usage and innovation? Do they react differently depending on the type of service or personal data?

It would obviously be wrong to assert that users are fully aware of the whole economy behind personal data combination and analysis. Surely, there is some kind of privacy paradox, because people express concerns but still use and embrace services and systems that are really "black boxes" for them in terms of personal data processing. Let's take tracking systems, such as cookies, for example: one ordinary day of browsing means encountering hundreds of cookies, linked in one way or another with dozens of different third parties, intermediaries, operators... the existence of which you are not even aware of. Those systems do not always threaten privacy, but they are certainly not well known. It cannot be argued that this attitude is merely the sign of an absence of concern for privacy: in fact, the issue is probably more complicated, and thinking about security and protection is neither "fun" nor rewarding. Finally, individuals do care more about certain sensitive personal information, and the categories considered sensitive are changing. Geolocation is clearly considered very sensitive data by all, even if we all like location-based services. For example, in a survey we conducted in December 2011 about "smartphones and privacy", more than 90% of French smartphone users wanted strong control over their location data. People want to know what happens to their traces and data, and they want more control over them.

IT ecosystems are increasingly developing around platforms that aggregate users and developers. What additional challenges do platforms raise in terms of privacy?

First of all, the business model there is generally based on free access, sometimes with freemium offers. But, as is often said about Facebook, for example, if you do not pay, you are not a customer. And if you are not a customer, you are most likely to be a product! So a lot of those platforms run on data, mostly personal data, the way a car runs on oil. Therefore, data minimization and privacy by default are clearly not the natural tendency of those platforms.
Then, as is emphasized in our cloud recommendations, another issue arises around the balance of responsibility between those platforms, developers, users, etc. The "one-stop shop" for users should not hide the fact that plenty of actors participate in the service. The standardization of offers and the use of take-it-or-leave-it contracts by cloud providers to formalize contractual relationships with their customers leave no room for negotiating the terms of use of cloud services. In addition, it appears that providers generally give very little information to their clients about the technical and organizational measures implemented to guarantee the security and confidentiality of the data processed on their behalf. This lack of transparency and control means that clients do not have all the information necessary to comply with their duties as data controllers.

Should personal data collected by tracking, acquired from third parties, or collected directly from the user (e.g. via a form) be treated and considered the same way regarding privacy?

We are clearly witnessing a revolution in the way personal data is generated and collected today. While all the major data protection and privacy laws were designed with a basic economic pattern in mind (i.e. the collection of data through paper forms, directly from the individual concerned, and the possible resale of such information to third parties), the situation has evolved thoroughly. A lot of personal material is now put online by individuals themselves, notably on social networks; mobile devices are used for an increasing variety of purposes, which also generate personal data that can be deemed sensitive (geolocation, payment, etc.); our browsing activity is tracked and analyzed... These situations are totally different from those that the law originally aimed to regulate, so the applicable legal analysis can vary. But whatever the services at stake, a number of fundamental requirements will always apply: the person to whom these different data relate must be able to know how they are processed and how to control their dissemination. His or her rights of objection, deletion and rectification must be strictly complied with, by whichever organization holds the data. Otherwise, the operations carried out on such data may not meet the essential condition of legitimacy that any data processing must satisfy. In short, they may be plainly illegal, because of the imbalance between the company's business interests and the rights of the persons concerned.

If personal data is anonymized by service providers and then used for pattern discovery, for instance, does it still raise privacy concerns?

Your question raises the issue of whether applying some form of anonymization process to personal data can be considered sufficient to avoid any negative consequence for the individual concerned, and, behind this, whether data protection rules still apply. I believe that the answer to this question is two-fold.
To start with, the rule of principle is that anonymized data may indeed be processed in ways which would have been strictly prohibited in an identifiable format. This possibility opens very interesting avenues, e.g. in the field of clinical trials, where very sensitive information is commonly processed.
But as simple as it seems, the rule must be applied with extreme caution. For one must be sure, in those cases, that the data processed definitely cannot be re-linked to specific individuals, which sometimes takes more than making it impossible to trace back to the person's precise identity, i.e. their first and last names. More generally, data may not be considered non-personal when it can be used to determine or influence the way in which the individual behind the data is treated or evaluated, which is different from holding her basic identity features.
These issues were well summed up by the Article 29 Working Party in 2007, in its very thorough working paper on the definition of personal data. More recently, in a very interesting academic paper, Professor Paul OHM even referred to "the surprising failure of anonymization" (2010), and much literature has been published lately, in Europe and in the United States, along the same lines. Indeed, a number of recent case studies have shown that re-identification of specific persons inside an anonymized data set is possible under certain circumstances. The development of new technologies makes it constantly easier to combine data from different sources, and hence to identify individuals that were supposedly unidentifiable in the first place.
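To make the re-identification risk concrete, here is a minimal, purely illustrative sketch (not drawn from the interview or from CNIL material; the data and names are invented) of the kind of "linkage attack" these case studies describe: a dataset stripped of names is joined with a public register on quasi-identifiers such as postcode, birth date and sex, which can be enough to single individuals out.

```python
# Illustrative linkage-attack sketch with invented data (assumption: pandas is available).
import pandas as pd

# "Anonymized" records: names removed, but quasi-identifiers kept.
anonymized = pd.DataFrame({
    "zip_code":   ["75011", "75011", "34000"],
    "birth_date": ["1971-03-02", "1985-07-19", "1971-03-02"],
    "sex":        ["F", "M", "F"],
    "diagnosis":  ["diabetes", "asthma", "hypertension"],
})

# Public register (e.g. an electoral roll) containing names and the same quasi-identifiers.
public_register = pd.DataFrame({
    "name":       ["Alice Martin", "Claire Durand"],
    "zip_code":   ["75011", "34000"],
    "birth_date": ["1971-03-02", "1971-03-02"],
    "sex":        ["F", "F"],
})

# Joining on the quasi-identifiers re-attaches names to supposedly anonymous records.
reidentified = anonymized.merge(
    public_register, on=["zip_code", "birth_date", "sex"], how="inner"
)
print(reidentified[["name", "diagnosis"]])
```

In this toy example, two of the three "anonymous" records are re-identified even though no name was ever stored with them, which is why removing direct identifiers alone is not a sufficient test of anonymization.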
With these factors in mind, professionals should carry out comprehensive risk-based analyses, in order to identify and resolve any privacy issues that may arise in their projects. For its part, CNIL is working on incentives to develop systematic privacy impact assessments at all stages of data projects, e.g. by developing specific guidelines. Other data protection authorities are working on this question of anonymization, too: for example, the UK Information Commissioner's Office conducted a public consultation on its draft Anonymization Code of Practice during the summer of 2012. In substance, the efficiency of anonymization processes depends on the type and number of data processed and the purpose behind the processing, as much as on the technical features of the anonymization procedure.
The issues at stake are fundamental, particularly in the light of open data. The development of open data projects raises huge expectations in terms of innovation and growth, but these must be combined with equally strong privacy expectations. Efficient anonymization is key to reconciling both sets of expectations.

Can we expect new models that empower end users with regard to their personal information, for instance around vendor relationship management (VRM) initiatives? Would VRM require specific regulation?

To me, this is still an educated guess. I think that a trend towards consumer empowerment could emerge in the near future. Whether vendor relationship management can be the ground for effective new user-centric business models around data is not yet established, but it is a smart and legitimate way to think about rebalancing the relationship between individuals and vendors in the coming data deluge. What seems obvious to me is that these new business models, if they are to develop, will need to be matched by innovative and modern regulation. For example, when looking at the first prototypes of VRM-based services, e.g. personal data stores, it is easy to anticipate that those operators will have to focus on consumer trust and really be on the side of the consumers. If these projects turn into variants of Trojan horses, encouraging more data dissemination or more uncontrolled combination of data across services, they will either fail or shape a very worrying landscape for privacy and individual freedoms.
Moreover, we shouldn't forget that the scope of this is fairly narrow: empowering individuals is a good thing, but it would be a mistake to end up giving them the whole burden of regulation. People do not spend all day managing their relations with vendors; dealing with them is not their purpose in life. They are not going to spend a lot of time negotiating and configuring these relationships, so a 100% market-oriented point of view may actually not be in their best interest. We should never forget that privacy isn't solely a consumption and market issue: it is above all an issue of citizens' freedoms and rights.

How are privacy-related initiatives taken by data protection agencies at the national level? At the European level? Is that enough when dealing with non-European companies (typically the US-based internet giants)?

The year 2012 provides us with two cases of stronger cooperation between European data protection authorities (DPAs), and between them and non-EU DPAs. First, the Irish Data Protection Commissioner led a comprehensive audit of Facebook's compliance with European data protection law, which was instrumental in Facebook's decision to change the way its face recognition tools are used on its European platform (the photo Tag Suggest feature, for example).
Then, the Article 29 Working Party mandated CNIL to lead the European investigation into the new privacy policy that Google implemented in March 2012. It was really the first time that a national authority led a process involving all 27 agencies. The letter sent to Google on October 16th was therefore signed by all the chairpersons of the agencies, not only by the chairs of CNIL and the Article 29 Working Party.
Two questionnaires were successively sent to Google to address the numerous implications of these changes. The analysis of Google's answers and the examination of numerous documents and technical mechanisms by CNIL's IT and legal experts have led all the EU DPAs to draw their conclusions and make substantial recommendations for change to Google. We recommend that clearer information be provided to users. Also, we require Google to offer improved user control over the combination of data across its numerous services, and to modify the tools it uses so as to avoid an excessive collection of data. The release of this October report is clearly not the end of the process. We shall follow up on these recommendations over the coming months, together with our EU colleagues as well as with other non-EU DPAs which endorsed our common findings. As you can see, cooperation among DPAs is an efficient lever for regulation and compliance, even when dealing with non-European firms. This is why this point is, to me, a core issue in the debate around the future EU privacy regulation.

Short Biography

Isabelle FALQUE-PIERROTIN graduated in France from the HEC School of Business Management ("Ecole des Hautes Etudes Commerciales"), the National Administration School ("Ecole Nationale d'Administration") and the Multimedia Institute ("Institut Multimédia"). She first held various posts with the French State Council ("Conseil d'Etat"), as an "auditeur" from 1986 to 1989 and a "maître des requêtes" (counsel) from 1989 to 2001, and was responsible for relations with the print and broadcast media from 1988 to 1991. Ms. Falque-Pierrotin also served as Deputy Chair of the French Ministry of Culture and French-Speaking World Matters from 1993 to 1995. She became State Counselor ("Conseiller d'Etat") in November 2001. After serving as Chair of the Interministerial Commission on Internet Affairs in 1996, she was appointed as an expert adviser for the Organization for Economic Cooperation and Development (OECD) in 1997 and as "rapporteur général" of the report of the French State Council on "Internet and Digital Networks" from 1997 to 1998. From 2001 to December 2010, Ms. Falque-Pierrotin was Chair of the Advisory Board and General Delegate ("déléguée générale") of the French Internet Rights Forum ("Forum des droits sur l'internet"). Isabelle Falque-Pierrotin has been a member of the French Data Protection Authority ("Commission nationale de l'informatique et des libertés") since January 2004. Appointed as Deputy Chair of this authority from February 2009 to September 2011, she became its Chair on September 21, 2011.

> For more information about our activities: www.comstrat.org

Contact
COMMUNICATIONS & STRATEGIES
Sophie NIGON
Managing Editor
s.nigon@idate.org

[1] See "Cloud computing: CNIL's recommendations for companies using these new services", 25 June 2012.