Privacy in Search Engines: Negotiating Control: Dr. Federica Casarosa, University of Trento
Abstract
The Internet and modern communication technologies in general have radically modified contemporary society, bringing new risks for citizens' privacy. In the last few decades, an impressive number of innovations and technological advances have been produced, so that communication and dissemination of information can now rely on a global network infrastructure. These developments raise a demand for a more precise and uniform data protection legal framework, which could spell out citizens' rights concerning their personal data whenever these are given away. The tangible effects of such technological progress have been the improvement of tools for the retrieval and collection of data, and the increased capability for storage and aggregation of the collected information. This can be interpreted positively in terms of much greater and better opportunities for the development of personality, making available information that was previously inaccessible. However, the same technical tools can be used to achieve the opposite result: preventing the expression of users' personality through a continuous, though imperceptible, control. A recent example that illustrates the market dynamics and the legal reactions concerning data protection is the search engine Google, which received a brief but significant letter from the Art 29 Working Party due to the low level of protection assured by the Mountain View company in the delivery of its services. The intervention has triggered a positive process towards an improvement of the data protection policies of search engines, so as to achieve the level required by European legislation.
1. Introduction
The Internet and modern communication technologies in general have radically modified contemporary society, bringing new risks for citizens' privacy. In the last few decades, an impressive number of innovations and technological advances have been produced, so that communication and dissemination of information can now rely on a global network infrastructure. These developments raise a demand for a more precise and uniform data protection legal framework, which could spell out citizens' rights concerning their personal data whenever these are given away. The tangible effects of such technological progress have been, on the one hand, the improvement of tools for the retrieval and collection of data and, on the other hand, the increased capability for storage and aggregation of the collected information1. This can be interpreted positively in terms of much greater and better opportunities for the development of personality, making available information that was previously inaccessible (due to the high cost or effort needed to access it). However, the same technical tools can be used to achieve the opposite result: preventing the expression of users' personality through a continuous, though imperceptible, control that could shift the interpretation of user profiles from a pre-judgment into a prejudice.2 Another difficulty is the lack of awareness concerning the amount of personal information that any user sprinkles around through modern technologies. A clear example is the so-called social networks, where users, once registered, make their personal data public in order to establish a set of contacts with others who went to the same schools or universities (as in facebook.com3), who work in the same sector or firm (as in linkedin.com4), or even in order to promote themselves (as in myspace.com5). In all these cases, users make personal information, including sensitive data, available to the entire circle of registered users.
This behaviour can simplify the collection and elaboration of users' profiles, also giving leeway to secondary use by third parties, who can take advantage of such information for behavioural targeting or, more dangerously, for identity theft6.
1 M. Castells, The Internet Galaxy: Reflections on the Internet, Business, and Society, OUP, Oxford, 2001.
2 "The autonomy taken into consideration by the Law is not the liberty of an isolated individual but of a member of a free society and precisely able to bring his or her original input within a democratic society. The value of self-determination is fundamentally to be viewed as a condition for ensuring a free and democratic society in which each individual feels free to express him or herself without fearing to be judged by private or public authorities on the basis of information collected and/or processed by them", in Y. Poullet and J.M. Dinant, Towards new Data protection principles in a new ICT environment, IDP. Revista d'Internet, Dret i Política, 5, 2007, 3. See also R. Pardolesi, Dalla riservatezza alla protezione dei dati: una storia di evoluzione e discontinuità, in R. Pardolesi (ed.), Diritto alla riservatezza e circolazione dei dati personali, Giuffrè, Milano, 2003, 20.
3 See http://www.facebook.com. On this issue see I. Brown, L. Edwards and C. Marsden, Stalking 2.0: privacy protection in a leading social networking site, paper presented at the conference GiKii returns, University College London, 19 September 2007, available at http://www.law.ed.ac.uk/ahrc/gikii/docs2/edwards.pdf, and more recently L. Edwards and I. Brown, Data control and social networking: irreconcilable ideas?, available at http://ssrn.com/abstract=1148732.
4 See http://www.linkedin.com.
5 See http://www.myspace.com.
From a legal point of view, different solutions have been put forward, descending from different approaches. On the one hand, we can observe the case of self-regulation, where technology itself can help to limit the aforementioned risks for personal data, as in the cases of the Platform for Privacy Preferences (P3P) or the Platform for Internet Content Selection (PICS)7. On the other hand, we can take the example of the legislative harmonisation implemented by the Member States in the EU, where the monitoring activity is carried out by independent authorities, the so-called Data Protection Authorities8. Such different approaches show the difficulty of balancing the need for protection claimed by users with the interest in the free flow of data9. The solutions, however, can also be limited by technology, because the collection of data is often connected to the normal functioning of the net itself10. A recent example that illustrates the market dynamics and the legal reactions concerning data protection is the search engine Google, which received a brief but significant letter from the Art 29 Working Party (hereinafter Art 29 WP)11 due to the low level of protection assured by the Mountain View company in the delivery of its services. The intervention, though not binding, has been the first step for Google towards an improvement of its data protection policy, so as to achieve the level required by European legislation.
6 G. Maccaboni, La profilazione dell'utente telematico fra tecniche pubblicitarie on-line e tutela della privacy, Riv. Dir. Inf. e Informatica, 2001, 425; C. D'Agata, Self e strict regulation: il trattamento dei dati personali nell'approccio pluridisciplinare di tutela introdotto dal codice della privacy, Riv. Dir. Inf. e Informatica, 2004, 883; L. Edwards and G. Howells, Anonymity, consumers and the Internet: where everyone knows you're a dog, in C. Nicoll, J.E.J. Prins, M.J.M. van Dellen (eds.), Digital Anonymity and the Law: Tensions and Dimensions, T.M.C. Asser, The Hague, 2003, 221.
7 L. Edwards and G. Howells, Anonymity, consumers and the Internet, cit., 240.
8 See Directive 95/46/EC on the protection of personal data, 23 November 1995, OJ L 281/31.
9 E. Pellecchia, Scelte contrattuali e informazioni personali, cit., 14.
10 See the example of cookies, which are capable of storing a large amount of information concerning the surfing activity of each user, but are also indispensable in order to accelerate the download of web pages, reducing connection costs. See also recital 25 of Directive 2002/58/EC: "such devices, for instance so-called cookies, can be a legitimate and useful tool, for example, in analysing the effectiveness of website design and advertising, and in verifying the identity of users engaged in on-line transactions. Where such devices, for instance cookies, are intended for a legitimate purpose, such as to facilitate the provision of information society services, their use should be allowed on condition that users are provided with clear and precise information in accordance with Directive 95/46/EC about the purposes of cookies or similar devices so as to ensure that users are made aware of information being placed on the terminal equipment they are using".
11 The Working Party was established by Article 29 of Directive 95/46/EC. It is the independent EU Advisory Body on Data Protection and Privacy. Its tasks are laid down in Article 30 of Directive 95/46/EC and in Article 14 of Directive 97/66/EC. The Working Party was set up to achieve several primary objectives: to provide expert opinion from Member State level to the Commission on questions of data protection; to promote the uniform application of the general principles of the Directives in all Member States through co-operation between data protection supervisory authorities; to advise the Commission on any Community measures affecting the rights and freedoms of natural persons with regard to the processing of personal data and privacy; and to make recommendations to the public at large, and in particular to Community institutions, on matters relating to the protection of persons with regard to the processing of personal data and privacy in the European Community.
If the main profits to be earned from Internet users were initially connected with the provision of access to the network itself, better results are now available through the creation and functioning of search engines. One of the most important, best known and most used is Google12, which has recently reached 8 billion indexed pages. The algorithm that underlies the current search engine was originally created by two young university students in 1998, in order to define a set of variables that could help the selection of information in relation to any specific keyword. Nowadays the main objective of the firm is to organise information at a global level and make it universally available and accessible13. This has recently been confirmed by the chief executive of Google, Eric Schmidt, in an interview14 in which he clearly stated that the next step in Google's development will be to match the right answer with the right user, also for questions like "What shall I do tomorrow?" or "What job should I choose?". This would mean a shift from the former selection of possible outcomes for users' queries to the provision of personalised answers. This evolution should not be interpreted as a fascinating utopia, but rather as a potential reality that could be developed as soon as the mathematical algorithms drafted by Google's staff improve, and obviously once the personal data provided by users are included among the variables too. As a matter of fact, in order to achieve such an ambitious result, one of the main bases will still be the availability of users' profiles15. Google provides many services in a personalised manner: iGoogle offers users a personalised webpage, including the possibility to publish user-made content16; Google Web History offers the collection of past searches (plus their chronology, results, and selected pages)17.
A more interesting example is Gmail, a webmail service presented in 2004 which offers the great advantage of an online storage capacity of more than 6 gigabytes18. This service, like those mentioned above, is provided for free; however, what is not explicitly stated is that there is an exchange: the service is paid for through access to the content of users' emails. In other words, Google included in the terms of service a clause stating that an automated program19 will read users' emails (at the moment these are opened on the
12 13
See also that the name Google is connected to this mission, as it comes from "googol", the term that indicates the number 1 followed by 100 zeros. In the presentation pages it is stated that "Google's play on the term reflects the company's mission to organize the immense amount of information available on the web", see http://www.google.it/intl/en/corporate/index.html.
14 Available at http://www.ft.com/cms/s/2/c3e49548-088e-11dc-b11e-000b5df10621.html.
15 One of the main elements highlighted by the chief executive was that Google "cannot even answer the most basic questions because we don't know enough about you". That is the most important aspect of Google's expansion.
16 17
18 In 2004, the storage capacity was 1 gigabyte, and this quantity has increased progressively; see the project at http://mail.google.com/mail/help/intl/en/about_whatsnew.html.
19 "Computers scan the text on a given page, perform a mathematical analysis on it and match it to ads in our extensive database. No humans are involved in this process and no one reads your mail. You may find that hard to believe when you see how closely the ads match the topic of your message", see http://mail.google.com/mail/help/intl/en/why_gmail.html.
screen) and, having recognised the keywords in the text, will display "small, unobtrusive and relevant" advertisements20 on the right side of the screen. What emerges from this picture is that Google has already started to provide services in which information is selected through the lens of users' preferences and tastes. Although this is not completely new, as monitoring activities have always existed in the past, particularly by peers, it is worth noting that, nowadays, the means to store and organise information have improved, eliminating the imperfections of human memory and increasing the number and quality of aggregated results21. Profiling, if lawfully compiled, can indeed provide advantages for users, who will receive correct and personalised information; however, if this processing is done in a hidden way, or disguises its real objective, then the risks are those of manipulation of users' behaviour22 and of the loss of substantial equality among users23. Thus, it is necessary to verify the terms and conditions of the data processing in order to evaluate whether it is proportionate to its objectives.
20 "By offering Gmail users relevant ads and information related to the content of their messages, we aim to offer users a better webmail experience. For example, if you and your friends are planning a vacation, you may want to see news items or travel ads about the destination you're considering. […] Many people have found that the search-related ads on Google.com can be valuable--not merely a necessary evil, but a welcome feature. We believe that users will also find Gmail's ads and related pages to be helpful, because the information reflects their interests. In fact, we have already received positive feedback from Gmail users about the quality and usefulness of our ads and related pages", ibidem.
21 However, it is necessary to stress the fact that "in real world transactions, privacy/anonymity is the norm, and the consumer has the chance to explicitly choose to disclose more personal information […] whereas in on-line transactions, it is generally more true to say that disclosure of personal information is the norm, and the consumer has to take explicit action to protect their privacy", L. Edwards and G. Howells, Anonymity, Consumers and the Internet: Where everyone knows you're a dog, cit., 209.
22 See E. Pellecchia, Scelte contrattuali e informazioni personali, Giappichelli, Torino, 2005, XIII, where the Author observes that the growing use of automated personal profiles for decision-making purposes, for instance in assessing the requirements of prospective contracting parties, adequately meets the need for speed and standardisation in contracting, but also raises problems of no small significance: the selection of the contracting party, or the provision of particular contractual conditions, is often the result of the (sole) comparison between the concrete profile and a standard reference profile taken as a parameter for the evaluation of certain qualities.
23 See A. Fici and E. Pellecchia, Il consenso al trattamento, in R. Pardolesi (ed.), Diritto alla riservatezza e circolazione dei dati personali, Giuffrè, Milano, 2003, vol. I, 470, and, less recently, S. Rodotà, Persona, riservatezza, identità. Prime note sistematiche sulla protezione dei dati personali, Riv. crit. dir. priv., 1997, 605.
24 Available at http://www.google.it/privacypolicy.html.
In this case, the information concerns the name and email address needed to register for services, or the credit card number for pre-paid services; the privacy policy nevertheless adds that: "We may combine the information you submit under your account with information from other Google services or third parties in order to provide you with a better experience and to improve the quality of our services. For certain services, we may give you the opportunity to opt out of combining such information". Thus, in some cases users can limit the coordination and organisation of their own data, but generally the data collection is carried out without any request for consent. In particular, in the case of cookies or server logs, such collection is carried out surreptitiously25, storing all the information concerning online activity; Google's privacy policy justifies this by affirming that "We use cookies to improve the quality of our service by storing user preferences and tracking user trends, such as how people search. Most browsers are initially set up to accept cookies, but you can reset your browser to refuse all cookies or to indicate when a cookie is being sent". Cookies are therefore clearly set so as to gather as much information as possible on users' preferences and online behaviour, in order to use it to improve and develop new search tools. From a market perspective this is perfectly acceptable; however, as the data subjects are not always aware of such processing, it may not be deemed lawful. The Art 29 WP pointed exactly to these issues in its letter to Google26, where it indicated that: "Google has so far not sufficiently specified the purpose for which server logs need to be kept, as required by article 6 (1) (e) of Data protection Directive 95/46/EC. Taking into account Google's market position and ever-growing importance, the Article 29 Working Party would like further clarification as to why this long storage period was chosen".
The Art 29 WP, referring to the provisions of Directive 2002/58/EC on privacy and electronic communications, namely Art 5(3), underlines that in the case of cookies the purpose of their processing "should be made clear to the user in a clear and comprehensible manner"; whereas the Google cookie has a validity of approximately thirty years, which is disproportionate with respect to the purpose of the processing performed and goes beyond what seems to be strictly necessary for the provision of the service.
25 Obviously, this is not true when the user decides to set the navigation preferences in the browser so as to receive a notification whenever a website asks to install a cookie on the computer, possibly requiring his or her consent to such installation.
26 The WP reminded Google that "[a]lthough Google's headquarters are based in the United States, Google is under legal obligation to comply with European laws, in particular privacy laws, as Google's services are provided to European citizens and it maintains data processing activities in Europe, especially the processing of personal data that takes place at its European centre".
Google's reaction came in stages: firstly, on 10 June 2007, Google's privacy counsel announced the new policy: to anonymise the search server logs after eighteen months, rather than the previously established period of eighteen to twenty-four months27. It was, however, a conditional concession, as the privacy counsel stated that this period would have to be extended again in order to comply with the Data retention Directive 2006/24/EC28. A second step occurred on 16 July 2007, when, following the increased interest of consumers in privacy issues, Google published its new policy concerning cookies: each Google cookie will expire after two years, though this duration is calculated from the moment of the last use of Google's services. Thus, any time the user opens a Google webpage, a new expiration date is set. This solution changes little for Google, as regular users will push the expiration date further and further away. Moreover, this solution places on the user the choice of whether or not to allow the collection of data through cookies. In any case, this cannot be interpreted as consent to the data processing, which should be well informed and freely given, i.e. a voluntary choice made by the user. In this case, instead, the choice between using search engines under current policies and forgoing use altogether is not a choice at all, as the services would not be provided in case of denial of consent29.
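The sliding-expiration mechanism described above can be sketched in a few lines of code. The snippet below is a hypothetical illustration (the cookie name and the helper function are invented for the example; Google's actual implementation is not public): it simply shows why a two-year cookie whose expiry is recomputed at every visit never actually expires for a regular user.

```python
from datetime import datetime, timedelta
from http.cookies import SimpleCookie

COOKIE_LIFETIME = timedelta(days=2 * 365)  # the announced two-year validity


def refresh_cookie(cookie_id: str, now: datetime) -> SimpleCookie:
    """Re-issue the tracking cookie with an expiry two years from `now`.

    Because the deadline is recomputed on every request, each visit
    silently pushes it two more years into the future.
    """
    cookie = SimpleCookie()
    cookie["PREF"] = cookie_id  # "PREF" is used purely as an example name
    cookie["PREF"]["expires"] = (now + COOKIE_LIFETIME).strftime(
        "%a, %d %b %Y %H:%M:%S GMT"
    )
    return cookie


# A user who visits in July 2007 gets a cookie valid until 2009;
# a second visit in July 2008 extends it to 2010.
first = refresh_cookie("abc123", datetime(2007, 7, 16))
second = refresh_cookie("abc123", datetime(2008, 7, 16))
```

As long as the interval between two visits stays below two years, the cookie is effectively permanent, which is why the policy change was criticised as largely cosmetic.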
27 It should be noted that, in January 2007, the Datatilsynet, the Norwegian Data Protection Authority, started an investigation of the search engines operating on its national territory, verifying the quantity of data collected and their anonymity. The Norwegian Authority, after national providers such as Sesam and Kvasir, also approached Google, because it operated with users resident in Norway. The Authority challenged Google over the retention of users' search histories (within the server logs) for up to two years before anonymising them. After an initial refusal to cooperate, in March 2007 Google's privacy counsel announced a reduction of the data retention period in server logs to eighteen to twenty-four months.
28 Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC, OJ 13 April 2006, L 105/54.
29 See O. Tene, What Google knows: privacy and internet search engines, available at http://ssrn.com/abstract=1021490: "Not using Google means not participating in today's information society. It is tantamount to never using a telephone, not riding a car, or residing in a secluded cabin in the woods. Google has become ubiquitous, practically a public utility. 'Consent' is illusory where it is given (implicitly) by a captive audience to an agreement few if any users have ever read, which includes provisions that the vast majority of users are not aware of, that were unilaterally drafted to serve corporate interests. A privacy protection regime based on such consent provides no privacy protection at all".
30
provide a unified approach towards privacy issues. Moreover, Microsoft's privacy counsel, Peter Cullen, announced that his company too would anonymise its server logs after eighteen months, following Google; Yahoo! went further and reduced the timeframe to thirteen months, while Ask.com announced that it would provide users with a tool to erase completely the search activity carried out on its website. The framework described above shows how market actors, Google's competitors, reacted to the changed service conditions proposed by Google, simply by adjusting their own. This was due not to the shame of receiving a letter of disapproval from the Art 29 WP, but rather to the wish not to lose those users who are privacy-sensitive and who could have perceived the changes made by Google as a peerless improvement of the service31. On the other hand, the European debate on privacy in electronic communications has also raised the attention of public opinion and institutions on the other side of the Atlantic. In November 2007, the Federal Trade Commission (FTC) proposed a dialogue among all the stakeholders concerning the issue of behavioural targeting, asking for comments and suggestions concerning the opportunity to define a set of guidelines for self-regulation on the issue32. Answering this call, the Network Advertising Initiative (NAI), a consortium involving the main companies working with behavioural targeting33, declared that it would present updated principles of self-regulation, taking into account both the FTC proposal and the technical developments that had appeared after 2000 (when the principles were initially drafted)34. Here, public opinion promoted an awareness-raising process, which triggered an effort towards a dialogue among all the stakeholders and the federal institution so as to achieve, through the tool of self-regulation, a balance between users' right to privacy and the companies' interest in exploiting new technologies economically.
31 This could be defined by law & economics studies as a case of self-regulation of the market advantageous for consumers. See A. Schwartz and L.L. Wilde, Intervening in markets on the basis of imperfect information: A legal and economic analysis, 127 U. Pa. L. Rev. 630, spec. 638; H. Beales, R. Craswell and S.C. Salop, The Efficient Regulation of Consumer Information, Journal of Law and Economics, 1981, 503.
32 See the document available at http://www.ftc.gov/opa/2007/12/principles.shtm.
33 Among the members of the consortium there are also AOL, Yahoo!, and Google.
34
35 Opinion on data protection issues related to search engines, adopted on 4 April 2008, WP 148, available at http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2006/wp148_en.pdf (hereinafter WP 148).
protection rules on foreign controllers of personal data37 due to the application of Art 4 of the Data protection Directive and, finally, the denial of the application of Directive 2006/24/EC on data retention. Concerning the second point, Art 4 of the Data protection Directive provides that a Member State's data protection law applies when certain operations of personal data processing are carried out in the context of the activities of an establishment of the controller on the territory of that Member State. For those search engines whose headquarters are located outside the European Economic Area (EEA), applicability depends on whether the processing of user data involves an establishment on the territory of a Member State. An alternative criterion is the use of equipment, automated or otherwise, situated on the territory of the Member State, unless such equipment is used only for purposes of transit through the territory of the Community. The Art 29 WP interprets this provision very widely, treating cookies provided by any ISP as technical equipment38. This interpretation can certainly simplify the application of the Data protection Directive: almost every ISP, not only search engines, uses cookies in order to improve connection speed and to obtain navigation information from users; thus each ISP can be defined as a controller of data processing and obliged to comply with European legislation. This could have a double effect: on the one side, it would lead to an implicit global application of the Data protection Directive, irrespective of the origin of the ISPs, thus posing obvious difficulties for compliance control; on the other side, such a wider field of application would be the outcome of an extended meaning of "establishment", diverging from the one currently applied in contract or commercial law39.
Concerning the application of the Data retention Directive, Art 6 of the directive provides an obligation to retain communication data for between six and twenty-four months, which is imposed only where data are generated or processed, and stored (as regards telephony data) or logged (as regards Internet data), by providers of publicly available electronic communications services or of a public communications network within the jurisdiction of the
36 See WP 148, p. 6. Concerning the role of search engines as concept creators, see N. Elkin-Koren and E.M. Salzberger, Law, Economics and Cyberspace: The effects of cyberspace on the economic analysis of law, Edward Elgar, Cheltenham, 2004.
37 "A search engine provider that processes user data including IP addresses and/or persistent cookies containing a unique identifier falls within the material scope of the definition of the controller, since he effectively determines the purposes and means of the processing".
38 This issue was also discussed earlier in the working document on determining the international application of EU data protection law to personal data processing on the Internet by non-EU based web sites, WP 56, available at http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2002/wp56_en.pdf. It stated that "the user's PC can be viewed as equipment in the sense of Article 4 (1) c of Directive 95/46/EC. It is located on the territory of a Member State. The controller decided to use this equipment for the purpose of processing personal data and, as it has been explained in the previous paragraphs, several technical operations take place without the control of the data subject. The controller disposes over the user's equipment and this equipment is not used only for purposes of transit through Community territory".
39 See J. Catchpole, The regulation of electronic commerce: A comparative analysis of the issues surrounding the principles of establishment, International Journal of Law and Information Technology, 2001, 1.
Member State concerned in the process of supplying the communication services concerned40. The obligation thus falls only on Internet Service Providers and not on search engines. Moreover, the data to be retained are only the categories specified in Article 5, in particular the data necessary to trace and identify the source of a communication41, the data necessary to identify the destination of a communication42, the data necessary to identify the date, time and duration of a communication43, the data necessary to identify the type of communication44, and the data necessary to identify users' communication equipment or what purports to be their equipment45, excluding any reference to the content of the communication, such as the content of users' searches46. The processing of personal data can, however, be lawfully carried out when its purpose and/or grounds are clearly stated by search engines. The Art 29 WP analyses each justification provided, concluding that the majority of them are not acceptable, or at least do not legitimately justify the storage of data that has not been anonymised. The suggestions proposed are a reduced retention period for search engines (not beyond six months), a limitation of processing for multiple purposes (usually aimed at the development of new services whose nature is as yet undecided), and a limitation on data correlation across services. In general, the evaluation of the Art 29 WP does not provide many hints to the industry, apart from some negative proposals. This eagerly awaited opinion thus does not meet the expectations of institutional and market actors, whereas it could have provided, if not positive proposals, at least the best practices currently available in the sector, so as to show a better path towards an effective level of data protection.
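The kind of log anonymisation discussed here, and advocated by the Art 29 WP, can be illustrated with a short sketch. The function below is a hypothetical example, not the actual procedure of Google or any other search engine: it zeroes the last octet of the IPv4 address, replaces the cookie identifier with a salted one-way hash, and truncates the raw query text.

```python
import hashlib
import re


def anonymise_log_entry(ip: str, cookie_id: str, query: str, salt: bytes) -> dict:
    """Illustrative anonymisation of one server-log record.

    - the last octet of the IPv4 address is zeroed, so the entry can no
      longer single out one machine on its subnet;
    - the cookie identifier is replaced by a salted one-way hash, so the
      record can no longer be linked back to the original cookie;
    - only a short prefix of the query string is kept.
    These choices are hypothetical, chosen for the example only.
    """
    truncated_ip = re.sub(r"\.\d{1,3}$", ".0", ip)
    hashed_cookie = hashlib.sha256(salt + cookie_id.encode()).hexdigest()[:16]
    return {"ip": truncated_ip, "cookie": hashed_cookie, "query": query[:20]}


record = anonymise_log_entry(
    "192.0.2.175", "PREF=abc123", "best offers for dog cookies", b"per-batch-salt"
)
```

Even logs treated this way may remain re-identifiable when the queries themselves contain personal details, which is one reason the WP also proposed a retention ceiling rather than relying on anonymisation alone.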
40
41 Those concerning Internet access, Internet e-mail and Internet telephony are: the user ID(s) allocated; the user ID and telephone number allocated to any communication entering the public telephone network; the name and address of the subscriber or registered user to whom an Internet Protocol (IP) address, user ID or telephone number was allocated at the time of the communication.
42 I.e. the user ID or telephone number of the intended recipient(s) of an Internet telephony call; the name(s) and address(es) of the subscriber(s) or registered user(s) and the user ID of the intended recipient of the communication.
43 I.e. the date and time of the log-in and log-off of the Internet access service, based on a certain time zone, together with the IP address, whether dynamic or static, allocated by the Internet access service provider to a communication, and the user ID of the subscriber or registered user; the date and time of the log-in and log-off of the Internet e-mail service or Internet telephony service, based on a certain time zone.
44
45 I.e. the calling telephone number for dial-up access; the digital subscriber line (DSL) or other end point of the originator of the communication.
46 See also Art 5(2) of Directive 2006/24/EC. The Art 29 WP, in a recent opinion (n. 3/2006) on the Data retention Directive, stated similarly that "[t]he data to be retained should be kept to a minimum, and any changes to that list should be subject to a strict necessity test"; moreover, "[p]roviders of public electronic communication services or networks are not allowed to process data retained solely for public order purposes under the Data Retention Directive for other purposes, especially their own". See WP 119, 25 March 2006, available at http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2006/wp119_en.pdf.
4. Conclusion
Since 2007, a debate has animated both sides of the Atlantic, involving public institutions, industry and consumers, and has brought better awareness of the risks for privacy in online services. Not only have users become aware that the sense of anonymity and security with which they perceive their surfing activity is illusory; they are now also conscious that search engines build up their queries, bit by bit, into a rich personal profile.
47 See Online Behavioral Advertising: Moving the Discussion Forward to Possible Self-Regulatory Principles, available at http://www.ftc.gov/os/comments/behavioraladprinciples/080411microsoft.pdf.
48 A similar proposal concerning multi-layered notices to be provided to the data subjects was put forward by the Art 29 WP in a previous opinion (Opinion on More Harmonised Information Provisions, 25 November 2004, WP 100).
49 As a result, service providers would always have to comply with the higher-level obligations in order to be sure that the advertising activity is lawful.
50 Paraphrasing the old cartoon about the Internet: is it really true that "on the Internet nobody knows you're a dog" if your searches concern the best offers for dog biscuits, the available vets, how to become a member of a man's-best-friend association, and so on?
51 Art 29 WP, Opinion 4/2007 on the concept of personal data, 20 June 2007, WP 136.
Therefore, what started as a non-binding recommendation from the Art 29 WP has triggered a deeper analysis of search engine activity concerning users' personal data, imposing at the same time a better and wider cooperation at European and global level. However, the question that remains unanswered is whether a complete privatisation of Internet governance is acceptable (leaving the choices concerning user protection and security to market actors), or whether a better dialogue on common rules and principles should be established in order to provide a genuine multi-stakeholder protection, in which neither the interests of users are under-protected nor the needs of industry unheeded. What is obvious, then, is that the protection of personal data remains an unsolved issue at the core of online surfing activity.