In an earlier report on social media, I touched on the May 2014 decision of the Court of Justice of the European Union in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos (AEPD), Mario Costeja González (C-131/12), which effectively established in the European Union what is now known as the ‘right to be forgotten.’
Google responded to the decision by setting up a process enabling people to request the removal of certain search results. According to Google’s latest transparency report, to date it has received 318,269 ‘right to be forgotten’ requests, resulting in the evaluation of 1,126,518 URLs, and the removal of 41.6% of the flagged URLs.
As a matter of practice, Google only removes such results from searches performed on sites with European Union domain extensions, leaving the search results intact elsewhere.
The original ruling sent shock waves through Google, other internet search companies, and cyberspace generally, and pitted the reverence Americans have for freedom of speech against a European penchant for privacy. It would appear those shock waves continue to reverberate.
In May 2015, the President of the French data privacy regulator, the Commission nationale de l’informatique et des libertés (CNIL), issued a notice to Google requiring it to remove relevant search results from searches not just in the European Union, but worldwide.
While France is the first European country to push for a global application of the ‘right to be forgotten,’ the Article 29 Data Protection Working Party took a similar position in November last year on the issue of cleaning up search results globally, noting it was the only way to guarantee ‘the effective and complete protection of data subjects’ rights and that EU law cannot be circumvented’:
7. Territorial effect of a de-listing decision
In order to give full effect to the data subject’s rights as defined in the Court’s ruling, delisting decisions must be implemented in such a way that they guarantee the effective and complete protection of data subjects’ rights and that EU law cannot be circumvented. In that sense, limiting de-listing to EU domains on the grounds that users tend to access search engines via their national domains cannot be considered a sufficient mean to satisfactorily guarantee the rights of data subjects according to the ruling. In practice, this means that in any case de-listing should also be effective on all relevant domains, including .com.
Guidelines on the implementation of the Court of Justice of the European Union judgment on “Google Spain and Inc v. Agencia Española de Protección de Datos (AEPD), Mario Costeja González” C-131/12, Article 29 Data Protection Working Party (26 November 2014)
In July, Google lodged an informal appeal against the notice, asking for it to be withdrawn and arguing that compliance would impede the public’s right to information and amount to a form of censorship.
On 30 July, Google’s Global Privacy Counsel, Peter Fleischer, responded in detail to the notice on Google’s Europe Blog:
While the right to be forgotten may now be the law in Europe, it is not the law globally. Moreover, there are innumerable examples around the world where content that is declared illegal under the laws of one country, would be deemed legal in others: Thailand criminalizes some speech that is critical of its King, Turkey criminalizes some speech that is critical of Ataturk, and Russia outlaws some speech that is deemed to be “gay propaganda.”
If the CNIL’s proposed approach were to be embraced as the standard for Internet regulation, we would find ourselves in a race to the bottom. In the end, the Internet would only be as free as the world’s least free place.
We believe that no one country should have the authority to control what content someone in a second country can access. We also believe this order is disproportionate and unnecessary, given that the overwhelming majority of French internet users—currently around 97%—access a European version of Google’s search engine like google.fr, rather than Google.com or any other version of Google.
As a matter of principle, therefore, we respectfully disagree with the CNIL’s assertion of global authority on this issue and we have asked the CNIL to withdraw its Formal Notice.
We have worked hard to strike the right balance in our implementation of the European Court’s ruling and have maintained a collaborative dialogue with the CNIL and other data protection authorities, who agree with our decisions in the majority of cases referred to them. We are committed to continuing to work with regulators in this open and transparent way.
While I am sympathetic to the points made by Mr Fleischer, arguably there is a significant difference between protecting personal privacy and reputation on the one hand, and supporting state-enforced censorship on the other.
On 21 September, CNIL rejected Google’s appeal, noting, among other things, that:
- once Google accepts that information should be removed, that removal should be implemented across all domain extensions, not just those in the EU; and
- limiting the ‘right to be forgotten’ to only some domain extensions makes it easy to circumvent, effectively stripping away the right.
The rejection of its appeal by CNIL means Google must comply with the notice or risk sanctions by CNIL, which include fines of up to €150,000, rising to €300,000 for repeat offences.
Under proposed amendments to the current European Data Protection Directive (95/46/EC), those fines could soon increase. The European Commission proposes a maximum fine of €1 million, or 2% of a company’s annual worldwide turnover, while the EU Parliament, wanting to send a message about the great importance it attaches to compliance, proposes raising that maximum to €100 million, or 5% of a company’s annual worldwide turnover.
While Google can’t appeal the notice issued by CNIL, it will be able to appeal any fines levied to France’s supreme court for administrative justice, the Conseil d’État.