GC & others vs CNIL and Google: This is a special case

On 24 September 2019, the Grand Chamber of the Court of Justice (hereafter: ECJ) released its judgment in the second of two cases in as many weeks concerning the ‘Right to be forgotten’. The first, Google v CNIL, tackled the territorial scope of the right. In the second, GC, AF, BH, and ED v Commission nationale de l’informatique et des libertés (CNIL), Premier ministre, and Google LLC (hereafter: GC), the Court tackled a request for a preliminary ruling after the French data protection authority (CNIL) refused to require Google to dereference various links to third-party websites in the lists of results displayed following searches of the applicants’ names.

A claimant known as GC wanted a link to a satirical photomontage depicting her in an illicit relationship with a politician removed from Google’s search returns. AF wanted search results removed that identified him as a public relations officer for the Church of Scientology, a position he no longer held. BH wanted the de-indexing of articles linking him to contemporaneous investigations into the funding of political parties without revealing their outcomes. ED had requested the de-indexing of articles that mentioned a prison sentence of seven years and ten years’ judicial supervision for sexual assaults on children under the age of 15. The common thread among all of the parties was that the links included special categories of personal data within the meaning of Article 8(1) and (5) of the now repealed Data Protection Directive 95/46/EC (similar provisions can now be found in Article 9 of the European Union’s General Data Protection Regulation).

CNIL denied these requests and closed its files. The parties brought an action before the Conseil d’État asking the Court to serve notice on Google to carry out the dereferencing requests. The Court admitted to struggling with the interpretation of special categories of data under Article 8(1) and (5) of Directive 95/46/EC. The first question asked the ECJ to determine whether the general prohibition on processing sensitive data applied to a search engine:

“Having regard to the specific responsibilities, powers and capabilities of the operator of a search engine, does the prohibition imposed on other controllers of processing data caught by Article 8(1) and (5) of Directive 95/46, subject to the exceptions laid down there, also apply to this operator as the controller of processing by means of that search engine?”

Article 8 of Directive 95/46/EC contains the prohibition on the processing of special categories of data (now Article 9 GDPR). Simply put, the referring court wanted to know whether Google is a data controller for the special categories of data it processes. If the answer is yes, then must Article 8(1) and (5) of the Directive be interpreted in a way that requires search engines, subject to exceptions in the Directive, to grant the requests for de-indexing links to web pages that contain that type of data?

Background

Before we go on to discuss the case further, some background is needed. The ECJ held in Google Spain (Case C-131/12), discussed previously here and here, that search engines are data controllers under Article 2 of the Data Protection Directive. In European data protection law, any actor that falls within the personal scope of the law is still prohibited from processing personal data unless it satisfies one of a handful of legal grounds. In Google Spain, the Court found that Google, when acting as a search engine, was processing personal data in its legitimate interest, one of the recognised grounds found in Article 7 of the Directive.

However, in Google Spain, the Court was not asked to determine whether search engines also processed special categories of personal data. According to Article 9 of the GDPR and Article 8(1) and (5) of the Data Protection Directive, a data controller must not only have a legal basis for processing, but must also fulfil one of the listed exceptions in order to circumvent the prohibition on processing sensitive data. Surprisingly, the question of which exception Google has been relying on has never been answered.

GC and Others

In GC, the first question was answered affirmatively by the Court at para. 48:

“It follows from the above that the answer to Question 1 is that the provisions of Article 8(1) and (5) of Directive 95/46 must be interpreted as meaning that the prohibition or restrictions relating to the processing of special categories of personal data, mentioned in those provisions, apply also, subject to the exceptions provided for by the directive, to the operator of a search engine in the context of his responsibilities, powers and capabilities as the controller of the processing carried out in connection with the activity of the search engine, on the occasion of a verification performed by that operator, under the supervision of the competent national authorities, following a request by the data subject.”

The Court then goes on to answer the second of several questions posed in the preliminary reference. This question asks whether a data controller like Google must comply with a delisting request, subject to the exceptions provided for by the Directive, when the web pages contain special category personal data referred to by those provisions:

“Must Article 8(1) and (5) of Directive 95/46 be interpreted as meaning that the prohibition so imposed on the operator of a search engine of processing data covered by those provisions, subject to the exceptions laid down by that directive, would require the operator to grant as a matter of course the requests for de-referencing in relation to links to web pages concerning such data? From that perspective, how must the exceptions laid down in Article 8(2)(a) and (e) of Directive 95/46 be interpreted, when they apply to the operator of a search engine, in the light of its specific responsibilities, powers and capabilities? In particular, may such an operator refuse a request for de-referencing, if it establishes that the links at issue lead to content which, although comprising data falling within the categories listed in Article 8(1), is also covered by the exceptions laid down by Article 8(2) of the directive, in particular points (a) and (e)?” (para. 31)

The Court started its justification with a reminder that Article 52(1) of the European Union’s Charter of Fundamental Rights (hereafter: Charter) permits limitations on the exercise of rights, as long as they are provided for by law, respect the essence of the rights and freedoms found in the Charter, and are proportional, especially when balanced against other rights. The erasure right now found in Article 17(1) of the GDPR (“the right to be forgotten”) does not apply when processing is necessary for one of the grounds found in Article 17(3)(a) of the Regulation – one of which is the exercise of the right of expression and information, guaranteed by Article 11 of the Charter.

The Court then examined the conditions under which a search engine is required to accede to a de-indexing request to delete links to web pages which contain special category personal data, following a search for the data subject’s name. Of note is the Court’s reference to Article 14 of the Data Protection Directive, which provides data subjects with the legal basis to obtain the erasure of data that does not comply with the Directive. Furthermore, the Court refers to instances (where the processing is based on Article 7(e) or (f)) in which a data subject can object at any time, on compelling legitimate grounds, to the processing of personal data relating to him or her, unless otherwise provided for in national legislation.

At Paragraph 52, the Court lays out its reasoning for determining that a search engine is “obliged to remove from the list of results displayed following a search on the basis of a person’s name links to web pages, published by third parties… even, as the case may be when its publication in itself…is lawful.”

Consequences for Google?

Beyond the additional protections for data subjects, this also seems to be a subtle way of suggesting that Google’s business may not have complied with the EU’s data protection regime. As mentioned above, in order for Google to process special categories of personal data, it must also rely on one of the exceptions under Article 9(2) of the GDPR. As any search engine has different processing goals from the original controllers (the hosts publishing the data), Google cannot generally ‘piggyback’ on the same legal basis for processing as the original publisher. It needs to find its own ground for processing.

However, what is glossed over is whether there is an exception to the prohibition of processing under EU or national law upon which Google can rely. This can likely be attributed to the phrasing of the second question by the referring court. In other words, the Court was not asked whether any of the exceptions to processing apply to Google, but rather when they apply, and whether, if they did apply, that would preclude honouring a de-referencing request. So, the main question, whether Google can rely on the exceptions to the prohibition of processing under Articles 9 and 10 GDPR, remains unanswered.

We will briefly run through all of the exceptions and their applicability to Google. The exceptions under the GDPR are:

  1. Consent (Article 9(2)(a))> As consent is not given, or in any case not given to Google, it is not applicable in most cases. Furthermore, the fact that a data subject is making a de-indexing request means they are withdrawing any consent that may have been provided. 
  2. Obligations in the field of employment law, etc. (Article 9(2)(b))> not applicable[1]
  3. Vital interests of the data subject (Article 9(2)(c))> not applicable
  4. By a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim and on condition that the processing relates solely to the members (Article 9(2)(d)) > not applicable
  5. Manifestly made public by the data subject (Article 9(2)(e))> not applicable in most cases
  6. Necessary for exercising a legal right (Article 9(2)(f)) > not applicable
  7. Necessary for reasons of substantial public interest (Article 9(2)(g))> possibly applicable
  8. Preventive medicine etc. (Article 9(2)(h))> not applicable
  9. Public interest in health (Article 9(2)(i))> not applicable
  10. Archiving/research (Article 9(2)(j))> not applicable unless you reinterpret ‘archiving in the public interest’

Therefore, the only exception that Google could plausibly rely on is Article 9(2)(g) of the GDPR: that Google is processing special category data for reasons of substantial public interest:

“Processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject.”

But what is a “substantial public interest”? And what are the consequences of Google processing special categories of personal data without hitting this threshold?

Based on our brief assessment, any defence that processing can be legitimised under Article 9(2) is shaky. This leaves Google with the more general exception of the freedom of expression and information. This is provided for in Article 85 of the GDPR which requires Member States to provide exemptions and derogations to Chapter II (which includes Article 9) in order to protect the freedom of expression and information. Indeed, it is the right to freedom of expression that is weighed against the rights of the data subject in de-indexing cases. But whether Google can rely by default on the freedom of expression to process special categories of personal data remains uncertain.

The GC judgment does shed some light on this question, which may have far-reaching consequences for Google and the public’s right to access information. At first reading, the concept of informational self-determination has superseded the principles of media freedom and open justice, in spite of the Court’s continued references to the balancing test and the need to take freedom of information into consideration.

Conclusion

As with Google Spain, it does not take a stretch of the imagination to see that there are potentially serious implications for the search giant. By stating that Google is processing special categories of data, the Grand Chamber has put Google in an awkward position. Google will have to rely either on Article 85 GDPR or substantiate that it has a ‘substantial public interest’ in processing special categories in each case, lest its processing be incompatible with the GDPR. In theory, Google should conduct a balancing test prior to all processing, something that is clearly incompatible with its business model.

The Court acknowledges in GC that provisions of the GDPR are to be interpreted in light of the Charter. Article 11 of the Charter is clearly engaged and, in all instances, Article 52 requires balancing. If the effects of this ruling undermine Google’s business model, it could claim an infringement of Article 16, the freedom to conduct a business.

[1] For the sake of brevity we use ‘not applicable’. While in some cases an exception could be used, it will not be in the majority of cases, hence the use of ‘not applicable’.