The ECJ’s First Landmark Case on Automated Decision-Making – a Report from the Oral Hearing before the First Chamber
Please note: this post discusses the public oral hearing in Case C-634/21 held by the ECJ in Luxembourg on 26 January 2023, which the author attended in person. The summary of the oral hearing as presented here is based on the author’s own observations and notes taken during the hearing.
After decades of existence, the right not to be subject to automated decision-making is finally being considered for the first time before the European Court of Justice (‘ECJ’). On Thursday, 26 January 2023, the First Chamber of the ECJ held its hearing in Case C-634/21, the very first in which it has been asked to interpret Article 22 of the General Data Protection Regulation (‘GDPR’). The latter provision grants data subjects the right “not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. In a very similar form, this right had already been enshrined in Article 15 of the Data Protection Directive. However, the latter provision never ‘made’ it to the ECJ before Article 22 of the GDPR repealed and replaced it.
The relevance of this case goes beyond the GDPR, particularly in view of the coming age of algorithms, in which self-learning algorithms, automated analytics and various forms of scoring are likely to become even more commonplace. Although the judgment has not yet been handed down, this report outlines insights from the oral hearing before the First Chamber in this landmark case. It should not be understood as a thorough analysis of Article 22 GDPR.
The scope of Article 22 GDPR
Three cumulative requirements must be satisfied for Article 22(1) GDPR to apply:
(1) a decision is made, which is (2) based solely on automated processing, including profiling, and (3) produces legal effects concerning the data subject or similarly significantly affects him or her.
The case at hand primarily deals with requirement (1), namely whether the automated establishment of a credit score in itself amounts to an automated decision within the meaning of Article 22(1) GDPR. As mentioned in the introduction, this is the very first case on Article 22 GDPR. It might potentially resolve some debates in academia relating to this provision, for instance, regarding the nature of Article 22(1) GDPR (i.e. is it a prohibition or a right to be invoked?) and whether the GDPR provides for a right to explanation of specific automated decision-making (both discussed further below).
The parties to the case are the applicant OQ (‘data subject’), the defendant Land Hesse, represented by the Hesse Commissioner for Data Protection and Freedom of Information (‘HBDI’), and the joined party SCHUFA Holding AG (‘SCHUFA’). The case concerns an action brought by the data subject against the credit score calculated by SCHUFA in respect of her.
SCHUFA is a private German credit information agency that provides its contractual partners with information on the creditworthiness of consumers. SCHUFA establishes credit scores, where the probability of a person’s future behaviour is predicted on the basis of certain characteristics of that person using mathematical statistical methods. These credit scores are based on the assumption that future behaviour can be predicted by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way in the past.
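SCHUFA's actual method is secret, but the group-based logic described above can be sketched in a few lines of Python. Everything in this sketch, from the characteristics used to the data and helper names, is hypothetical and purely illustrative:

```python
# Illustrative sketch (NOT SCHUFA's actual method): the score of a person
# is the historical repayment rate of past individuals who share that
# person's characteristics, expressed as a percentage.
from collections import defaultdict

def build_score_model(history):
    """history: list of (characteristics, repaid) pairs, where
    characteristics is a hashable tuple and repaid is a bool."""
    totals = defaultdict(lambda: [0, 0])  # group -> [repaid count, group size]
    for characteristics, repaid in history:
        totals[characteristics][0] += int(repaid)
        totals[characteristics][1] += 1

    def score(characteristics):
        repaid, group_size = totals.get(characteristics, (0, 0))
        if group_size == 0:
            return None  # no comparable group observed
        return round(100 * repaid / group_size, 2)

    return score

# Hypothetical past behaviour of comparable persons.
history = [
    (("employed", "homeowner"), True),
    (("employed", "homeowner"), True),
    (("employed", "homeowner"), False),
    (("unemployed", "tenant"), False),
]
score = build_score_model(history)
print(score(("employed", "homeowner")))   # 66.67
print(score(("unemployed", "tenant")))    # 0.0
```

The sketch makes the legally relevant point visible: the score says nothing about the individual's own past conduct, only about the conduct of a group to which the individual has been assigned.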
The data subject was refused credit by a third party after SCHUFA provided the latter with the established credit score. The data subject then requested that SCHUFA provide her with information regarding the data stored and erase what she considered to be incorrect data. SCHUFA subsequently informed the data subject that it had given her a score of 85.96% and described, in broad outline, the basic functioning of its score calculation process. However, SCHUFA did not disclose which individual pieces of information enter into the calculation and with what weighting, arguing that commercial and industrial secrecy relieved it of any obligation to disclose its calculation methods. SCHUFA also stressed that it merely provides its contractual partners with information and does not itself take the actual decisions.
On 18 October 2018, the data subject lodged a complaint with the HBDI, requesting that the latter order SCHUFA to comply with her request for access and erasure. The data subject argued that SCHUFA is obliged to provide information about the logic involved in the calculation of the credit score, as well as the significance and consequences of the processing. By administrative decision of 3 June 2020, the HBDI refused to take further action in respect of SCHUFA, stating that SCHUFA's calculation of the credit score complies with the requirements of German law, specifically §31 of the Bundesdatenschutzgesetz (‘BDSG’). That national provision establishes rules on scoring as a sub-category of profiling, and its permissibility under EU law is itself part of the procedure before the ECJ (see the second referred question below).
Ultimately, the Administrative Court, Wiesbaden (‘referring court’) stayed the administrative proceedings and referred two questions to the ECJ for a preliminary ruling.
In Question 1, the referring court seeks to clarify whether the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future (‘credit score’) in itself constitutes a decision based solely on automated processing, including profiling, in the sense of Article 22(1) GDPR.
In the scenario of the referred case, that credit score is transmitted to a bank (a third-party data ‘controller’, in the parlance of the GDPR), which then enters into or refrains from entering into a contractual relationship with the data subject, a result which is strongly influenced by that credit score. According to the referring court, it is ultimately the credit score established by credit rating agencies that actually decides whether and how the third-party controller enters into a contract with the data subject. In the context of consumer loans, an insufficient score will lead to the refusal of a loan in almost every case.
If Question 1 is answered in the negative, the referring court wishes to know whether Articles 6(1) and 22 GDPR must be interpreted as precluding national legislation under which the use of a credit score for the purpose of deciding on the establishment, implementation or termination of a contractual relationship with a data subject is permissible only if certain further conditions are met.
Questions asked by the ECJ prior to the oral hearing
Prior to the oral hearing, and in line with Articles 61 and 62 of the ECJ’s Rules of Procedure, the parties were invited to prepare answers to certain questions asked by the First Chamber of the Court. Among other things, the Chamber wished to know whether:
- there is a lacuna in legal protection for data subjects if Question 1 is answered in the negative;
- that lacuna might be closed by interpreting Article 15(1)(h) GDPR extensively so that SCHUFA would have to inform data subjects even if there is no decision in the sense of Article 22(1) GDPR, and
- that lacuna can be closed by accepting joint controllership (Article 26 GDPR).
These questions seem to suggest that the ECJ might fill gaps of legal protection if it considers that such gaps are indeed present.
Nature of Article 22 GDPR: likely a prohibition
The nature of Article 22(1) GDPR is heavily debated in academia. One crucial question is whether Article 22(1) GDPR contains an in-principle prohibition of automated decision-making or whether it enshrines a right which must be invoked by the affected data subject. On the latter view, Article 22(1) GDPR is a right that the data subject must exercise for it to apply, in a similar manner to the right to object (Article 21 GDPR); it follows that data subjects need to actively enforce Article 22(1) GDPR. On the other view, which is backed by the European Data Protection Board, paragraph 1 of Article 22 establishes an in-principle prohibition that is subject to the exceptions contained in paragraph 2.
In its request for a preliminary ruling, the referring court clearly takes the view that Article 22(1) GDPR establishes a prohibition of automated decision-making (para 27). Based on his comments and questions expressed during the oral hearing, ECJ judge Thomas von Danwitz also seems to interpret Article 22(1) GDPR as an in-principle prohibition. None of the parties opposed this view during the oral hearing, which indicates a sense of agreement in this regard. Based on the insights obtained during the oral hearing, I predict the ECJ will likely interpret Article 22(1) GDPR as an in-principle prohibition and put an end to this academic debate.
The notion of a decision under Article 22(1) GDPR
As regards requirement (1), that a decision has been made, the HBDI argued during the hearing that the calculation of the credit score is not yet in itself a decision within the meaning of Article 22 GDPR. In the HBDI’s view, Article 22(1) GDPR does not apply to preparatory acts such as the calculation of credit scores, even where the score is established automatically (requirement (2)) and arguably has the required effect (requirement (3)). Besides the automatic refusal of an online credit application (recital 71 of the GDPR), the HBDI compared decisions captured by Article 22(1) GDPR with the annulment of a decision in administrative law.
SCHUFA urged the Court also to take the drafting history of Article 22 GDPR into account. The initial proposal for the GDPR did not contain the term ‘decision’ but rather referred to a ‘measure’, a seemingly broader notion. On the basis of the term ‘measure’, Article 22 GDPR would likely apply to SCHUFA, because the credit score may well be seen as a ‘measure’. However, SCHUFA pointed out that the legislator chose the term ‘decision’ in the final version of the GDPR instead of ‘measure’, and concluded that Article 22 GDPR therefore does not apply to it.
During the oral hearing, Judge von Danwitz hinted at a rather broad interpretation of a decision under Article 22(1) GDPR. First, he mentioned that the notion of ‘decision’ does not necessarily have to be interpreted in administrative law terms. He then asked the parties whether it could be argued that the establishment of a credit score is indeed a decision within the meaning of Article 22(1) GDPR, because the scoring takes place and is assigned to a specific individual: namely, that on 26 January 2023 that individual’s credit score is X.
No joint controllership
With regard to the third question posed by the First Chamber of the ECJ, all parties argued that there is no joint controllership in this case. Referring to the Court’s case law, SCHUFA argued that joint controllership essentially requires a joint decision on the purposes and means of processing (see the judgments in Wirtschaftsakademie, Jehovan todistajat and Fashion ID), which does not apply here because nothing is decided jointly by the controllers involved. Rather, SCHUFA argued that the case at issue resembles a chain of different processing activities, in which different responsibilities attach to different controllers depending on their respective processing activities. The Commission seemed to agree, arguing that the case involves at least two separate processing activities (the establishment of the credit score versus the actual decision taken) performed by two different controllers (SCHUFA and the bank). Based on the legal arguments provided, I predict that the ECJ will likely not assume joint controllership in the specific context of this case.
Active versus passive information
Based on his comments and questions during the oral hearing, Judge von Danwitz seems to suggest that recital 71 of the GDPR obliges controllers to actively inform data subjects about automated decision-making. He cited the German version of recital 71, according to which the suitable safeguards mentioned in Article 22(3) GDPR must in any case include specific information for the data subject (‘einschließlich der spezifischen Unterrichtung’), and referred to the French version, which speaks of ‘information spécifique’. In his view, this wording requires something active from the controller, unlike the right of access, under which the data subject must actively seek information. This interpretation could contribute to the lively academic debate on whether the GDPR provides for a right to an explanation of specific automated decision-making.
Questions posed by Advocate General Pikamäe
The questions posed by AG Priit Pikamäe focused on the effects of credit scores for data subjects and on the methods used to establish the credit score. In particular, AG Pikamäe wished to know from both the HBDI and SCHUFA what consequences or effects negative scores established by SCHUFA have for data subjects, including specific figures. The representative of the HBDI stated that he did not possess such figures. SCHUFA also had no general figures, but mentioned that roughly 20% of individuals with a negative credit score still receive loans from banks despite that score. According to SCHUFA, it can be derived from this that banks do not blindly follow the credit scores and also consider other information when deciding whether or not to grant a loan. Of course, one can equally derive from this that 80% of individuals with a negative credit score do not receive a loan, significantly more than those who do.
AG Pikamäe also asked SCHUFA what the foundation of the scoring method is, how the credit score is calculated, and whether the method is internationally recognised. SCHUFA specifically referred to §31 BDSG, which requires that the data used to calculate the probability value be demonstrably essential for calculating the probability of the predicted behaviour on the basis of a scientifically recognised mathematical-statistical procedure.
Does SCHUFA use self-learning algorithms?
Judge von Danwitz asked whether SCHUFA uses self-learning algorithms to establish the credit score, and whether SCHUFA considers itself legally allowed to use such algorithms. SCHUFA responded that it does not currently use self-learning algorithms but may do so in the future, noting that the requirements of §31 BDSG must in any event be complied with. Some doubt may arise as to whether self-learning algorithms could be considered ‘scientifically recognised’ procedures as required by that provision.
The AG’s opinion and main take-aways
During the oral hearing, AG Pikamäe announced that he would deliver his Opinion on 16 March 2023. In the best-case scenario, the ruling could thus be expected somewhere between September and December this year. The two main take-aways from the oral hearing are as follows. First, it seems likely that the ECJ will interpret Article 22(1) GDPR as a prohibition. Second, based on the questions asked by the First Chamber prior to the oral hearing and the comments made by Judge von Danwitz during it, the notion of a decision under Article 22(1) GDPR also seems likely to be interpreted broadly by the ECJ. The Opinion and ruling will create a landmark precedent for the application of Article 22 GDPR, against the backdrop of the spectacular emergence of algorithms and artificial intelligence in our daily lives.