Judgment of the Court of Justice of the European Union

The judgment of the Court of Justice of the European Union in the case concerning automated decision-making in creditworthiness assessment: what lessons does it offer for the use of such solutions?
In December 2023, the Court of Justice of the European Union (CJEU) issued its first judgment on automated decision-making. The case concerns SCHUFA, a German company dealing with creditworthiness assessment (a private credit information agency). Based on data analysis and the use of statistical tools, it calculates a probability value describing how a given person is likely to behave. This result (the score) is then passed on to, for example, banks, which decide on its basis whether or not to grant a loan. The judgment confirms a broad interpretation of what constitutes an automated decision and contains a number of observations on the need to guarantee the rights of data subjects where such solutions are used. Below I present some of the most important conclusions that follow from reading the judgment.

Dr. Joanna Mazur, analyst at DELab UW

The case before the CJEU concerned a situation in which a third party refused to grant a loan to OQ. OQ challenged the decision, arguing that it was based on incorrect personal data and demanding their deletion. In the proceedings that followed, the national court decided to refer a question to the CJEU on how the provisions of the General Data Protection Regulation (GDPR) on automated decision-making should be understood:

Is Article 22(1) [GDPR] to be interpreted as meaning that the mere automated calculation of a probability value relating to the data subject's ability to repay a loan in the future constitutes a decision based solely on automated processing, including profiling, which produces legal effects for that person or significantly affects him in a similar manner, where that value, determined on the basis of that person's personal data, is transferred by the controller to another controller and that value is of decisive importance for that other controller when deciding on the conclusion, performance or termination of a contract with the data subject? (para. 27)

The CJEU's response contains several interesting elements. The first group relates to the assessment of whether we are dealing with automated decision-making; the second, to the broader context of the provisions on this issue in the GDPR. I discuss them below in these two subgroups.
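Before turning to the Court's analysis, it may help to make the notion of a "probability value" more concrete. The sketch below is a purely illustrative toy model in Python: the logistic form, the feature names, the weights and the cut-off are all invented for the example and say nothing about how SCHUFA, or any bank, actually computes or uses a score.

```python
# Purely illustrative: a toy "probability value" (score) of the kind discussed
# in the judgment. Features, weights and cut-off are invented for this sketch
# and do not reflect SCHUFA's (or any bank's) actual model or data.
from math import exp

def credit_score(features: dict[str, float], weights: dict[str, float], bias: float) -> float:
    """Logistic model: maps applicant features to a probability of repayment."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + exp(-z))  # a probability value in (0, 1)

# Hypothetical applicant data and model parameters.
applicant = {"years_employed": 6.0, "open_credit_lines": 2.0, "late_payments": 1.0}
weights = {"years_employed": 0.25, "open_credit_lines": -0.10, "late_payments": -0.80}

score = credit_score(applicant, weights, bias=0.5)
print(f"probability of repayment: {score:.2f}")

# The bank receiving this value might apply its own cut-off (e.g. lend only if
# the score exceeds 0.6), which is why the CJEU treats the score itself as
# playing a "decisive role" in the final decision.
```

Whether the value is produced this simply or by a far more complex statistical procedure, the legal question examined below remains the same: is its calculation already a "decision" within the meaning of Article 22 GDPR?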
Scoring as automated decision-making

The Court analysed the extent to which such a situation falls under the rules on automated decision-making in Article 22 GDPR. For this purpose, it examined three conditions which must be met cumulatively for Article 22 to apply (para. 43). The first, whether we are dealing with a decision at all, was confirmed by the CJEU. The Court pointed to the need for a broad understanding of the word "decision": the concept of "decision" within the meaning of Article 22(1) of the GDPR may therefore [...] cover a range of activities that may affect the data subject in different ways; that concept is sufficiently broad to encompass the result of a calculation of the solvency of a given person in the form of a probability value relating to that person's ability to meet payment obligations in the future (para. 46). The second issue, namely the requirement that such a decision be based solely on automated processing, including profiling, was largely resolved by the CJEU by recognising that the situation at hand constitutes an example of profiling (para. 47).

In addition, the Court referred to the wording of the question referred for a preliminary ruling, treating the way it was formulated as confirmation of the automated nature of the data processing in question. The CJEU thus avoided speculating on the extent to which the final decisions on whether or not to grant a loan are based "solely" on automated processing and focused instead on the fact that the scoring decision itself is automated. The third condition, namely that the decision must produce legal effects or similarly significantly affect the person concerned, was also found by the CJEU to be met: the probability value determined by the credit information bureau and communicated to the bank plays a decisive role when granting a loan; the calculation of this value must in itself be classified as a decision producing "legal effects" for the data subject or "significantly" affecting him "in a similar way" (para. 50).

The CJEU also emphasised that this interpretation requires the provisions of the GDPR to be treated as a whole and the rights of the data subject to be protected also where the scoring is carried out by an entity other than the institution that issues the final decision (see, for example, para. 61). Hence, the very act of scoring should be treated as a "decision" under the GDPR and is subject to the provisions on automated decision-making. This finding is also extremely important in the context of the Polish legal system, in which the Banking Law provides for a right to an explanation of the decision concerning the assessment of the applicant's creditworthiness, but formulates it with reference to the institutions granting credits or loans: Banks and other institutions authorised by law to grant credits shall, at the request of a natural person, legal person or organisational unit without legal personality, provided it has legal capacity, applying for a loan, provide, in writing, an explanation of the assessment of the applicant's creditworthiness made by them (Article 70a of the Banking Law). The CJEU judgment indicates that even if such an assessment were carried out by another company, the data subject should still be able to rely on the GDPR provisions on automated decision-making.

Requirements for automated decision-making in the GDPR

Furthermore, the ruling contains three additional elements worth noting. Firstly, the judgment confirms that Article 22 GDPR should be read as a prohibition in principle on the use of automated decision-making, from which the GDPR provides for three exceptions (para. 52: "This provision lays down a prohibition in principle, the infringement of which does not need to be invoked individually by such a person."). Hence, in order to use automated decision-making, it must be possible to rely on one of the exceptions provided for in the GDPR. Even in those situations, the GDPR requires suitable measures to be guaranteed to safeguard the rights and freedoms and legitimate interests of the persons whose data are processed.
The CJEU also refers to recital 71 of the GDPR, which contains a catalogue of examples of such measures: such measures should include, in particular, the obligation on the controller to use appropriate mathematical or statistical procedures, to implement appropriate technical and organisational measures to minimise the risk of errors and to correct them, and to secure personal data in a way that takes into account the potential risks to the interests and rights of the data subject and, in particular, prevents discriminatory effects against him or her. Those measures also include, at least, the right of the data subject to obtain human intervention on the part of the controller, to express his or her point of view and to contest a decision taken concerning him or her (para. 66). The Court first lists the measures mentioned in the recital and only then states that "those measures also include, at least" the safeguards named expressly in Article 22. Hence, the second interesting suggestion arising from the judgment is that, given the way this paragraph is worded, the measures listed only in the recital of the GDPR, and not in the text of Article 22, should also be implemented by the entity using automated decision-making. This suggestion is even clearer in the English version of the judgment, which states that "In the light of recital 71 of the GDPR, such measures must include [...]".

The third element of the judgment worth emphasising is the importance the CJEU attaches to fulfilling the conditions set out in Articles 5 and 6 of the GDPR, that is, the provisions on the principles of processing and its legal bases, where Member States introduce the possibility of automated decision-making into their national systems. The CJEU emphasises that: where the law of a Member State permits, in accordance with Article 22(2)(b) of the GDPR, the adoption of a decision based solely on automated processing, that processing must not only fulfil the conditions set out in the latter provision and in Article 22(4) of that regulation, but also meet the requirements set out in Articles 5 and 6 of that regulation (para. 68). The CJEU also stressed that a legal basis provided for by the law of a Member State cannot disregard the case-law of the Court on how specific issues are to be understood; for example, it cannot assume a priori that the interests of the entity processing the data outweigh the interests of the data subject (paras 69–70) and, on the basis of such an assumption, provide in advance for the possibility of using automated decision-making.

The CJEU's conclusion in this case comes down quite clearly on the side of a broad interpretation of the rights of data subjects in relation to automated decision-making: not only as regards the interpretation of the conditions determining what is covered by Article 22, but also as regards the systemic significance of the prohibition contained therein.

More information:
Case note on noyb.eu.
Post about the case on verfassungsblog.de.
Documents relating to the case on the CJEU website.