The judgment of the Court of Justice of the European Union in the case concerning automated decision-making in creditworthiness assessment: what lessons does it bring for the use of such solutions?

In December 2023, the Court of Justice of the European Union (CJEU) issued its first judgment on automated decision-making. The case concerns SCHUFA, a German private credit information agency that assesses creditworthiness. Based on data analysis and statistical tools, it calculates a probability value describing how a given person is likely to behave, for example whether they will repay a loan. This result (the score) is then transferred to banks, among others, which rely on it when deciding whether to grant a loan. The ruling confirms a broad interpretation of what constitutes an automated decision and contains a number of observations on the need to guarantee the rights of data subjects in situations where such solutions are used. Below I present several of the most important conclusions that emerge from the ruling.

Dr. Joanna Mazur, DELab UW analyst

The case before the CJEU concerned a situation in which a third party refused to grant a loan to OQ. OQ challenged the decision, arguing that it was based on incorrect personal data and requesting their deletion. In the proceedings that arose from this situation, the national court decided to refer questions to the CJEU on how the provisions of the General Data Protection Regulation (GDPR) concerning automated decision-making should be understood:

Should Article 22(1) of the [GDPR] be interpreted as meaning that the mere automated calculation of a probability value relating to the data subject’s ability to repay a loan in the future constitutes a decision based solely on automated processing, including profiling, which produces legal effects for that person or significantly affects them in a similar manner, where that value, determined on the basis of that person’s personal data, is transferred by the data controller to another controller and that value is of decisive importance for that other controller when deciding to enter into, perform or terminate a contract with the data subject? (para. 27)

The CJEU’s response contains several interesting elements. The first group relates to the assessment of whether we are dealing with automated decision-making at all; the second, to the broader context of the GDPR provisions on this issue. Below I discuss them in these two groups.

Scoring as automated decision-making

The Court analysed the extent to which the situation falls under the rules on automated decision-making in Article 22 of the GDPR. For this purpose, it examined three conditions that must be met cumulatively for Article 22 to apply (para. 43). The first, whether we are dealing with a decision at all, was confirmed by the CJEU. The Court stressed the need for a broad understanding of the word "decision": the concept of "decision" within the meaning of Article 22(1) of the GDPR may therefore […] cover a range of activities that may affect the data subject in different ways; the concept is sufficiently broad to encompass the result of a calculation of the solvency of the person concerned in the form of a probability value concerning the ability of that person to meet payment obligations in the future (para. 46).

The second, i.e. the necessity for such a decision to be based solely on automated processing, including profiling, was resolved by the CJEU to a large extent by recognising that the situation in question constitutes an example of profiling (para. 47). Additionally, the Court referred to the wording of the preliminary question, treating the way it was formulated as confirming the automated nature of the assessed data processing. In this way, the CJEU avoided speculation on the extent to which final decisions on granting or not granting credit are based “solely” on automated processing, and focused on the very fact that the decision on scoring is automated.

The third condition, i.e. the need for the decision to produce legal effects or to have a similarly significant impact on a given person, was also found to be met by the CJEU: since the probability value determined by the credit information bureau and communicated to the bank plays a decisive role in granting a loan, the calculation of this value must in itself be classified as a decision producing "legal effects" for the data subject or "significantly" affecting him "in a similar manner" (para. 50).

The CJEU also emphasised that the interpretation made requires that the provisions of the GDPR be treated as a whole and that the rights of the data subject be protected also in a situation where scoring is not carried out by the institution that issues the final decision (see para. 61). Hence, the very fact of scoring should be treated as a “decision” in the light of the provisions of the GDPR, and subject to the provisions on automated decision-making.

This statement is also extremely important in the context of the Polish legal system, in which banking law provides for the right to explain the decision regarding the assessment of the applicant's creditworthiness, but it is formulated in a way that refers to institutions granting credit or loans: Banks and other institutions authorised by law to grant loans, at the request of a natural person, legal person or an organizational unit without legal personality, provided that it has legal capacity, applying for a loan, shall provide, in writing, an explanation of their assessment of the applicant's creditworthiness (Art. 70a). The CJEU judgment indicates that even if such an assessment were carried out by another company, the data subject should still be able to rely on the provisions of the GDPR regarding automated decision-making.

Requirements for automated decision-making in the GDPR

Beyond this, the judgment contains three further elements worth noting. First, it confirms that Article 22 of the GDPR should be read as a fundamental prohibition on the use of automated decision-making, from which the GDPR provides three exceptions (para. 52: This provision establishes a fundamental prohibition, the violation of which does not require individual invocation by such a person.).

Hence, in order to use automated decision-making, it must be possible to rely on one of the exceptions provided for in the GDPR. Even in those situations, the GDPR requires appropriate measures to safeguard the rights, freedoms and legitimate interests of data subjects. The CJEU refers here to recital 71, which contains an illustrative list of such measures: Such measures should include, in particular, the obligation on the controller to use appropriate mathematical or statistical procedures, to implement appropriate technical and organisational measures to minimise the risk of errors and to correct them, to secure personal data in a way that takes into account the potential risk to the interests and rights of the data subject and, in particular, to prevent discriminatory effects against him. In addition, such measures should include, at least, the right of the data subject to obtain human intervention on the part of the controller, to express his point of view and to contest a decision taken concerning him (para. 66).

The Court first lists the measures mentioned in the recital and only then states that "those measures also include at least" the right to human intervention, to express one's point of view and to contest the decision. Hence the second interesting suggestion resulting from the judgment: because of the way this paragraph is worded, the measures listed only in the recital of the GDPR, and not in the text of Article 22 itself, should also be implemented by the entity using automated decision-making. This suggestion is even clearer in the English version of the judgment, which states that In the light of recital 71 of the GDPR, such measures must include […].

The third element of the judgment worth highlighting is the importance the CJEU attaches to meeting the conditions set out in Articles 5 and 6 of the GDPR, i.e. the provisions on the principles of processing and its legal basis, where Member States introduce the possibility of automated decision-making in their national laws. The CJEU emphasises that: Where the law of a Member State permits, pursuant to point (b) of Article 22(2) of the GDPR, a decision based solely on automated processing, that processing must not only fulfil the conditions set out in that latter provision and in Article 22(4) of that Regulation, but also meet the requirements set out in Articles 5 and 6 of that Regulation (para. 68). The CJEU also stressed that a legal basis provided for by the law of a Member State cannot ignore the Court's case-law on how specific issues should be understood: for example, it cannot assume a priori that the interests of the entity processing the data outweigh those of the data subject (paras. 69-70) and, on that assumption, provide in advance for the possibility of using automated decision-making.

Conclusion

In the case, the CJEU quite clearly sided with a broad interpretation of the rights of data subjects related to automated decision-making: not only in terms of interpreting the content of the conditions concerning what is covered by the content of Article 22, but also in relation to the systemic significance of the prohibition set out therein.

For more information:

Note about the case on the noyb.eu website.
Post about the case on the verfassungsblog.de website.
Documents related to the case on the CJEU website.
The book The algorithm as public information in European law, Chapter II of which concerns the GDPR provisions on automated decision-making.
