The judgment of the Court of Justice of the European Union on automated decision-making used for credit scoring: lessons on using such solutions

In December 2023, the Court of Justice of the European Union (CJEU) delivered its first judgment on automated decision-making. The case concerns SCHUFA, a German company that deals with creditworthiness assessment (a private credit information agency). Based on data analysis and the use of statistical tools, it calculates the probability of how a given person will behave in the future (e.g., whether he or she will be able to pay off a debt). This result (score) is then transferred, for example, to banks, which – based on such scoring – decide whether or not to grant a loan. The ruling confirms a broad interpretation of what constitutes a decision made in an automated manner and contains several requirements that should be fulfilled to guarantee compliance with the rights of data subjects in situations where this type of solution is used. Below I present some of the most important conclusions that come to mind after reading the judgment.

Dr. Joanna Mazur, DELab UW analyst

The case before the CJEU concerned a situation in which a third party refused to grant a loan to OQ. OQ contested the decision, arguing that it had been made using incorrect personal data, and demanded their removal. During the proceedings that arose from this situation, the national court decided to refer questions to the CJEU regarding how the provisions of the General Data Protection Regulation (GDPR) relating to automated decision-making should be understood:

Is Article 22(1) of the [GDPR] to be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined using personal data of the data subject, is transmitted by the controller to a third-party controller and the latter draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject? (para. 27)

The CJEU's response contains several interesting elements. The first group refers to the assessment of whether we are dealing with automated decision-making, and the second – to the broader context of the provisions on this issue in the GDPR. Below I discuss them in these two subgroups.

Scoring as automated decision-making

The CJEU analyzed the extent to which the situation should be subject to the regulation of automated decision-making under Art. 22 GDPR. For this purpose, it analyzed three conditions that must be jointly met for Art. 22 to be applicable (para. 43). The first one, namely whether we are dealing with a decision, was confirmed by the CJEU. The Court referred to the need for a broad understanding of the word decision: The concept of 'decision' within the meaning of Article 22(1) of the GDPR is thus […] capable of including several acts which may affect the data subject in many ways since that concept is broad enough to encompass the result of calculating a person's creditworthiness in the form of a probability value concerning that person's ability to meet payment commitments in the future (para. 46).

The second requirement is for such a decision to be based solely on automated processing, including profiling. The CJEU resolved it to a large extent by recognizing that the situation at hand constitutes an example of profiling (para. 47). Additionally, the Court referred to the wording of the question referred for a preliminary ruling, treating the way it was formulated as a confirmation of the automated nature of the assessed data processing. Thus, the CJEU avoided considering the extent to which the final decisions on whether or not to grant a loan are based 'solely' on automated processing and focused on the very fact that the scoring decision is automated.

The third condition, which is the need for the decision to produce legal effects or similarly significantly affect a given person, was also found to be met by the CJEU: It follows that, in circumstances such as those at issue in the main proceedings, in which the probability value established by a credit information agency and communicated to a bank plays a determining role in the granting of credit, the establishment of that value must be qualified in itself as a decision producing vis-à-vis a data subject 'legal effects concerning him or her or similarly significantly [affecting] him or her' within the meaning of Article 22(1) of the GDPR (para. 50).

The CJEU also emphasized that the interpretation of these provisions requires treating the relevant regulatory framework as a whole and ensuring that the rights of the data subject are protected in a situation where scoring is not performed by the institution that issues the final decision (para. 61). Hence, the very fact of scoring should be treated as a 'decision' in the light of the provisions of the GDPR and be subject to the provisions on automated decision-making.

This approach is extremely important also in the context of the Polish legal order, in which banking law provides for a right to an explanation of the decision regarding the assessment of the applicant's creditworthiness. Still, it is formulated in a manner referring to institutions granting credit or loans: Banks and other institutions authorized by law to grant loans, at the request of a natural person, legal person, or organizational unit without legal personality, provided it has legal capacity, applying for a loan, provide, in writing, an explanation of their assessment of the applicant's creditworthiness (Art. 70a). The judgment in the SCHUFA case indicates that even if such an assessment were carried out by another company, the data subject should still be able to invoke the provisions of the GDPR regarding automated decision-making.

Requirements for automated decision-making in the GDPR

Moreover, the ruling contains three additional elements that are worth paying attention to. Firstly, the judgment confirms that Art. 22 of the GDPR should be read as a fundamental prohibition of the use of automated decision-making, to which the GDPR provides three exceptions (para. 52: That provision lays down a prohibition in principle, the infringement of which does not need to be invoked individually by such a person.).

Therefore, to use automated decision-making, it must be possible to invoke one of the exceptions foreseen in the GDPR. However, following the GDPR, even in these situations it is necessary to guarantee appropriate measures to protect the rights and freedoms, and legitimate interests of persons whose data are processed. The CJEU also refers to recital 71, which contains a catalog of examples of such measures: In the light of recital 71 of the GDPR, such measures must include, in particular, the obligation for the controller to use appropriate mathematical or statistical procedures, implement technical and organisational measures appropriate to ensure that the risk of errors is minimised and inaccuracies are corrected, and secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and prevent, among other things, discriminatory effects on that person. Those measures include, moreover, at least the right for the data subject to obtain human intervention on the part of the controller, to express his or her point of view, and to challenge the decision taken in his or her regard. (para. 66).

The Court first lists the measures mentioned in the recital (such measures must include, in particular…) and then states that those measures include, moreover, when referring to the measures mentioned in the text of Art. 22. Hence, the second interesting suggestion resulting from the judgment is that – due to the way this paragraph is worded – also the measures that are only listed in the recital of the GDPR, and not in the text of Art. 22, should be implemented by an entity that uses automated decision-making.

The third element of the judgment that seems important is that the CJEU emphasizes the need to meet the conditions set out in Arts. 5 and 6 of the GDPR – the provisions regarding the principles of processing and the legal bases of processing – in a situation where Member States introduce in their national laws the possibility of automated decision-making (an exception to the general prohibition, provided for by the law of a Member State). The CJEU emphasizes that: Thus, if the law of a Member State authorises, under Article 22(2)(b) of the GDPR, the adoption of a decision solely based on automated processing, that processing must comply not only with the conditions set out in the latter provision and in Article 22(4) of that regulation, but also with the requirements set out in Articles 5 and 6 of that regulation (para. 68). The CJEU also emphasized that the legal basis provided for by the law of a Member State cannot ignore the case law of the Court on how given issues should be understood: for example, it cannot assume in advance that the interests of the entity processing the data outweigh the interests of the data subject (paras. 69–70) and, based on such an assumption, provide in advance for the possibility of using automated decision-making.

Conclusion

In the SCHUFA case, the CJEU quite clearly sided with a broad interpretation of the rights of data subjects related to automated decision-making: not only in terms of the interpretation of the requirements regarding what kinds of decisions are covered by Art. 22 of the GDPR, but also concerning the systemic importance of the prohibition foreseen in this provision.

More information:

Post about the case on noyb.eu.
Post about the case on verfassungsblog.de.
Case on the CJEU's website.
The translation into English was performed using automated translation and revised by the author.
