VG Wiesbaden - 6 K 788/20.WI

From GDPRhub
Revision as of 16:00, 5 December 2021 by Arapcewicz
Court: VG Wiesbaden (Germany)
Jurisdiction: Germany
Relevant Law: Article 4(4) GDPR
Article 6 GDPR
Article 15(1)(h) GDPR
Article 21(1) GDPR
Article 22 GDPR
§ 31 BDSG
Decided: 01.10.2021
Published:
Parties:
National Case Number/Name: 6 K 788/20.WI
European Case Law Identifier: ECLI:DE:VGWIESB:2021:1001.6K788.20.WI.00
Appeal from:
Appeal to:
Original Language(s): German
Original Source: Bürgerservice Hessenrecht (in German)
Initial Contributor: Agnieszka Rapcewicz

The Administrative Court Wiesbaden referred questions to the ECJ for a preliminary ruling, asking whether Article 22(1) GDPR is to be interpreted as meaning that the automated creation of a score value by a credit agency already constitutes a decision based solely on automated processing which produces legal effects vis-à-vis the data subject or similarly significantly affects the data subject, where that score is communicated by the controller to a third controller and that third controller uses the value as the decisive basis for its decision on the establishment, performance or termination of a contractual relationship with the data subject. Additionally, the court has doubts as to the compatibility of Section 31 BDSG with Article 22 GDPR, because the German legislator only regulates the "use" of the "probability value", but not the creation of the probability value itself.

English Summary

Facts

The controller is a private German credit agency which provides its contractual partners with information on the creditworthiness of third parties, in particular consumers. To that end, the controller compiles so-called score values: on the basis of mathematical-statistical procedures, the probability of a person's future behaviour, such as the repayment of a loan, is predicted from certain characteristics of that person; neither the individual characteristics on which the score values are based nor the mathematical-statistical procedure itself is disclosed. The creation of score values thus rests on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behaviour can be predicted. If the person has a certain profile, the score determined is attributed to him or her by the controller and taken into account in the decision-making process of the party that ultimately contracts with the data subject, for example a credit institution when granting a loan, with corresponding consequences.

The data subject was refused credit by a third party after negative information was provided by the controller. Subsequently, the data subject demanded information on the stored data from the controller, as well as the deletion of entries it considered incorrect. The controller provided the data subject with information about his credit score and explained in general terms how its score calculation worked, but not which individual pieces of information were included in the calculation or how they were weighted. The controller claimed that it was not obliged to disclose the calculation methods, as they were covered by industrial and commercial secrecy. The controller also pointed out that it only provided information to its contractual partners, who made the actual contractual decisions; the controller itself did not make a recommendation for or against concluding a contract.

The data subject filed a complaint against the information with the DPA, requesting that the DPA order the controller to comply with the data subject's request for information and deletion. The DPA refused to take further action against the controller, reasoning that the calculation of the creditworthiness value by the controller had to comply with the detailed requirements set out in Section 31 of the Federal Data Protection Act (BDSG), which contains detailed rules on scoring and creditworthiness information, and that, in the DPA's opinion, those requirements were generally fulfilled by the controller.

The data subject appealed to the court.

Holding

The Administrative Court Wiesbaden suspended the proceedings and referred the case to the Court of Justice of the European Union for a preliminary ruling on the following questions:

(1) Is Article 22(1) GDPR to be interpreted as meaning that the automated production of a probability figure relating to the ability of a data subject to service a loan in the future already constitutes a decision which produces legal effects vis-à-vis the data subject or similarly significantly affects the data subject based solely on automated processing, including profiling, where that figure, produced by means of personal data relating to the data subject, is communicated by the controller to a third controller and that third controller bases its decision on the establishment, performance or termination of a contractual relationship with the data subject on that figure?

(2) If the answer to the first question referred is in the negative, are Art. 6(1) GDPR and Art. 22 GDPR to be interpreted as precluding national legislation under which the use of a probability value - in the present case, the probability of a natural person's solvency and willingness to pay when information on claims is included - about a particular future conduct of a natural person for the purpose of deciding on the establishment, performance or termination of a contractual relationship with that person (scoring) is permissible only if certain further conditions, which are set out in more detail in the grounds for reference, are satisfied?

In the court's opinion, the creation of score values fulfils the definitional characteristics set out in Article 22(1) GDPR. The creation of a score value by a credit agency is not merely profiling that prepares the decision of the third-party controller, but rather an independent "decision" within the meaning of Article 22(1) GDPR. The court noted that it is aware that Article 22(1) GDPR can be read restrictively, and is indeed widely read in such a way that it does not directly apply to the activities of credit agencies such as the respondent. In the court's view, however, such an assumption is based on an erroneous understanding of the activities of credit agencies and of the influence of the score values they produce. The court has considerable doubts about such a restrictive interpretation of Article 22(1) GDPR because of the significance of the score compiled by credit agencies for the decision-making practice of third-party controllers. In fact, the score value generated by the credit agency on the basis of automated processing ultimately determines whether and how the third-party controller enters into a contract with the data subject. Although the third party does not have to make its decision solely dependent on the score value, it usually does so to a significant extent.

In Section 31 of the BDSG, the German legislature essentially makes provisions on scoring as a subset of profiling. The court has considerable doubts as to the compatibility of these regulations with Article 22 of the GDPR because the German legislature only regulates the "use" of the "probability value", but not the creation of the probability value itself. The court pointed out that Section 31 of the BDSG is conclusive in that it only regulates profiling to the extent that it forms the basis of a decision based on it. Accordingly, the reference point of the prohibition is only the decision, not the profiling that precedes it. Neither Article 22 of the GDPR nor other provisions of the GDPR formulate specific substantive requirements for the lawfulness of data processing for the purpose of profiling in the form of scoring itself.

The court held that the outcome of the proceedings depends on question 1. If Article 22(1) GDPR is to be interpreted as meaning that the creation of a score by a credit agency is an independent decision within the meaning of Article 22(1) GDPR, then this core activity would be subject to the prohibition of automated individual decision-making. Consequently, a legal basis in the Member States within the meaning of Article 22(2)(b) GDPR would be required, for which only Section 31 BDSG comes into consideration. However, there are serious doubts as to its compatibility with Article 22(1) GDPR. The controller would then not only act without a legal basis, but would ipso iure violate the prohibition of Article 22(1) GDPR. As a result, the data subject would at the same time have a claim against the DPA for renewed consideration of his case.

If the answer to question 1 is no, i.e. profiling itself is not a decision within the meaning of Article 22(1) and (2) GDPR, then the opening clause of Article 22(2)(b) GDPR does not apply to national rules on profiling. Given the fundamentally conclusive character of the GDPR, which is designed for full harmonisation, a different regulatory power for national rules must then be sought. Since such a power is not apparent and, in particular, does not follow from the rudimentary provisions of the GDPR, the national rule in Section 31 BDSG would be inapplicable. This changes the scope of review of the national supervisory authority, which would then have to measure the compatibility of the activities of credit agencies against Article 6 GDPR.

Comment

Share your comments here!

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the German original. Please refer to the German original for more details.

Translated with DeepL:

On the issue of a decision based on automated processing
- including profiling -

Editorial
1. The question referred to the ECJ for a decision is whether Article 22(1) of the GDPR is to be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based solely on automated processing - including profiling - which produces legal effects vis-à-vis the data subject or similarly significantly affects the data subject, if that probability value, established by means of personal data relating to the data subject, is communicated by the controller to a third controller and that third controller uses that value as the decisive basis for its decision on the establishment, performance or termination of a contractual relationship with the data subject.

2. There are considerable doubts as to the compatibility of Section 31 of the BDSG with Article 22 of the GDPR, because the German legislator only regulates the "use" of the "probability value", but not the creation of the probability value itself.

Tenor
I. The proceedings are suspended.

II. The proceedings are referred to the Court of Justice of the European Union for a preliminary ruling pursuant to Article 267 TFEU on the following questions:

(1) Is Article 22(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of individuals with regard to the processing of personal data, on the free movement of such data and repealing Directive 95/46/EC (the GDPR; OJ EU L No 119, 4.5.2016, p. 1) to be interpreted as meaning that the automated production of a probability figure relating to the ability of a data subject to service a loan in the future already constitutes a decision which produces legal effects vis-à-vis the data subject or similarly significantly affects the data subject based solely on automated processing, including profiling, where that figure, produced by means of personal data relating to the data subject, is communicated by the controller to a third controller and that third controller bases its decision on the establishment, performance or termination of a contractual relationship with the data subject on that figure?

(2) If the answer to the first question referred is in the negative, are Art. 6(1) and Art. 22 of Regulation (EU) 2016/679 - the GDPR - to be interpreted as precluding national legislation under which the use of a probability value - in the present case, the probability of a natural person's solvency and willingness to pay when information on claims is included - about a particular future conduct of a natural person for the purpose of deciding on the establishment, performance or termination of a contractual relationship with that person (scoring) is permissible only if certain further conditions, which are set out in more detail in the grounds for reference, are satisfied?

Reasons
I.

The subject-matter of the proceedings is an action concerning the score value formed by the defendant, SCHUFA Holding AG, in respect of the plaintiff. The defendant is a private German credit agency which provides its contractual partners with information on the creditworthiness of third parties, in particular consumers. To that end, the defendant compiles so-called score values: on the basis of mathematical-statistical procedures, the probability of a person's future behaviour, such as the repayment of a loan, is predicted from certain characteristics of that person; neither the individual characteristics on which the score values are based nor the mathematical-statistical procedure itself is disclosed. The creation of score values thus rests on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behaviour can be predicted. If the person has a certain profile, the score determined is attributed to him or her by the defendant and taken into account in the decision-making process of the party that ultimately contracts with the person concerned, for example a credit institution when granting a loan, with corresponding consequences.

The plaintiff was refused credit by a third party after negative information was provided by the defendant. Subsequently, the plaintiff demanded information on stored data from the defendant, in addition to the deletion of what it considered to be incorrect entries. On 10 July 2018, the defendant provided the plaintiff with information stating that the plaintiff had a score of 85.96% with the defendant. In letters dated 8 August 2018 and 23 August 2018, the defendant also informed the plaintiff in general terms of how its score calculation worked, but not which individual pieces of information were included in the calculation and how they were weighted. In its view, the defendant was not obliged to disclose the calculation methods, as they were covered by industrial and commercial secrecy. The defendant also pointed out to the plaintiff that it only provided information to its contractual partners, but that the latter made the actual contractual decisions; the defendant did not make a recommendation to its contractual partners for or against concluding a contract. On 18 October 2018, the plaintiff filed a complaint against the information with the competent supervisory authority, requesting that it order the defendant to comply with the plaintiff's request for information and deletion. In the plaintiff's view, the defendant was obliged to provide information about the logic involved and the scope and effects of the processing.

By notice to the plaintiff of 3 June 2020, the supervisory authority refused to take further action against the defendant. In justification, it stated, inter alia, that the calculation of the creditworthiness value by the defendant had to comply with the detailed requirements set out in Section 31 of the Federal Data Protection Act (Art. 1 of the Act on the Adaptation of Data Protection Law to Regulation (EU) 2016/679 and on the Implementation of Directive (EU) 2016/680 (Data Protection Adaptation and Implementation Act EU - DSAnpUG-EU) of 30 June 2017, Federal Law Gazette I p. 2097 = BDSG). However, those requirements were generally fulfilled by the defendant, and in the present case there was nothing to indicate otherwise.
On 25 May 2018, Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of individuals with regard to the processing of personal data, on the free movement of such data and repealing Directive 95/46/EC (OJ EU L No 119, 4.5.2016, p. 1 = GDPR) became applicable. The General Data Protection Regulation provides for a general prohibition of data processing subject to a reservation of permission; the essential elements of processing permission can be found in Art. 6 GDPR. In addition, the GDPR contains a multi-instrumental protection concept, which includes provisions in particular on data subjects' rights to information, access, deletion and complaints to the competent supervisory authority for intervention, as well as legal action against official decisions before national courts. Among other things, the GDPR also specifically addresses so-called profiling, which is legally defined in Art. 4(4) GDPR and to which the respondent's activity at issue, so-called "scoring", belongs. Profiling is regulated in various provisions, including in connection with the data subject's right to information under Article 15(1)(h) GDPR, the data subject's right to object under Article 21(1), first and second sentences, GDPR and - in essence - in Article 22 GDPR as a general prohibition (Article 22(1) GDPR) with exceptions (Article 22(2) GDPR) where decisions are based exclusively on profiling.

As a regulation under Union law within the meaning of Article 288(2) TFEU, the GDPR has general application, is binding in its entirety and directly applicable in every Member State. Despite these principles, the General Data Protection Regulation contains various so-called opening clauses, which give the Member States a certain amount of leeway for national regulations. In view of these residual norm-setting powers, the new Federal Data Protection Act entered into force on 25 May 2018. Section 31 BDSG contains detailed regulations on scoring and creditworthiness information.

II.

1. The Charter of Fundamental Rights of the European Union - GrCh - (OJ 2016 No. C 202 of 7 June 2016, p. 389) provides:
Art. 7 GrCh - Respect for private and family life.
Everyone has the right to respect for his or her private and family life, home and communications.

Art. 8 GrCh - Protection of personal data
(1) Everyone has the right to the protection of personal data concerning him or her.
(2) Such data may be processed only fairly and lawfully for specified purposes and with the consent of the data subject or on any other legitimate basis laid down by law. Any person shall have the right of access to data collected concerning him or her and the right to have such data rectified.
(3) Compliance with these provisions shall be monitored by an independent body.

Art. 52 GrCh - Scope and interpretation of rights and principles
(1) Any limitation on the exercise of the rights and freedoms recognised in this Charter must be provided for by law and respect the essence of those rights and freedoms. In accordance with the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.
[...]

2. The Treaty on the Functioning of the European Union - TFEU - (as amended on 7 June 2016, OJ No. C 202 p. 1, 47) provides:
Art. 288 TFEU
[...]
(2) This Regulation shall have general application. It shall be binding in its entirety and directly applicable in all Member States.

3. The General Data Protection Regulation (Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of individuals with regard to the processing of personal data, on the free movement of such data and repealing Directive 95/46/EC; OJ EU L No 119, 4.5.2016, p. 1 = GDPR) regulates:
Art. 4 GDPR - Definitions.
For the purposes of this Regulation, the term:
[...]
(4) 'profiling' means any automated processing of personal data which consists in using such personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects relating to that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or change of location;
Art. 6 GDPR - Lawfulness of processing
(1) Processing shall be lawful only if at least one of the following conditions is met:
(a) the data subject has given consent to the processing of personal data relating to him or her for one or more specific purposes;
(b) processing is necessary for the performance of a contract to which the data subject is party or for the implementation of pre-contractual measures taken at the data subject's request;
(c) processing is necessary for compliance with a legal obligation to which the controller is subject;
(d) processing is necessary in order to protect the vital interests of the data subject or of another natural person;
(e) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
(f) processing is necessary for the purposes of the legitimate interests of the controller or of a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data, in particular where the data subject is a child.
Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks.
(2) Member States may maintain or introduce more specific provisions to adapt the application of the rules of this Regulation in relation to processing to comply with points (c) and (e) of paragraph 1 by specifying more precisely specific requirements for processing as well as other measures to ensure lawful and fair processing, including for other specific processing situations referred to in Chapter IX.
(3) The legal basis for the processing operations referred to in points (c) and (e) of paragraph 1 shall be determined by
(a) Union law; or
(b) the law of the Member States to which the controller is subject.
The purpose of the processing must be specified in that legal basis or, as regards the processing referred to in point (e) of paragraph 1, be necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. That legal basis may contain specific provisions adapting the application of the provisions of this Regulation, including provisions on the general conditions governing the lawfulness of processing by the controller, the types of data processed, the individuals concerned, the entities to which and the purposes for which the personal data may be disclosed, the purpose limitation, the storage period and the processing operations and procedures that may be applied, including measures to ensure lawful and fair processing, such as those for other specific processing situations in accordance with Chapter IX. Union or Member State law shall pursue an objective in the public interest and be proportionate to the legitimate aim pursued.
(4) Where processing for a purpose other than that for which the personal data were collected is not based on the consent of the data subject or on Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1), the controller shall, in order to determine whether the processing for another purpose is compatible with that for which the personal data were originally collected, take into account, inter alia:
(a) any link between the purposes for which the personal data were collected and the purposes of the intended further processing,
(b) the context in which the personal data were collected, in particular as regards the relationship between the data subjects and the controller,
(c) the nature of the personal data, in particular whether special categories of personal data are processed pursuant to Article 9 or whether personal data relating to criminal convictions and offences are processed pursuant to Article 10,
(d) the possible consequences of the intended further processing for the data subjects,
(e) the existence of appropriate safeguards, which may include encryption or pseudonymisation.
Art. 15 GDPR - Data subject's right to information
(1) The data subject shall have the right to obtain from the controller confirmation as to whether personal data relating to him or her are being processed and, if so, the right to obtain access to those personal data and the following information:
[...]
(h) the existence of automated decision-making, including profiling, pursuant to Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved and the scope and intended effects of such processing for the data subject.

Art. 21 GDPR - Right to object
(1) The data subject shall have the right to object at any time, on grounds relating to his or her particular situation, to the processing of personal data concerning him or her carried out on the basis of Article 6(1)(e) or (f), including any profiling based on those provisions.
[...]

Art. 22 GDPR - Automated individual decision-making, including profiling
(1) The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
(2) Paragraph 1 shall not apply if the decision
(a) is necessary for the conclusion or performance of a contract between the data subject and the controller,
(b) is permitted by Union or Member State law to which the controller is subject and that law contains appropriate measures to safeguard the rights and freedoms and legitimate interests of the data subject, or
(c) is made with the explicit consent of the data subject.
(3) In the cases referred to in points (a) and (c) of paragraph 2, the controller shall take reasonable steps to safeguard the rights and freedoms as well as the legitimate interests of the data subject, which include, at least, the right to obtain the intervention of a person on the part of the controller, to express his or her point of view and to contest the decision.
(4) Decisions under paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless Article 9(2)(a) or (g) applies and appropriate measures have been taken to protect the rights and freedoms and legitimate interests of the data subject.

4. The Federal Data Protection Act of 30 June 2017 (BGBl. I p. 2097, amended by Article 12 of the Act of 20 November 2019, BGBl. I p. 1626) provides:
Section 31 BDSG - Protection of economic transactions in the case of scoring and creditworthiness information
(1) The use of a probability value about a certain future behaviour of a natural person for the purpose of deciding on the establishment, implementation or termination of a contractual relationship with that person (scoring) is only permissible if
1. the provisions of data protection law have been complied with,
2. the data used to calculate the probability value are demonstrably relevant to the calculation of the probability of the specific conduct on the basis of a scientifically recognised mathematical-statistical procedure,
3. address data were not exclusively used for the calculation of the probability value, and
4. in the case of the use of address data, the data subject has been informed of the intended use of such data prior to the calculation of the probability value; the information shall be documented.
(2) The use of a probability value determined by credit agencies on the solvency and willingness to pay of a natural person shall only be permissible in the case of the inclusion of information on claims insofar as the prerequisites in accordance with paragraph 1 are met and only such claims on a service owed which has not been rendered despite being due are taken into account,
1. which have been established by a final judgment or a judgment declared provisionally enforceable or for which a debt instrument exists in accordance with section 794 of the Code of Civil Procedure,
2. which have been determined in accordance with section 178 of the Insolvency Code and have not been contested by the debtor at the verification meeting,
3. which the debtor has expressly acknowledged,
4. where
a) the debtor has been reminded in writing at least twice after the claim became due,
b) the first reminder was at least four weeks ago,
c) the debtor has been informed beforehand, but at the earliest at the time of the first reminder, about a possible consideration by a credit agency, and
d) the debtor has not disputed the claim or
5. whose underlying contractual relationship can be terminated without notice due to payment arrears and for which the debtor has been informed in advance about a possible consideration by a credit agency.
The permissibility of processing, including the determination of probability values, of other data relevant to creditworthiness under general data protection law remains unaffected.
III.

In the present case, it must be decided whether the activity of credit agencies such as the respondent, namely compiling score values about data subjects and transmitting them without further recommendation or comment to third parties, who then contract with the data subject or refrain from doing so with significant reliance on that score value, falls within the scope of application of Article 22(1) of the GDPR. If it does, the permissibility of the creation of a final score value for transmission by a credit agency such as the respondent can only be based on Article 22(2)(b) of the GDPR in conjunction with Section 31 of the BDSG; those provisions then also serve, where the data subject lodges a complaint with the competent supervisory authority, as in the proceedings at issue, as the standard for the supervisory review of the credit agency's activity. This in turn depends on whether a provision with the content of Section 31 BDSG is compatible with Article 22(2)(b) of the GDPR. If it is not, the legal standard of review applied in this case with regard to the respondent is lacking.

On question 1:
Applicability of Article 22(1) of the GDPR to credit agencies
According to Article 22(1) of the GDPR, a data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. The provision is modelled on its predecessor, Article 15 of Directive 95/46/EC. According to its wording, it appears to be a right of the data subject which must be exercised. In contrast, the referring court is convinced that the provision establishes a prohibition in principle, the violation of which does not require an individual claim.

Activities such as the automated compilation of personal data - carried out by the defendant - to determine a probability of a certain future behaviour of a natural person for the purpose of transferring it to third parties for their decision on the establishment, performance or termination of a contractual relationship with that data subject fall, at any rate in substance, under the regulatory regime of Article 22(1) of the GDPR. According to its clear wording, the provision covers not only, but also, decisions made on the basis of profiling, cf. also recital 71, sentence 2. The latter is legally defined in Article 4(4) of the GDPR as any automated processing of personal data which consists in using such personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects of that person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or change of location.
The creation of score values fulfils these definitional characteristics. This is also supported by recital 71 sentence 2, according to which profiling is to be understood as, among other things, the analysis or prediction of aspects concerning the economic situation, reliability or behaviour of a person. Recital 71 sentence 1 also mentions the automatic rejection of an online credit application as an example of a decision within the meaning of Article 22(1) of the GDPR. In this respect, Article 22(1) of the GDPR is in principle applicable to cases such as the present one, at least insofar as, according to the intention of the legislator, the creation of a score value is a subset of profiling within the meaning of Article 4(4) of the GDPR.
In essence, the referring court considers it obvious that in cases such as the one at issue the constituent element of a decision based exclusively on automated processing, as required by Article 22(1) of the GDPR, is also fulfilled. This is not contradicted by the fact that, according to the foregoing, the main activity of credit agencies - such as the respondent - namely the determination of score values, is a sub-category of profiling. It is true that the legislator obviously did not intend to regulate the admissibility of profiling under data protection law in Article 22(1) of the GDPR as such, but only to address profiling to the extent that it forms part of a decision based on automated processing. This is already apparent from the wording of the provision, which, for its prohibition, focuses primarily on the decision based on profiling - or other automated data processing - but not on the profiling itself.
However, the court assumes that the creation of a score value by a credit agency is not merely profiling that prepares the decision of the third party controller, but rather an independent "decision" within the meaning of Article 22(1) of the GDPR.
In view of the wording of Article 22(1) of the GDPR, the court is aware that the provision can be understood - and is widely understood - in a restrictive interpretation under which it does not directly apply to the activities of credit agencies such as the respondent. In the court's view, however, such a reading rests on an erroneous understanding of the activities of credit agencies and of the influence of the score values they produce. It is based on the idea that credit agencies do not themselves take the decision relevant for Article 22(1) of the GDPR, because they merely prepare the final decision of the data controller by determining and compiling personal data for the purpose of profiling and thereby establishing a final score, and because, when transmitting the score, they typically do not also make a recommendation to the third-party controller for or against a contractual agreement with the data subject.
In its provisions and recitals, the GDPR makes a conceptual distinction between processing on the one hand and decisions based on processing on the other, and does not intend to make any independent material provisions on profiling.
Article 4(4) of the GDPR states that profiling within the meaning of the GDPR is "any automated processing of personal data intended to evaluate certain personal aspects relating to a natural person". The wording of the legal definition can therefore be understood to mean that profiling comprises not only the determination of the parameters for the assessment result but also the assessment result itself. With regard to the case at hand, this could also include the automated compilation of the individual characteristics with the aim of deriving an overall score value by a credit agency, as well as its actual determination. Article 21(1) sentence 1 of the GDPR could also be interpreted in the direction of such an understanding of the term, according to which the right of objection of the data subject refers to any processing and, according to sentence 2, in particular also to profiling based on the provisions named there. Ultimately, however, the differentiation between automated processing by profiling on the one hand and decision-making on the other emerges primarily from Article 22(1) of the GDPR. By stating that a data subject has the right "not to be subject to a decision based solely on automated processing, including profiling", Article 22(1) of the GDPR explicitly establishes a causal link and a chronologically compelling sequence between automated processing (including profiling) and the decision based on it. The intention of the legislator to distinguish between the two concepts is further supported by recital 71 sentences 1 and 2. While recital 71 sentence 1 explains that the data subject should have the right not to be subject to a decision evaluating personal aspects relating to him or her which is based solely on automated processing, recital 71 sentence 2 adds that "such processing" - hence not "decisions" - also includes profiling. As an example of a "decision", recital 71 sentence 1 rather mentions the automatic rejection of a credit application, thus broadly addressing the case constellation here insofar as the rejection decision of the credit institution vis-à-vis the applicant is the relevant "decision", but not the creation of the score value by the respondent. Ultimately, the wording of Article 21(1) sentence 1, Article 22(1) and Article 4(4) of the GDPR as well as recitals 71 sentences 1 and 2 and 72 can be understood to mean that case constellations such as the one underlying the main proceedings, in which a credit agency determines a score, constitute "processing" but not a "decision" within the meaning of Article 22(1) of the GDPR.
However, the referring court has considerable doubts about such a restrictive interpretation of Article 22(1) of the GDPR. It sees strong indications that the automated creation of a score by credit agencies for the prognostic assessment of the economic capacity of a data subject is itself an independent decision based on automated processing within the meaning of Article 22(1) of the GDPR. The referring court bases its doubts, in factual terms, on the significance of the score compiled by credit agencies for the decision-making practice of third-party controllers and, in legal terms, on the purposes pursued by Article 22(1) of the GDPR and the legal protection guaranteed by Articles 77 et seq. of the GDPR:
From a factual point of view, the court has serious reservations about the assumption that third-party controllers take the individual decision required by Article 22(1) of the GDPR - one not based exclusively on automated processing - once a score value is available for a data subject. Although, at least hypothetically, third-party controllers can make their own decision on whether and how to enter into a contractual relationship with the data subject, because at this stage of the decision-making process a human-controlled individual decision is still possible in principle, that decision is in practice determined to such a significant extent by the score value transmitted by credit agencies that it effectively carries through to the decision of the third-party controller. In other words: in fact, the score value generated by the credit agency on the basis of automated processing ultimately determines whether and how the third-party controller enters into a contract with the data subject. Although the third party does not have to make its decision dependent solely on the score value, it usually does so to a significant extent. A loan may be refused despite a basically sufficient score (for other reasons, such as a lack of collateral or doubts about the success of an investment to be financed), but an insufficient score will lead to the refusal of a loan in almost every case, at least in the area of consumer loans, even where an investment otherwise appears worthwhile. That score values play the decisive role in the granting of loans and the design of their conditions is shown by the experience of the data protection supervisory authorities (see LfDI BW, New brochure: Scoring - solid prognosis or lousy number?, https://www.baden-wuerttemberg.datenschutz.de/neue-broschuere-scoring-solide-prognose-oder-miese-nummer/ (as of 30.09.2021)).
However, Article 22(1) of the GDPR - subject to the exceptions in Article 22(2) of the GDPR - is intended to protect the data subject from the dangers of this form of purely automation-based decision-making. The legislator's concern is to prevent decisions from being taken without individual assessment and evaluation by a human being. The data subject should not be at the mercy of an exclusively technical and opaque process without being able to comprehend the underlying assumptions and assessment standards and, if necessary, intervene by exercising his or her rights. Thus, in addition to protection against discriminatory decisions based on supposedly objective data processing programmes, the aim of the provision is also to create transparency and fairness in decision-making. Decisions on the exercise of individual freedoms should not be left unchecked to the logic of algorithms. This is because algorithms work with correlations and probabilities that do not necessarily follow a causality and do not necessarily lead to results that are "correct" according to human insight. Rather, erroneous, unfair or discriminatory conclusions can be drawn from the systematisation of accurate individual data, which - if they become the basis of a decision-making process - considerably affect the freedom rights of the person concerned and degrade him or her from the subject to the object of a depersonalised decision. This is particularly true if the data subject is not aware of the use of algorithms or - if he or she is - cannot discern which data are included in the decision, with what weight and by which methods of analysis. Yet it is precisely this concern of the legislator - to make a human corrective for automated data processing mandatory in principle and to allow exceptions only in limited cases (Article 22(2) GDPR) - that is thwarted, because the automatically generated score value in essence occupies a predominant position in the decision-making of the third-party controller.
The legislator sought to resolve this basic conflict through the prohibition contained in Article 22(1) of the GDPR "at the expense", as it were, of the third-party controller, by attaching it to the (final) decision vis-à-vis the data subject. Beyond that, procedural requirements relevant to profiling are formulated only in recital 71 sentence 6. Apart from that, the permissibility of data processing for the purpose of profiling results at best from the general processing conditions of Article 6(1) of the GDPR. This follows both from Article 21(1) sentences 1 and 2 of the GDPR, which refers to Article 6(1)(e) and (f) of the GDPR as possible legal bases for profiling, and from recital 72 sentence 1, according to which profiling is subject to the provisions of the GDPR governing the processing of personal data, i.e. also to the legal bases for processing and the data protection principles.
As a result of these merely rudimentary requirements of the GDPR on profiling on the one hand and the fundamental postulate of Article 22(1) of the GDPR on the other, the problem of effective legal enforcement by data subjects arises in particular. Alongside supervisory control, this is the decisive enforcement mechanism of the GDPR. This is shown not only by the carefully balanced and comprehensively regulated rights to lodge complaints and to take legal action under Articles 77 et seq. of the GDPR, but also by the accompanying data subject rights under Articles 12 et seq. of the GDPR. The aim of the GDPR is to enable and mobilise the responsible EU citizen to enforce the law, in particular through information rights and transparency requirements.
These rights are undermined by the interplay between the activities and (lack of) obligations of the credit agencies and the decision-making practice of the third-party controllers. Although the data subject has a general right to information from the credit agencies under Article 15 of the GDPR, this does not do justice to the special features of profiling, which the GDPR seeks to address through Article 15(1)(h), Article 21(1) sentence 2 and Article 22 of the GDPR. Within the framework of the general right to information, the credit agencies are not obliged to disclose the logic and composition of the parameters that are decisive for the creation of the score value; nor do they do so in practice, invoking, for reasons of competition, their trade and business secrets.
The third-party controller is likewise unable to provide the data subject with information on the creation of the score value that is decisive for its decision, because it does not know the logic involved; the credit agency does not disclose it.
This creates a gap in legal protection: the party from whom the information required by the data subject could be obtained is not obliged to provide it under Article 15(1)(h) of the GDPR because it purportedly does not operate its own "automated decision-making" within the meaning of that provision, while the party that bases its decision-making on the automatically generated score, and is therefore obliged to provide information under Article 15(1)(h) of the GDPR, cannot provide the required information because it does not have it.
If the creation of the score value by a credit agency falls within the scope of Article 22(1) of the GDPR, this gap in legal protection closes. The creation of score values would then fall under the prohibition of Article 22(1) of the GDPR, so that, being based on exclusively automated processing, it would be permissible only under the exceptional circumstances of Article 22(2) of the GDPR; this corresponds to the intention of the EU legislator to at least contain such decisions by regulation. Moreover, taking into account the opening clause of Article 22(2)(b) of the GDPR, this approach enables detailed regulation of such decision-making by the Member States, which the other provisions of the GDPR on profiling and automated decision-making preclude (see question 2).
The gap in legal protection is also not sufficiently closed by the data subject's right to object pursuant to Article 21(1) sentence 1 of the GDPR. According to this provision, the data subject has the right "to object at any time, on grounds relating to his or her particular situation, to the processing of personal data concerning him or her which is carried out on the basis of Article 6(1)(e) or (f); this also applies to profiling based on these provisions". However, in the case of credit agencies, data subjects typically do not know that they have become the subject of an automated scoring procedure. They typically only find out once a third-party controller has already taken an adverse decision against them with reference to the score value. At that point, however, the right to object no longer helps them, at least with regard to the closed case; they can exercise it only with regard to future data processing by the credit agency.
On question 2:
Member State regulations on scoring
Pursuant to Section 31(1) BDSG, the permissibility of using a probability value concerning a certain future behaviour of a natural person for the purpose of deciding on the establishment, performance or termination of a contractual relationship with that person (scoring) depends on the fulfilment of further requirements. According to Section 31(2) BDSG, the use of a probability value determined by credit agencies concerning the solvency and willingness to pay of a natural person, where information about claims is included, is permissible only insofar as the requirements of Section 31(1) BDSG are met and only such claims in respect of a service owed but not provided despite being due are taken into account as meet further specific requirements; the permissibility under general data protection law of the processing, including the determination of probability values, of other data relevant to creditworthiness remains unaffected.
Thus, in Section 31 of the BDSG, the German legislature essentially makes provisions on scoring as a subset of profiling. The referring court has considerable doubts as to the compatibility of these regulations with Article 22 of the GDPR because the German legislature only regulates the "use" of the "probability value", but not the creation of the probability value itself.
Article 22 of the GDPR is conclusive in that it regulates profiling only to the extent that it forms the basis of a decision based on it. Accordingly, the reference point of the prohibition is only the decision, not the profiling that precedes it. Neither Article 22 of the GDPR nor other provisions of the GDPR formulate specific substantive requirements for the lawfulness of data processing for the purpose of profiling in the form of scoring itself. With regard to profiling, there are only provisions on information obligations in Article 14(2)(g) of the GDPR and on the right of access in Article 15(1)(h) of the GDPR - in each case, however, only with regard to the existence of automated decision-making, not to profiling itself - as well as on the data subject's right to object in Article 21(1) sentence 1 of the GDPR and in other provisions that are irrelevant to the proceedings at issue.
In the absence of specific provisions, the admissibility of profiling - insofar as it is not covered, in the form of scoring, by Article 22 of the GDPR via the decision based on it - is therefore otherwise governed by the general processing conditions of Article 6 of the GDPR. By attaching further substantive admissibility requirements to scoring, the German legislator regulates the matter beyond the requirements of Articles 6 and 22 of the GDPR. However, it lacks the regulatory power to do so.
In particular, a corresponding regulatory power cannot be derived from Article 22(2)(b) of the GDPR. The GDPR provides for a Member State power to regulate profiling only where the decision is based exclusively on automated processing. Section 31 BDSG, by contrast, covers non-automated decisions without distinction, while regulating the admissibility of the use of probability values established by data processing. According to the systematics of Article 22 of the GDPR and the general processing conditions of Article 6 of the GDPR, however, the admissibility of decisions that are not based solely on automated processing, including profiling, is governed by Article 6 of the GDPR. This subject matter is beyond the reach of the national legislator; the absence of more specific requirements for profiling may be regarded as a deliberate decision of the EU legislator not to regulate it. The Member State legislator cannot simply override this - within the framework of Article 22(2)(b) of the GDPR it may act, at most, only where its provisions lay down requirements exclusively for decisions based solely on automated processing.
This applies in particular against the background that the GDPR is a regulation within the meaning of Article 288(2) TFEU. According to the established case law of the ECJ, the national legislator is already precluded, with regard to requirements under directives, from making legally conclusive assessments - in this case through Section 31 BDSG - of abstractly formulated requirements of the European legislator - in this case Articles 6 and 22 of the GDPR (ECJ, judgment of 19 October 2016, Breyer v Germany, C-582/14, ECLI:EU:C:2016:779, paras 62 et seq.). This must apply a fortiori to requirements laid down - as here - in a regulation.
Significantly, the German legislator does not specify in its explanatory memorandum to Section 31 BDSG on what its regulatory competence with regard to this provision is based. The explanatory memorandum consists of more or less general statements to the effect that the provision takes over the predecessor provisions of Sections 28a and 28b BDSG (old version) and that their substantive content remains relevant. By contrast, the draft bill of the Federal Ministry of the Interior of 11 November 2016, p. 93 f., still claimed that the Member State's regulatory power resulted from a "synopsis of Articles 6(4) and 23(1)" of the GDPR. However, this approach - untenable in itself - was apparently abandoned in the course of the legislative process.
IV.
According to all this, a referral to the European Court of Justice is required. The outcome of the dispute depends on the questions referred.
The outcome of the proceedings depends first on question 1. If Article 22(1) of the GDPR is to be interpreted as meaning that the creation of a score by a credit agency is an independent decision within the meaning of that provision, then this activity - the respondent's relevant activity - would be subject to the prohibition of automated individual decision-making. A Member State legal basis within the meaning of Article 22(2)(b) of the GDPR would consequently be required, for which only Section 31 BDSG comes into consideration. However, there are serious doubts as to its compatibility with Article 22(1) of the GDPR. The respondent would then not only act without a legal basis, but would ipso iure violate the prohibition of Article 22(1) of the GDPR. As a result, the plaintiff would at the same time have a claim against the defendant for (renewed) consideration of her case by the supervisory authority.

If the answer to question 1 is no - i.e. profiling itself is not a decision within the meaning of Article 22(1) and (2) of the GDPR - then the opening clause of Article 22(2)(b) of the GDPR does not apply to national rules on profiling. Given the fundamentally conclusive character of the GDPR, which is designed for full harmonisation, another basis of regulatory power for national rules would therefore have to be found. Since such a power is not apparent and, in particular, does not follow from the rudimentary provisions of the GDPR on profiling, the national regulation in Section 31 BDSG is inapplicable. This changes the scope of review of the national supervisory authority, which would then have to measure the compatibility of the activities of credit agencies against Article 6 of the GDPR.

V.

This decision is final.