
BlnBDI - 31.05.2023
Authority: BlnBDI (Berlin)
Jurisdiction: Germany
Relevant Law: Article 15(1)(h) GDPR
Article 22(3) GDPR
Type: Complaint
Outcome: Upheld
Started:
Decided:
Published: 31.05.2023
Fine: 300,000 EUR
Parties: n/a
National Case Number/Name: 31.05.2023
European Case Law Identifier: n/a
Appeal: Unknown
Original Language(s): German
Original Source: BlnBDI (Berlin) (in DE)
Initial Contributor: mg

In a rare case of enforcement of Article 22(3) GDPR, the German DPA (Berlin) fined a bank €300,000 for failing to provide the data subject with a meaningful explanation of the logic involved in an automated decision affecting them.

English Summary

Facts

The data subject applied for a credit card with the controller - a bank. The controller asked the data subject to fill in a form providing various personal data. An algorithm analysed this information and automatically rejected the data subject’s request, despite the fact that the data subject had both a good credit score and a high income.

The data subject asked the controller to provide details about the factors which led to the automated rejection. The controller refused to provide access to such information.

The data subject lodged a complaint with the German DPA (Berlin).

Holding

The German DPA (Berlin) upheld the data subject’s complaint.

The German DPA defined an “automated decision” as a decision taken by algorithmic models without human intervention. The case at issue fell within the scope of such a definition. According to the supervisory authority, when a controller makes use of tools involving automated decision-making, it is subject to specific transparency requirements under Article 22(3) GDPR.

In particular, the controller must provide access to concrete information about the database used as well as the factors and criteria influencing the decision. By disregarding this duty in the context of the data subject’s access request, the controller violated Articles 5(1)(a), 15(1)(h) and 22(3) GDPR.

Therefore, the German DPA fined the controller €300,000.

Comment

Share your comments here!

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the German original. Please refer to the German original for more details.

The Berlin Commissioner for Data Protection and Freedom of Information (BlnBDI) has imposed a fine of 300,000 euros on a bank for lack of transparency in an automated individual decision. The bank had refused to provide a customer with comprehensible information about the reasons for the automated rejection of a credit card application. The company cooperated extensively with the BlnBDI and accepted the fine notice.

An automated decision is a decision that an IT system makes solely on the basis of algorithms and without human intervention. In such cases, the General Data Protection Regulation (GDPR) provides for special transparency obligations. Personal data must be processed in a way that is comprehensible for the data subjects. Data subjects have a right to an explanation of the decision made after such an assessment. If data subjects request information, the controller must provide meaningful information about the logic involved in the automated decision.

In this case, however, the bank did not comply with this in its digital credit card application process. Using an online form, the bank requested various data on the applicant's income, occupation and personal details. Based on the requested information and additional data from external sources, the bank's algorithm rejected the customer's application without any specific justification. The algorithm is based on criteria and rules previously defined by the bank.

Since the customer had a good Schufa score and a regular high income, he doubted the automated rejection. When asked, the bank only provided general information on the scoring process that was detached from his individual case. However, it did not tell him why it reached a negative assessment in his case. The complainant was therefore unable to understand which database and which factors the rejection was based on, or according to which criteria his credit card application had been rejected. Without this individual justification, it was also not possible for him to challenge the automated individual decision in a meaningful way. He then lodged a complaint with the data protection authority.

Meike Kamp, Berlin Commissioner for Data Protection and Freedom of Information: "When companies make automated decisions, they are obliged to justify them in a sound and comprehensible manner. Those affected must be able to understand the automated decision. The fact that the bank did not provide transparent and comprehensible information about the automated rejection in this case, even on request, resulted in a fine. A bank is obliged to inform customers of the main reasons for a rejection when making an automated decision about a credit card application. This includes specific information on the database and the decision-making factors as well as the criteria for rejection in the individual case."

The BlnBDI determined that, in the specific case, the bank violated Article 22(3) in conjunction with Article 15(1)(h) GDPR. When assessing the fine, the BlnBDI took particular account of the bank's high turnover and the deliberate design of the application process and the information provided. Among the mitigating factors, the bank acknowledged the violation, had already implemented changes to its processes, and announced further improvements.