NAIH (Hungary) - NAIH-85-3/2022
Relevant Law: Article 5(1)(a) GDPR, Article 5(1) GDPR, Article 5(2) GDPR, Article 6(1) GDPR, Article 6(1)(f) GDPR, Article 6(4) GDPR, Article 12(1) GDPR, Article 13 GDPR, Article 21 GDPR, Article 21(2) GDPR, Article 24(1) GDPR, Article 25(1) GDPR, Article 25(2) GDPR
Parties: Budapest Bank Zrt.
National Case Number/Name: NAIH-85-3/2022
European Case Law Identifier: n/a
Original Source: NAIH (in HU)
Initial Contributor: Cesar Manso-Sayao
The Hungarian DPA fined Budapest Bank approximately €700,000 for carrying out automated decision-making and profiling based on emotional AI analysis of customer service calls without a valid legal basis, a proper balancing of interests, or adequate safeguards. The DPA also held that the bank failed to provide data subjects with information related to the processing and their right to object.
English Summary
Facts
In September 2021, the Hungarian DPA initiated an ex officio investigation against Budapest Bank Zrt. (hereinafter the Bank) related to the use of Artificial Intelligence (AI) software applied to the audio recordings of customer service telephone conversations between May 2018 and the start of the investigation.
According to the Bank, the software used speech signal processing based on AI to identify periods of silence, different voices talking at the same time, key words, and emotional elements (such as voice speed, volume and pitch) within the recorded sound files in order to identify customer dissatisfaction. Once the software had made an automated decision to identify calls according to these criteria, a Bank employee then listened to the recordings, and made call-backs to customers in order to handle and attempt to resolve any customer dissatisfaction issues.
The Bank stated that its legal basis for this processing was based on legitimate interest, and its purpose was to conduct call quality control, to prevent complaints and customer churn, as well as to increase efficiency.
The Bank stated that customers were informed at the beginning of the calls that they were being recorded, but admitted that they did not inform them that the AI software would be used to analyse the calls, since detailed information in this regard would make the introduction to the calls too long, outlasting many of the simple queries made by customers when calling the Bank. The Bank also claimed that the system did not store any identifiable personal data, or perform automated decision-making in order to create personal profiles.
Additionally, in a Data Protection Impact Assessment carried out by the Bank, the Data Protection Officer stated that: “The purpose of the processing is lawful on the basis of the rights of the data subjects and the business interests of the Bank, there is no direct or indirect legal prohibition. The processing is high-risk for several reasons, in particular the novelty of the technology used, as the audio recordings are analysed and findings are made automatically by artificial intelligence. The aggregate data is suitable for profiling or scoring for both sets of data subjects [customers and employees], and although no automated decision making is involved, the data processing may have legal effects on the data subjects. The high risk is mitigated by the controller through measures identified in the impact assessment, such as human decision-making at the end of automated processing. The exercise of data subjects' rights is ensured in accordance with standard practice.”
Holding
Personal data
The NAIH first established that the software processed personal data, since the data subject was indeed identifiable within this processing: the customer service calls are assigned a unique internal identification number that can be linked to both the caller and the customer service employee. According to the NAIH, this processing was analogous to the case law of the Court of Justice of the European Union in Case C-582/14, which established that dynamic IP addresses can also constitute personal data.
The NAIH also stated that the use of AI to identify emotional states should be considered processing of a sensitive nature, and could fall under the special category of personal data within the meaning of Article 9(1) GDPR in certain cases. However, the NAIH held that in this specific case Article 9(1) GDPR did not apply to the processing, since the voice analysis did not produce data that in itself could uniquely identify a data subject (and therefore could not be considered biometric data), and due to the fact that no meaningful inference as to the physical or mental state of health of the data subject could be drawn from the result of the processing.
Automated decision-making and profiling
The NAIH held that automated decision-making was carried out in this case, since it is not a prerequisite that the software makes the decision itself; it is sufficient if the processing is intended to produce an outcome that influences the decision-makers. The NAIH also established that profiling took place according to the definition in Article 4(4) GDPR, since the prioritisation of dissatisfied customers based on keywords and emotions implies the evaluation of personal aspects cited in this provision.
Based on these assessments, and the fact that this is a novel technology, the NAIH noted that the processing created increased risks to fundamental rights, which also imply increased responsibilities on the controller. Therefore, the NAIH held that before rolling out the automated voice analysis using emotional AI, the Bank should have assessed whether the processing was feasible under the current technical and social circumstances, and should have taken into consideration appropriate safeguards to comply with data protection laws and the principle of data protection by design. Based on these considerations, the NAIH held that the Bank’s failure to carry out these obligations constituted a violation of Articles 24(1), 25(1) and 25(2) GDPR.
Lack of proper information and right to object
The NAIH noted that no information was given to the data subjects regarding the voice analysis, in particular about the specific types of data processed, as well as how their emotional reactions were processed and assessed. According to the NAIH, this constituted a breach of Articles 12(1), 13, 5(1) and 5(2) GDPR.
Furthermore, in line with its previous assessments regarding automated decision-making and profiling, the NAIH held that the absence of information given to data subjects regarding their right to object led to a breach of Article 21 GDPR. Additionally, the NAIH considered that processing for customer retention purposes constituted a marketing purpose similar to customer acquisition, and that the Bank therefore also violated data subjects’ right to object under Article 21(2) GDPR.
Balancing of interests and lawfulness of processing
The NAIH held that the Bank had provided no concrete evidence that it had carried out an adequate balance of interests between its claimed legitimate interest to carry out the processing, and the rights of the data subjects involved.
The NAIH noted that according to the technical documentation provided by the Bank, the effectiveness of the emotion analysis software is actually relatively low, and that the Bank had failed to prove that, in its current form, its use was suitable to achieve its proposed objectives in a way that was proportionate to the effect on data subjects’ rights. The NAIH also noted that the Bank had not demonstrated that any alternatives to this processing were considered.
The NAIH also cited the European Data Protection Board and European Data Protection Supervisor’s Joint Opinion 5/2021 on the Artificial Intelligence Act, which states that “the use of AI to infer emotions of a natural person is highly undesirable and should be prohibited, except for certain well-specified use-cases, namely for health or research purposes.” Based on these criteria, the NAIH concluded that the Bank’s stated efficiency purposes were not proportionate enough to justify the use of a form of data processing which EU data protection bodies have considered undesirable and which constitutes a high risk to data subjects' fundamental rights.
The NAIH also noted that not only the voices of the Bank's customers were analysed, but also the voices of its employees. The NAIH stated that although monitoring performance and quality assurance may give rise to legitimate interests in certain circumstances according to labour law, the question of suitability and proportionality was also relevant in this case, especially because employees are in a vulnerable position in the context of a labour relationship. The NAIH established that these factors were not taken into account due to the Bank's failure to conduct an adequate balance of interests, and that an adequate system of guarantees was not provided for employees.
Therefore, the NAIH held that the Bank could not claim legitimate interest as a valid legal basis under Article 6(1)(f) GDPR (or any other legal basis listed in Article 6(1) GDPR) for the processing in question. It therefore held that the Bank had violated Articles 5(1)(a), 6(1) and 6(4) GDPR.
Fine and order to comply with GDPR
Based on these considerations, the NAIH imposed a fine of HUF 250,000,000 (approximately €700,000) on the Bank, and ordered the Bank to cease its use of AI to analyse emotions in the recordings of customer service calls unless it provided proof, within 60 days, that: an appropriate scope of data was defined; a proper data protection impact assessment was carried out; and a valid legal basis was provided which ensured that data subjects’ rights are protected to the maximum extent possible.
With regard to the Bank's employees, the NAIH held that processing should be limited to what is necessary for the purposes for which it is intended, and that they should be provided with appropriate information, indicating the assessment criteria and consequences, and including a specific balancing of interests that addresses their vulnerability due to the nature of their labour relationship, with appropriate internal safeguards.
English Machine Translation of the Decision
The decision below is a machine translation of the Hungarian original. Please refer to the Hungarian original for more details.