AEPD (Spain) - EXP202100603




Latest revision as of 09:01, 16 May 2023

AEPD - EXP202100603
Authority: AEPD (Spain)
Jurisdiction: Spain
Relevant Law: Article 4(14) GDPR
Article 6(1) GDPR
Article 9 GDPR
Article 35 GDPR
Type: Complaint
Outcome: Upheld
Started: 21.09.2021
Decided:
Published:
Fine: 200,000 EUR
Parties: GSMA LTD
National Case Number/Name: EXP202100603
European Case Law Identifier: n/a
Appeal: Unknown
Original Language(s): Spanish
Original Source: AEPD (in ES)
Initial Contributor: Bernardo Armentano

The Spanish DPA imposed a fine of €200,000 on an event organisation company for using an accreditation system with facial recognition without having carried out a substantive risk analysis.

English Summary

Facts

In 2021, the controller organised a congress in Barcelona and required in-person attendees to upload their identity documents when registering for the event on the controller's website.

At the time of registration, participants could choose the type of accreditation to be carried out at the entrance to the event: automated or manual accreditation.

Automated accreditation relied on a facial recognition system that compared the captured image of the participant's face with the photo on their identity document. Participants who opted for automated accreditation were asked to consent to the use of their biometric data.

A data subject, who had been invited as a speaker, filed a complaint with the Spanish DPA claiming that there was no legal basis for performing facial recognition in this context.

The controller argued that the purpose of using facial recognition was to avoid personal contact at the entrance to the event and thus prevent the spread of the coronavirus. In any case, the controller stated that participants were asked for consent, pursuant to Article 6(1)(a) GDPR, and that they were given an alternative accreditation option. The controller also stated that around 20,000 people attended the event, and it presented a data protection impact assessment to the DPA.

Holding

The Spanish DPA began its analysis by recalling that a facial image falls under the definition of biometric data in Article 4(14) GDPR.

According to the DPA, biometric data has the particular characteristic of being produced by the body itself, which means that, in principle, it cannot be modified at the will of the individual. Unauthorised access to biometric data therefore enables identity theft that is not only permanent in time, but can also be used to defraud several systems that rely on the same data.

The DPA went on to state that the processing of data of such a sensitive nature entails a significant risk for the fundamental rights and freedoms of individuals. In this line of reasoning, it considered that facial recognition systems should only be used when essential or indispensable, not for mere convenience. Therefore, before implementing such a system, the controller must carry out a necessity and proportionality assessment that takes into consideration the least intrusive alternative options available.

In the present case, the DPA found that, although there was a legal basis for the processing of biometric data, the controller should have carried out a data protection impact assessment that effectively considered such risks. As examples of the risks involved in facial recognition, the DPA pointed to Article 29 Working Party Opinion 3/2012, which mentions risks related to accuracy, storage in large databases, loss of quality, confidentiality and availability, among others.

In the DPA's view, the controller did not contemplate these different elements and scenarios in its risk assessment. It considered the impact assessment provided to be merely nominal, since it did not examine any substantive aspects, nor did it assess the risks or the proportionality and necessity of using facial recognition technology.

Based on this, the DPA found a violation of Article 35 GDPR and imposed a fine of €200,000 on the controller.

Comment

Share your comments here!

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the Spanish original. Please refer to the Spanish original for more details.