AEPD (Spain) - EXP202100603

From GDPRhub
Authority: AEPD (Spain)
Jurisdiction: Spain
Relevant Law: Article 4(14) GDPR
Article 6(1) GDPR
Article 9 GDPR
Article 35 GDPR
Type: Complaint
Outcome: Upheld
Started: 21.09.2021
Decided:
Published:
Fine: 200,000 EUR
Parties: GSMA LTD
National Case Number/Name: EXP202100603
European Case Law Identifier: n/a
Appeal: Unknown
Original Language(s): Spanish
Original Source: AEPD (in ES)
Initial Contributor: Bernardo Armentano

The Spanish DPA imposed a fine of €200,000 on an event organization company for using a facial recognition system to admit participants to a congress without having carried out a substantive risk analysis.

English Summary

Facts

In 2021, the controller organized a congress in Barcelona and required in-person attendees to upload their identity document when registering on the controller's website. There was no such requirement for participants attending virtually.

In addition, the controller's privacy policy requested consent for the use of a facial recognition system. This system, operated by a contracted company (the processor), compared the image of the participant's face captured upon entering the venue with the photo on the identity document.

A data subject, who had been invited as a speaker, filed a complaint with the Spanish DPA claiming that there was no legal basis for carrying out facial recognition.

In response, the controller claimed that the upload of the document was based on Article 6(1)(c) GDPR, as the Barcelona police required this information for security reasons. It also claimed that the facial recognition system was intended to avoid personal contact when entering the venue and thus prevent the spread of the coronavirus. In any case, the controller stated that participants were asked for consent, pursuant to Article 6(1)(a) GDPR, and that they were given an alternative accreditation option. Finally, the controller stated that around 20,000 people attended the event and presented a data protection impact assessment.

Holding

The Spanish DPA began its analysis by recalling that a facial image falls under the definition of biometric data in Article 4(14) GDPR.

According to the DPA, biometric data has the particular characteristic of being produced by the body itself, which makes it, in principle, impossible for the individual to change at will. As a result, identity theft involving such data is not only permanent in time, but can also serve to defraud several systems that rely on the same data.

The DPA went on to state that the processing of data of such a sensitive nature entails a significant risk to the fundamental rights and freedoms of individuals. In this line of reasoning, it considered that facial recognition systems should only be used when essential or indispensable, not for mere convenience. Therefore, before implementing such a system, the controller has to carry out a necessity and proportionality assessment that takes into consideration the least intrusive alternatives available.

In the present case, the DPA found that, although there was a legal basis for the processing of biometric data, the controller should have carried out a data protection impact assessment that effectively considered such risks. As examples of the risks involved in facial recognition, the DPA pointed to Article 29 Working Party Opinion 3/2012, which mentions, among others, risks related to accuracy, storage in large databases, loss of quality, confidentiality and availability.

However, in the view of the DPA, the controller did not contemplate these different elements and scenarios in its risk assessment. It considered the impact assessment provided to be merely nominal, since it did not examine any substantive aspects, nor did it assess the risks or the proportionality and necessity of using facial recognition technology.

Based on this, the DPA found a violation of Article 35 GDPR and imposed a fine of €200,000 on the controller.

Comment

Share your comments here!

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the Spanish original. Please refer to the Spanish original for more details.