AEPD (Spain) - PS/00120/2021
Relevant Law: Article 5(1)(c) GDPR; Article 6 GDPR; Article 9 GDPR; Article 12 GDPR; Article 13 GDPR; Article 25(1) GDPR; Article 35 GDPR; Article 57(1) GDPR; Article 83(4)(a) GDPR; Article 83(5)(a) GDPR; Article 83(5)(b) GDPR
Parties: ASOCIACION DE CONSUMIDORES Y USUARIOS EN ACCION - FACUA
National Case Number/Name: PS/00120/2021
European Case Law Identifier: n/a
Original Source: AEPD (in ES)
The Spanish DPA fined Mercadona, a supermarket chain, €3,150,000 (reduced to €2,520,000) in relation to its video surveillance system that used biometric data to identify individuals who had previously committed crimes at its store and who were banned from entering.
English Summary[edit | edit source]
Facts[edit | edit source]
The Spanish DPA (AEPD) launched an investigation into Mercadona, a supermarket chain, after learning through the media that it was using a video surveillance system with facial recognition to prevent people convicted of robbery or other crimes against Mercadona, and subject to entry bans in force, from accessing its premises.
Two complaints were subsequently lodged in this regard, one by a consumer association and one by an association dealing with computer-enabled crime.
Mercadona used the system from 1 June 2020 until 6 May 2021, when the AEPD issued an interim measure ordering the controller to stop the processing. The matter was also brought before a Spanish court in the meantime, which likewise ordered the processing to stop in AP Barcelona - Auto 72/2021.
The system used a facial recognition process that compared a "dubious biometric sample", obtained from one or more images of a person, against a database of biometric samples already associated with the identity of a person, previously registered through one or more images of that person. To this end, the "dubious biometric samples" were transformed into patterns through algorithmic calculations and evaluated against previously established matching thresholds.
The data processing included the capture, matching, storage and destruction of the biometric image of any person entering the supermarket; in the case of a negative identification, the image was destroyed 0.3 seconds after its collection.
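The threshold-based matching described above can be illustrated with a minimal sketch. This is not Mercadona's actual system: the similarity metric, pattern vectors, identities and threshold value are all invented for illustration; real systems derive patterns from face images with trained models.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two pattern vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(dubious_pattern, enrolled, threshold=0.95):
    """Compare a "dubious" pattern against enrolled patterns.

    Returns the best-matching enrolled identity only if its similarity
    clears the pre-set threshold; otherwise returns None (a negative
    identification, after which the decision says the image was destroyed).
    """
    best_id, best_score = None, threshold
    for identity, pattern in enrolled.items():
        score = cosine_similarity(dubious_pattern, pattern)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Hypothetical enrolled database: identity -> pattern vector.
enrolled = {"person_A": [0.9, 0.1, 0.4], "person_B": [0.1, 0.8, 0.6]}

print(identify([0.88, 0.12, 0.41], enrolled))  # close to person_A's pattern
print(identify([0.5, 0.5, 0.5], enrolled))     # below threshold: negative identification
```

The threshold trades off false matches against missed matches, which is why the decision's later discussion of error rates and algorithmic bias matters: a system like this errs in both directions.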
Mercadona, the controller, stated that it relied on the public interest legal basis of Article 6(1)(e) GDPR, as the purpose of the processing was to ensure the safety of people and goods, as well as of its premises. The national law invoked was the Private Security Act.
As regards biometric data, the controller acknowledged that it was processing special categories of data under Article 9 GDPR and claimed to rely on the exception in Article 9(2)(f), since the data were processed to comply with court judgments allowing the controller to use electronic means to enforce the measures those judgments imposed (such as entry bans).
The controller argued that such a system was the only adequate way of actually enforcing entry bans, since Mercadona has 1,623 shops and 95,000 employees, and that it offered more legal guarantees and reliability than any other system, none of which could ensure such control.
Mercadona also displayed informative banners in all 40 shops in which the system was used.
Holding[edit | edit source]
On Articles 6, 9 and 5(1)(c) GDPR[edit | edit source]
Special categories of data[edit | edit source]
The DPA started by confirming that the data processed by Mercadona fell within the special categories of data under Article 9 GDPR, since it was biometric data used for the purpose of biometric identification (as opposed to biometric authentication). As the DPA remarked, facial recognition systems are identification systems that are highly intrusive into rights and freedoms.
The DPA also noted that the processing was carried out remotely, continuously and in an automated way, using algorithms to create the patterns, which gave rise to an extreme risk, as it could lead to indiscriminate mass surveillance.
Therefore, the controller needed a valid exception under Article 9(2). According to the DPA, the controller could not rely on the exception in Article 9(2)(g), regarding public interest, since such an interest must be established by national law, which must also specify the circumstances, limits, rules and measures for applying the exception, and must be proportionate. Since no national law allows this type of processing, the controller could only have relied on explicit consent.
The DPA also remarked that everyone who entered any of the controller's shops where the system was used was treated as a convicted subject, since the controller's justification for using the system was to control and prevent only the entry of convicted persons.
The judgments allowed the use of electronic means to implement the system, as requested by the controller, in some cases even mentioning facial recognition, in accordance with the measures permitted by the Spanish Criminal Code. However, the AEPD concluded that such a measure may only affect the rights of the convicted persons, and noted that not all of the judgments mention facial recognition. In particular, the DPA stressed that the use of such a system must take into account the nature and context of the situation giving rise to the processing, including the seriousness, probability and scale of the potential harm and the consequences for the rights, guarantees and freedoms of all affected persons, including the convicted ones. Moreover, a judgment allowing the use of such means should have set out the necessary and proportionate conditions and guarantees to be implemented, which the judgments did not actually do, leaving this to the discretion of the controller.
In this regard, the DPA also considered it significant that the controller had tried to secure legitimacy for the processing in advance by directly requesting the courts to allow it to use a facial recognition system to control entry, and that it had done so without first carrying out a data protection impact assessment (DPIA), an analysis of the (extreme) risk, and a prior consultation of the AEPD, as it should have done. That assessment, which should have preceded the request for the courts' permission, should have led the controller to conclude that the risk was unacceptable. Additionally, the DPA stated, the electronic means used by the controller should have affected only the convicted persons concerned by the judgments, not third persons such as Mercadona's clients and workers.
Legal basis[edit | edit source]
The DPA also remarked that the controller did not have an appropriate legal basis under Article 6 to rely on. As with the exception in Article 9(2)(g), the public interest legal basis of Article 6(1)(e) must be defined by law, including a mention of the affected interests, the restrictions on its use, and its limits and conditions. This limits public powers and ensures the principle of legal certainty.
In this case, however, there is no real connection between the security measure the system is used for and any public interest; it pursues only the private interest of the controller. The DPA also distinguished between activities connected to a public interest, which benefit society as a whole and whose proportionality a judge or court should assess, and activities in which public interest is invoked to legitimize the massive processing of everyone's data, so that everyone is treated as a convicted person.
In any case, the DPA argued, in line with the previous judgment, that no such public interest exists, since the company was pursuing only a private interest.
Analysis of the legal bases and exceptions[edit | edit source]
In its analysis of legal bases and exceptions, the AEPD differentiated between three types of processing: the processing of convicted persons' data, the processing of potential clients' data, and the processing of Mercadona workers' data.
With regards to the data of the convicted persons, Mercadona invoked the exception in Article 9(2)(f), regarding the processing of data for legal claims. However, the DPA concluded that reliance on this exception was not valid.
In this case, the legal claims had already been exercised or defended. Moreover, the existence of a legal claim does not entitle the controller to process such data per se; other conditions must be met. In accordance with Recital 52, such processing must be exceptional and necessary, and it requires adequate guarantees; the legal text must therefore be interpreted restrictively. In this sense, the AEPD compared this exception to Article 10 GDPR, which also requires the processing of criminal data to take place under the supervision of a public authority; here, the processing itself was not supervised, only the potential consequences deriving from it (such as non-compliance with the judgment). The DPA also remarked that even if the court itself were the one carrying out the processing, it could only process data of the convicted persons, since the measures contained in a judgment can only affect convicted persons. What a court cannot do should not be allowed to a private actor.
With regards to the legal basis under Article 6, the DPA stated, as already explained, that the basis in Article 6(1)(e) must first be defined by law and, above all, that such a public interest did not exist, since the company was pursuing only a private interest.
With regards to the data of potential clients, Mercadona also tried to rely on the exception in Article 9(2)(f), which, as explained, is not valid. The DPA reiterated that a court can only establish measures in its judgment that affect the rights of convicted persons; third persons cannot have their rights affected. These third persons include children, minors and vulnerable people. This totally disproportionate measure, as the AEPD remarked, violates the spirit of the GDPR.
The DPA concluded that, even if the exception in Article 9(2)(f) applied, a measure taken in the framework of a criminal procedure can only affect the persons covered by the judgment; otherwise, it would indirectly amount to imposing a criminal measure en masse on unrelated third persons. This would generate a perverse effect, translating in practice into a large-scale facial recognition system, highly intrusive into people's rights and freedoms, that would pose an unacceptable risk.
Although it is true that Spanish law, both Article 22 of the Data Protection Act and Article 42 of the Private Security Act, allows for video surveillance systems, this does not include facial recognition systems, which pose a much greater risk, are more intrusive, and are not meant to be used for private interests.
With regards to Mercadona's workers, the AEPD concluded that they were not taken into account in the DPIA carried out by the controller, even though they were especially affected. In accordance with the Article 29 Working Party's Opinion, the controller should have weighed its legitimate interests against the reasonable privacy expectations of the employees, outlining the risks posed by this technology and undertaking a proportionality assessment, which was never done. The use of the technology was clearly disproportionate, also because of the risk that it could result in indirect monitoring of the workers.
The DPA also referred to the new provision in Spanish labour law providing for algorithmic transparency of artificial intelligence systems that affect workers, as it found a lack of transparency regarding the functioning of the system. This is also connected with Articles 5(1)(a), 12, 13 and 14 GDPR, and with Article 89 of the Data Protection Act, which provides for a privacy right for workers.
In conclusion, a measure that affects only a very small number of convicted persons does not legitimize the use of this technology. There was no legal basis, nor any exception under Article 9, that could legitimize the processing. Therefore, Articles 6 and 9 had been violated.
Proportionality assessment[edit | edit source]
Data processing requires a proportionality assessment. The assessment must entail three requirements: an adequacy assessment, a necessity assessment and a proportionality assessment in the strict sense (a balancing of rights and freedoms). The assessment must additionally be carried out at the right moment, i.e. before actually carrying out the processing, and it requires a detailed look when dealing with biometric data, which pose a higher risk. It must be weighed whether the resulting loss of privacy is proportionate to any anticipated benefit.
The processing must be essential to fulfil the need. This also means that if there is a less intrusive way to achieve the pursued end, that way must be followed. The processing must therefore not merely be useful, but strictly necessary to achieve the purpose.
According to the DPA, the processing was neither proportionate, since it affected the rights of every potential client and employee when it should only have affected convicted persons, nor necessary, since there were less intrusive ways of achieving the purpose, such as keeping photographs of the convicted persons in every premises so that the security staff could recognize them. The AEPD also remarked that the system might not even be adequate for its purposes, since convicted persons could easily fool it, for example by wearing a mask, so it might be neither useful nor effective.
This is also linked with Articles 5(1)(c) and 25(1) GDPR. The fact that the processing was authorized by a judgment does not make it necessary, especially since the judgment did not provide for any safeguards; these therefore had to be provided by the controller, who is responsible for compliance, in line with the accountability principle. The controller still has to comply with the data protection rules.
The DPA also remarked that it had not been proven that the controller had adopted any technical measures to prevent the transfer of data to third parties, including international data transfers.
Minimization principle[edit | edit source]
With regards to Article 5 GDPR, the DPA also noted that the minimization and purpose limitation principles must be respected, particularly the minimization principle of Article 5(1)(c) GDPR. However, the very nature of facial recognition systems leads to a massive processing of biometric data, which calls for reinforced guarantees, also because of the high number of affected data subjects.
The processing at stake was, additionally, not proportionate: it could be argued that it was adequate, but it was neither necessary nor strictly proportionate, since less intrusive alternatives existed and the rights and risks were not properly balanced. The processing was therefore excessive: the controller was processing the data of every potential client and employee solely to control a small number of convicted persons. The minimization principle was thus infringed, resulting in a violation of Article 5(1)(c) GDPR.
Personal data of children[edit | edit source]
The AEPD placed special emphasis on the fact that the controller should have carefully considered the risks entailed by the processing of personal data of children and vulnerable persons, in accordance with Article 28(2) of the Data Protection Act.
Conclusion[edit | edit source]
Hence, the DPA concluded that there was no possibility of relying on the exception in Article 9(2)(g), that there was no valid legal basis under Article 6(1), and that the necessity, proportionality and minimization principles had not been respected. Therefore, Articles 6(1), 9(1) and 5(1)(c) GDPR were violated.
On Articles 12 and 13 GDPR[edit | edit source]
The DPA concluded that Mercadona had not respected the transparency principle, since the controller did not provide adequate information to data subjects. Firstly, the banners informing about the system only mentioned the use of the facial recognition system in relation to convicted persons, without making any reference to the data of the supermarket's clients.
The information was, additionally, misleading, since it stated that the purpose was the security of the clients, when the system actually pursued only a private interest of the controller; the clients' security could have been achieved with a standard video surveillance system.
Moreover, the controller did not specify in which shops in particular the system was being used (nor its duration or actual purpose), thereby limiting the clients' ability to decide not to enter the shops actually using it. Their right to self-determination, their freedom and their privacy were violated in this way.
Nor was there any information on international data transfers, which could occur under the data processing agreement with the processor, even though the controller denied such a possibility.
On Article 25 GDPR[edit | edit source]
The AEPD also analyzed how the high error rate of facial recognition systems relates to data protection by design. According to the DPA, algorithmic bias, produced by the lack of training data from vulnerable groups, such as racialized people, women, children and elderly people, may lead to discrimination and social exclusion, which poses an unacceptable risk by design. Moreover, in the context of the COVID-19 pandemic, the risk of error was higher due to the use of masks.
Therefore, the DPA concluded that there had been a violation of Article 25(1) GDPR.
On Article 35 GDPR[edit | edit source]
The AEPD determined that the controller should have carried out a data protection impact assessment prior to the processing, in accordance with Article 35 GDPR, since the processing can be considered high risk according to the EDPB Guidelines. The controller, however, requested the court's permission to use a facial recognition system before carrying out the DPIA.
A proper understanding of proactive accountability and privacy by design implies assessing, from the very first outline of a processing activity involving personal data, whether it can be carried out at all. Thus, the moment the idea arose of requesting the courts' permission to use facial recognition should have been the moment to assess and detect the risks to the rights and freedoms of citizens.
Additionally, the DPA remarked that the risks arising from such automated processing are high in themselves and, in fact, unacceptable, since the initial inherent risk cannot be reduced to adequate levels of residual risk, given the prohibition in Article 9(1) GDPR. The processing occurs without human intervention, in such a way that data subjects are unable to exercise their rights of erasure and objection.
The DPIA that was carried out (extemporaneously) also failed to take into account different risks, namely:
- The fact that facial recognition entails an involuntary processing of personal data, to which data subjects cannot object, and which collects a very large amount of data.
- The risk of discrimination, social exclusion and infringement of the accuracy principle due to the high ratio of error of these systems.
- The risk of stigmatization of the convicted persons.
- The risk of making every client a potential suspect subject to surveillance.
- The specific risks regarding vulnerable collectives.
- The risk of loss of privacy and intimacy.
The lack of consideration of such risks de facto invalidates the DPIA.
The AEPD concluded that Article 35 GDPR had been violated.
Sanction and amount of the fine[edit | edit source]
The AEPD fined Mercadona a total of €3,150,000, reduced by 20% to €2,520,000 for early payment. The AEPD also ordered Mercadona to stop the processing, in line with the interim measure it had taken during the procedure.
The amount of the fine was divided as follows:
- €2,000,000 for the violation of Articles 6 and 9 GDPR.
- €100,000 for the violation of Articles 12 and 13 GDPR.
- €500,000 for the violation of Article 5(1)(c) GDPR.
- €500,000 for the violation of Article 25(1) GDPR.
- €50,000 for the violation of Article 35 GDPR.
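The figures above can be checked with a short calculation: the per-infringement amounts sum to the headline fine, and the 20% early-payment reduction yields the reduced amount.

```python
# Component fines per infringement, as listed in the decision.
components = {
    "Art. 6 and 9": 2_000_000,
    "Art. 12 and 13": 100_000,
    "Art. 5(1)(c)": 500_000,
    "Art. 25(1)": 500_000,
    "Art. 35": 50_000,
}

total = sum(components.values())        # headline fine
reduced = int(total * 0.8)              # after the 20% early-payment reduction

print(total, reduced)  # 3150000 2520000
```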
In order to determine the amount of the fine, the DPA took into account, as a mitigating factor, the absence of recidivism and repetition.
As aggravating factors:
- The fact that the fine needs to be effective, proportionate and dissuasive; the size of the company (more than €25,000,000,000 in revenue in 2019, 90,000 employees and 1,636 shops) was taken into account in this regard.
- The nature, gravity and duration of the infringement, taking into account that the processed data entail special categories of data and the volume of data that were processed, including data from minors and vulnerable persons. The DPA remarked that the processing was carried out in a remote, massive and indiscriminate way.
- The fact that Mercadona did not make a prior consultation to the DPA, despite the risk posed by the processing to the employees' and clients' rights and freedoms.
- The fact that Mercadona was a controller and bore full responsibility for deciding on the processing.
- The fact that the processing entailed a systematic and exhaustive processing of special categories of data.
- The fact that the DPA became aware of the processing through two complaints unrelated to the controller.
- The continuous nature of the infringement, since the processing was carried out from 1 June 2020 until 6 May 2021.
- The link between the controller's business activity and the processing of personal data.
- The fact that the processing affected children's personal data.
Comment[edit | edit source]
This case deals with the same facts as the court judgment AP Barcelona - Auto 72/2021, where the court also found a violation of the GDPR and ordered Mercadona to stop the processing.
Further Resources[edit | edit source]
Share blogs or news articles here!
English Machine Translation of the Decision[edit | edit source]
The decision below is a machine translation of the Spanish original. Please refer to the Spanish original for more details.