AEPD (Spain) - PS/00120/2021
Authority: AEPD (Spain)
Jurisdiction: Spain
Relevant Law: Article 5(1)(c) GDPR
Article 6 GDPR
Article 9 GDPR
Article 12 GDPR
Article 13 GDPR
Article 25(1) GDPR
Article 35 GDPR
Article 57(1) GDPR
Article 83(4)(a) GDPR
Article 83(5)(a) GDPR
Article 83(5)(b) GDPR
LOPDGDD
Type: Investigation
Outcome: Violation Found
Started:
Decided: 23.07.2021
Published: 26.07.2021
Fine: 3150000 EUR
Parties: ASOCIACION DE CONSUMIDORES Y USUARIOS EN ACCION - FACUA
Mercadona, S.A.
National Case Number/Name: PS/00120/2021
European Case Law Identifier: n/a
Appeal: Unknown
Original Language(s): Spanish
Original Source: AEPD (in ES)
Initial Contributor: n/a

The Spanish DPA fined Mercadona, a supermarket chain, €3,150,000 (reduced to €2,520,000 for early payment) in relation to its video surveillance system, which used biometric data to identify individuals who had previously committed crimes at its stores and who were banned from entering.

English Summary

Facts

The Spanish DPA (AEPD) launched an investigation into Mercadona, a supermarket chain, after learning via the media that it was using a video surveillance system with facial recognition to prevent access to its premises by people convicted of robbery or other crimes related to Mercadona and subject to entry bans in force.

Afterwards, two complaints were also lodged in this regard by a consumer association and an association dealing with computer-enabled crime.

Mercadona used this system from 1 June 2020 until 6 May 2021, when the AEPD issued an interim measure ordering the controller to stop the processing. In the meantime, the matter was also brought before a Spanish court, which ordered the processing to be stopped in AP Barcelona - Auto 72/2021.

The system used a facial recognition process that compares a "dubious biometric sample", obtained from one or more images of a person, against a database of biometric samples already associated with the identity of a person, which have previously been registered through one or more images of that person. To this end, the "dubious biometric samples" are transformed into patterns through algorithmic calculations that are evaluated against previously established matching thresholds.

The data processing included the capture, matching, storage and, in the case of a negative identification, destruction (0.3 seconds after collection) of the captured biometric image of any person entering the supermarket.
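
The described flow can be illustrated with a minimal, hypothetical sketch of threshold-based biometric identification. This is not Mercadona's actual system; the template format, similarity measure and threshold value are assumptions used only to show how a probe sample is compared against enrolled templates and discarded on a negative identification.

import numpy as np
from typing import Dict, Optional

MATCH_THRESHOLD = 0.6  # hypothetical similarity threshold; real systems tune this value

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two biometric templates (feature vectors)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, enrolled: Dict[str, np.ndarray]) -> Optional[str]:
    """Compare a probe ('dubious') template against the enrolled templates.

    Returns the best-matching identity if its score reaches the threshold,
    otherwise None (a negative identification).
    """
    best_id, best_score = None, -1.0
    for identity, template in enrolled.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= MATCH_THRESHOLD else None

def process_capture(probe: np.ndarray, enrolled: Dict[str, np.ndarray]) -> None:
    """Capture -> match -> alert or discard, mirroring the flow described above."""
    match = identify(probe, enrolled)
    if match is None:
        # Negative identification: per the decision, the captured image was
        # destroyed roughly 0.3 seconds after its collection.
        del probe
    else:
        # A positive identification of an enrolled (banned) person raised an alert.
        print(f"Alert: enrolled identity detected -> {match}")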

Mercadona, the controller, stated that it was relying on public interest under Article 6(1)(e) GDPR for the processing, as its purpose was to ensure the safety of people and goods, as well as of its premises. The particular national law invoked was the Private Security Act.

As regards biometric data, the controller acknowledged that it was processing special categories of data under Article 9 GDPR and that it relied on the exception in Article 9(2)(f), since it was processing the data to comply with the court judgments that allowed the controller to use electronic means to enforce the measures they contained (such as entry bans).

The controller argued that the use of such a system was the only adequate way of actually enforcing entry bans, since Mercadona has 1,636 shops and approximately 95,000 employees, and that the system provided more legal guarantees and reliability than any other system, which could not ensure effective control.

Mercadona also displayed informative banners in all (40) shops in which the system was used.

Holding

The DPA held that Mercadona had violated Articles 5(1)(c), 6(1), 9(1), 12, 13, 25(1), and 35 GDPR.

On Articles 6, 9 and 5(1)(c) GDPR

Special categories of data

The DPA started by confirming that the data processed by Mercadona was included in the special categories of data from Article 9 GDPR, since it is biometric data that is used for the purposes of biometric identification (as opposed to biometric authentication). As remarked by the DPA, facial recognition systems are identification systems that are very intrusive for rights and freedoms.

The DPA also noted that the processing was carried out remotely, continuously and automatically, using algorithms to create the patterns, which resulted in an extreme risk, as it could lead to indiscriminate mass surveillance.

Therefore, the controller needed a valid exception under Article 9(2). According to the DPA, the controller could not rely on the exception in Article 9(2)(g), regarding public interest, since such an interest must be laid down by national law, which must also specify the circumstances, limits, rules and measures for applying the exception and be proportionate. Since there is no national law allowing this type of processing, the controller could only have relied on explicit consent.

The DPA also remarked that all persons who entered any of the controller's shops where the system was used were treated as convicted subjects, since the controller's justification for using the system was to control and prevent only the entry of convicted persons.

The judgments allowed for the use of electronic means to implement the system, as requested by the controller, in some cases even mentioning facial recognition, in accordance with the measures allowed by the Spanish Criminal Code. However, the AEPD concluded that such a measure may only affect the (rights of the) convicted persons. Additionally, not all judgments mentioned facial recognition. In particular, the DPA noted that the use of such a system should take into account the nature and context of the situation leading to the processing, including the seriousness, probability and extent of the potential harm and the consequences for the rights, guarantees and freedoms of all affected persons, including the convicted persons. Moreover, the judgments allowing the use of such means should have included the necessary and proportionate conditions and guarantees to be implemented, which they did not do, leaving this to the discretion of the controller.

In this regard, the DPA also considered it significant that the controller had tried to establish in advance the legitimacy of such processing by directly asking the courts to allow it to use a facial recognition system to control entry, and that it had done so without first carrying out a data protection impact assessment (DPIA), an analysis of the (extreme) risk or a prior consultation with the AEPD, as it should have done. Such an assessment, which should have been completed before requesting the courts' permission, would have led the controller to conclude that the risk was unacceptable. Additionally, the DPA stated that the electronic means used by the controller should only have affected the convicted persons concerned by the judgments, not third persons such as Mercadona's clients and workers.

Legal basis

The DPA also remarked that the controller did not have an appropriate legal basis under Article 6 to rely on. As with the exception in Article 9(2)(g), the public interest legal basis in Article 6(1)(e) must be defined by law, including a mention of the affected interests and of the restrictions, limits and conditions on its use. This places a limit on public powers and ensures the principle of legal certainty.

However, in this case, there was no real connection between the security measure the system is used for and the public interest; the processing only pursued the private interest of the controller. The DPA also distinguished activities that are connected to a public interest, which benefit society as a whole and whose proportionality should be assessed by a judge or court, from activities in which the public interest is invoked to legitimize the massive processing of everyone's data, so that everyone is treated as a convicted person.

In any case, the DPA argued, in line with the previous judgment, that no such public interest existed, since the company was only pursuing a private interest.

Analysis of the legal bases and exceptions

In its analysis of legal bases and exceptions, the AEPD distinguished three types of processing: the processing of convicted persons' data, the processing of potential clients' data, and the processing of Mercadona workers' data.

With regard to the data of convicted persons, Mercadona invoked the exception in Article 9(2)(f), regarding the processing of data for legal claims. However, the DPA concluded that reliance on this exception was not valid.

In this case, the legal claims had already been exercised or defended. Additionally, the existence of a legal claim does not entitle the controller to process such data per se; other conditions must be met. In accordance with Recital 52, this may only be done exceptionally and where necessary, and adequate guarantees are required. The legal text must therefore be interpreted restrictively. In this sense, the AEPD compared this exception to Article 10 GDPR, which also requires the processing of criminal data to take place under the supervision of a public authority; in this case, the processing itself was not supervised, only the potential consequences deriving from it (such as non-compliance with the judgment). The DPA also remarked that if, for example, the court itself were to carry out the processing, it could only process the data of the convicted persons, since the measures contained in a judgment can only affect convicted persons. What a court cannot do should not be permitted for a private actor either.

With regard to the legal basis under Article 6, the DPA stated, as already explained, that the basis in Article 6(1)(e) must first be defined by law and, above all, that no such public interest existed, since the company was only pursuing a private interest.

With regard to the data of potential clients, Mercadona also tried to rely on the exception in Article 9(2)(f), which, as explained, is not valid. The DPA reiterated that a court can only establish measures in its judgment that affect the rights of convicted persons; third persons cannot have their rights affected. These third persons include children, minors and vulnerable people. Such a totally disproportionate measure, as the AEPD remarked, violates the spirit of the GDPR.

The DPA concluded that, even if the exception in Article 9(2)(f) applied, the measure, which is taken in the framework of a criminal procedure, can only affect the persons covered by the judgment; otherwise, it would indirectly mean massively imposing a criminal measure on unrelated third persons. This would generate a perverse effect, translating in practice into the establishment of a large-scale facial recognition system, highly intrusive to people's rights and freedoms, that would pose an unacceptable risk.

Although it is true that Spanish law, both Article 22 of the Data Protection Act and Article 42 of the Private Security Act, allows for video surveillance systems, this does not cover facial recognition systems, which pose a much bigger risk, are more intrusive, and are not meant to be used for private interests.

With regard to Mercadona workers, the AEPD concluded that they were not taken into account in the DPIA carried out by the controller, even though they were especially affected. In accordance with the Opinion of the A29WP, the controller should have weighed its legitimate interests against the reasonable privacy expectations of the employees by outlining the risks posed by this technology and undertaking a proportionality assessment, which was never done. The use of the technology was clearly disproportionate, also because it risked resulting in an indirect monitoring of the workers.

The DPA also referred to the new provision in Spanish labour law providing for algorithmic transparency of artificial intelligence systems that affect workers, as it found a lack of transparency regarding the functioning of the system. This is also connected with Articles 5(1)(a), 12, 13 and 14 GDPR, and with Article 89 of the Data Protection Act, which provides for a privacy right for workers.

In conclusion, a measure that concerns only a very small number of convicted persons does not legitimize the use of this technology. There is no legal basis, nor any exception under Article 9, that can legitimize the processing. Therefore, Articles 6 and 9 had been violated.

Proportionality assessment

Data processing requires a proportionality assessment. The assessment entails three requirements: an adequacy assessment, a necessity assessment and a proportionality assessment in the strict sense (a balancing of rights and freedoms). The assessment must additionally be carried out at the right moment, i.e. before actually carrying out the processing. It also requires a detailed examination when dealing with biometric data, which poses a higher risk. Whether the resulting loss of privacy is proportional to any anticipated benefit must be weighed.

The processing must be essential to fulfill the need. This also means that if there is a less intrusive way to achieve the pursued end, it shall be followed. Therefore, the processing must not just be useful, but strictly necessary to achieve the purpose.

According to the DPA, the processing was neither proportionate, since it affected the rights of every potential client and the employees when it should only affect convicted persons, nor necessary, since there are less intrusive ways of achieving the purpose, such as keeping photographs of the convicted persons at each premises so that security staff can recognise them. The AEPD also remarked that, in this case, the system may not even be adequate for its purpose, since it would be easy for the convicted persons to fool it by using, for example, a mask, so it may be neither useful nor effective.

This is also linked with Articles 5(1)(c) and 25(1) GDPR. The fact that the processing is authorized by a judgment does not make it necessary, especially since the judgment does not provide for any safeguards; these should therefore have been established by the controller, which is responsible for compliance, also in accordance with the accountability principle. The controller still has to comply with the data protection rules.

The DPA also remarked that it was not proven that the controller had adopted any technical measures to avoid the transfer of data to third parties, including international transfers of data.

Minimization principle

With regard to Article 5 GDPR, the DPA also noted that the minimization and purpose limitation principles must be respected, particularly the minimization principle in Article 5(1)(c) GDPR. However, the very nature of facial recognition systems leads to a massive processing of biometric data, which requires reinforced guarantees, also because of the high number of affected data subjects.

The processing activity at stake is, additionally, not proportionate: even if it could be argued that it is adequate, it is neither necessary nor strictly proportionate, since there are less intrusive alternatives and the rights and risks are not properly balanced. The processing is therefore excessive; the controller processes the data of every potential client and employee solely for the purpose of controlling a small number of convicted persons. The minimization principle was thus infringed, so there had been a violation of Article 5(1)(c) GDPR.

Personal data of children

The AEPD put special emphasis on the fact that the controller should have carefully considered the risks that the processing of personal data of children and vulnerable persons entails, in accordance with Article 28(2) of the Data Protection Act.

Conclusion

Hence, the DPA concluded that the controller could not rely on the exception in Article 9(2)(g), that there was no valid legal basis under Article 6(1), and that the necessity, proportionality and minimization principles had not been respected. Therefore, Articles 6(1), 9(1) and 5(1)(c) GDPR were violated.

On Articles 12 and 13 GDPR

The DPA concluded that Mercadona had not respected the transparency principle, since the controller did not provide adequate information to data subjects. Firstly, the banners informing about the system only mentioned the use of the facial recognition system in relation to convicted persons, without making any reference to the data of the supermarket's clients.

The information was, additionally, misleading, since it stated that the purpose was the security of the clients, when the processing actually only pursued a private interest of the controller; the security of the clients could be achieved with a standard video surveillance system.

Also, the controller did not specify in which particular shops the system was being used (nor its duration or actual purpose), thereby limiting the clients' ability to decide not to enter the shops that were actually using it. Their right to self-determination, their freedom and their privacy were thereby violated.

Nor was there information on international transfers of data, which might occur under the data processing agreement with the processor, even though the controller denied that possibility.

Therefore, the AEPD concluded that Articles 12 and 13 GDPR had been violated.

On Article 25 GDPR

The AEPD also analyzed how the high error rate of facial recognition systems is linked to data protection by design. According to the DPA, algorithmic bias, produced by the lack of training data on vulnerable groups, such as racialized people, women, children and elderly people, may lead to discrimination and social exclusion, which poses an unacceptable risk by design. Moreover, in the context of the COVID-19 pandemic, the risk of error is higher due to the use of masks.

Therefore, the DPA concluded that there had been a violation of Article 25(1) GDPR.
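
For illustration only, the error-rate disparity the DPA refers to is typically quantified as false match and false non-match rates computed separately per demographic group. The following sketch uses invented placeholder scores (not figures from the decision or from any real system) purely to show how such per-group rates are derived:

from collections import defaultdict

# Placeholder comparison records: (group, is_same_person, similarity_score).
# All values are invented for illustration; they do not come from the decision.
comparisons = [
    ("group_a", True, 0.82), ("group_a", False, 0.41), ("group_a", False, 0.71),
    ("group_b", True, 0.55), ("group_b", True, 0.79), ("group_b", False, 0.35),
]

THRESHOLD = 0.6  # hypothetical decision threshold

def error_rates_by_group(records, threshold):
    """Compute false match rate (FMR) and false non-match rate (FNMR) per group."""
    stats = defaultdict(lambda: {"fm": 0, "impostor": 0, "fnm": 0, "genuine": 0})
    for group, same_person, score in records:
        s = stats[group]
        if same_person:
            s["genuine"] += 1
            if score < threshold:
                s["fnm"] += 1  # genuine pair wrongly rejected
        else:
            s["impostor"] += 1
            if score >= threshold:
                s["fm"] += 1  # impostor pair wrongly accepted
    return {
        group: {
            "FMR": s["fm"] / s["impostor"] if s["impostor"] else 0.0,
            "FNMR": s["fnm"] / s["genuine"] if s["genuine"] else 0.0,
        }
        for group, s in stats.items()
    }

print(error_rates_by_group(comparisons, THRESHOLD))
# Uneven FMR/FNMR across groups is the kind of bias the DPA links to
# under-representation in training data.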

On Article 35 GDPR

The AEPD determined that the controller should have carried out a data protection impact assessment prior to the processing, in accordance with Article 35 GDPR, since the processing can be considered high risk under the EDPB Guidelines. However, the controller requested the courts' permission to use a facial recognition system before carrying out the DPIA.

A proper understanding of proactive accountability and privacy by design implies assessing, from the very moment a processing activity is first outlined, whether it can be carried out at all. Thus, the moment at which the idea of requesting the use of facial recognition before the courts arose should have been the moment to assess and detect the risks to the rights and freedoms of citizens.

Additionally, the DPA remarked that the risks arising from such automated processing are high in themselves and, in fact, unacceptable, since the initial inherent risk cannot be reduced to an adequate residual level, given the prohibition in Article 9(1) GDPR. Such processing occurs without human intervention, in such a way that data subjects are unable to exercise their rights to erasure and to object.

The DPIA that was carried out (extemporaneously) also failed to take into account different risks, namely:

  • The fact that facial recognition entails an involuntary processing of personal data, to which data subjects cannot object, and that gathers a very high amount of data.
  • The risk of discrimination, social exclusion and infringement of the accuracy principle due to the high error rate of these systems.
  • The risk of stigmatization of the convicted persons.
  • The risk of making every client a potential suspect subject to surveillance.
  • The specific risks regarding vulnerable collectives.
  • The risk of loss of privacy and intimacy.

The lack of consideration of such risks de facto invalidates the DPIA.

The AEPD concluded that Article 35 GDPR had been violated.

Sanction and amount of the fine

The AEPD fined Mercadona a total of €3,150,000, which was reduced by 20% to €2,520,000 for early payment. The AEPD also ordered Mercadona to stop the processing, in line with the interim measure it had taken during the procedure.

The amount of the fine was divided as follows:

  • €2,000,000 for the violation of Articles 6 and 9 GDPR.
  • €100,000 for the violation of Articles 12 and 13 GDPR.
  • €500,000 for the violation of Article 5(1)(c) GDPR.
  • €500,000 for the violation of Article 25(1) GDPR.
  • €50,000 for the violation of Article 35 GDPR.

In order to determine the amount of the fine, the DPA took into account, as a mitigating factor, the absence of recidivism and repeated infringement.

As aggravating factors:

  • The fact that the fine needs to be effective, proportionate and dissuasive. The size of the company (more than €25,000,000,000 revenue in 2019, 90,000 employees and 1,636 shops) was taken into account in this regard.
  • The nature, gravity and duration of the infringement, taking into account that the processed data entail special categories of data and the volume of data that were processed, including data from minors and vulnerable persons. The DPA remarked that the processing was carried out in a remote, massive and indiscriminate way.
  • The fact that Mercadona did not carry out a prior consultation with the DPA, despite the risk the processing posed to the employees' and clients' rights and freedoms.
  • The fact that Mercadona was a controller and had full responsibility for deciding on the processing.
  • The fact that the processing entailed a systematic and exhaustive processing of special categories of data.
  • The fact that the DPA became aware of the processing through two complaints from parties unrelated to the controller.
  • The continuous nature of the infringement, since the processing was carried out from 1/06/2020 until 6/05/2021.
  • The link between the controller's business activity and the processing of personal data.
  • The fact that the processing affected children's personal data.

Comment

This case deals with the same facts as the court judgment AP Barcelona - Auto 72/2021, where the court also found a violation of the GDPR and ordered Mercadona to stop the processing.

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the Spanish original. Please refer to the Spanish original for more details.

     Procedure No.: PS / 00120/2021



       RESOLUTION OF TERMINATION OF THE PROCEDURE BY VOLUNTARY PAYMENT

Of the procedure instructed by the Spanish Agency for Data Protection and based on
the following

                                 BACKGROUND

FIRST: On May 5, 2021, the Director agreed to initiate the sanctioning procedure
against MERCADONA, S.A. (hereinafter, the claimed party). Once the initiation
agreement had been notified and the allegations presented had been analysed, the
proposed resolution was issued on June 29, 2021, and is transcribed below:

      << Procedure number: PS / 00120/2021


      Of the procedure instructed by the Spanish Agency for Data Protection and
      based on the following:

                                    BACKGROUND

      FIRST: On July 6, 2020, the Director of the Spanish Agency of Data Protection
      (hereinafter, AEPD) agreed to initiate investigation actions in view of the news
      published in the media about the system that Mercadona, S.A. (hereinafter,
      Mercadona or the claimed party) would be implementing in its establishments to
      detect persons with final sentences and restraining orders in force against
      Mercadona or against any of its workers.

      Subsequently, two claims are registered in the AEPD in relation to the
      same facts:


      On July 15, 2020, registration number 025103/2020, from the
      ASSOCIATION OF CONSUMERS AND USERS IN ACCION-FACUA (NIF

      G91344986).

      On July 27, 2020, registration number 026511/2020, from
      APEDANICA (NIF G80593254).

      SECOND: In view of the facts denounced in the claims and the documents
      provided by the claimants, and of the facts and documents of which this Agency
      became aware, the Subdirectorate General for Data Inspection proceeded to carry
      out preliminary investigation actions for the clarification of the facts in
      question, by virtue of the powers of investigation granted to the supervisory
      authorities in article 57.1 of Regulation (EU) 2016/679 (General Data Protection
      Regulation, hereinafter RGPD), and in accordance with the provisions of Title VII,
      Chapter I, Second Section, of Organic Law 3/2018, of December 5, on the
      Protection of Personal Data and guarantee of digital rights (hereinafter
      LOPDGDD).

      As a result of the investigative actions carried out, it is found that the
      controller of the processing is the claimed party.


      Likewise, the following points are found:

      INVESTIGATED ENTITIES


      During these proceedings, investigations have been carried out on the
      following entities:


       Mercadona, S.A., with NIF A46103834 and domiciled in Paseo de la
      Castellana No. 259 C, 28046 Madrid.

      The defendant had a turnover in 2019 of more than 25,000 million euros and
      more than 94,000 employees, as recorded in the latest audit report issued by the
      entity; it therefore constitutes a large enterprise.

      RESULT OF RESEARCH ACTIONS


      The writing of these results is based on the information provided by
      Mercadona (entry registration numbers 026455/2020, 026457/2020,

      026459/2020, 026460/2020, 026461/2020, 026462/2020, 026463/2020,
      026464/2020, and 027549/2020) and in the following documents incorporated into the
      present file through the corresponding diligence:


      - Reference number 1: Official Gazette of the Mercantile Registry (hereinafter
      BORME) of *** DATE.1, (…).


      - Reference number 2: BORME of *** DATE.2, (…).

      - Reference number 3: consultation made on November 5, 2020 of the
      entity *** EMPRESA.1 in the Axesor business information service.


      - Reference number 4: report of the legal office of the AEPD of
      reference number 010308/2019.


      - Reference number 5: Guidelines on automated individual decision-making and
      profiling for the purposes of Regulation 2016/679, of the Article 29 Working
      Party on the protection of personal data.




      - Reference number 6: extract from Law 5/2014, of April 4, of
      Private security.


      - Reference number 7: extract of the Organic Law 10/1995, of 23 of
      November, of the Penal Code.


      - Reference number 8: extract from the Royal Decree of July 24, 1889
      by which the Civil Code is published.

      - Reference number 9: extract from the Spanish Constitution.


      - Reference number 10: report of the legal office of the AEPD of
      reference number 36/2020.


      - Reference number 11: opinion of the ICO (Information Commissioner's
      Office) entitled “The use of live facial recognition technology by law enforcement

      in public places ”, published on October 31, 2019.

      - Reference number 12: privacy policy published on Mercadona's website,
      whose latest update, according to the document, took place on October 5, 2020.

      It is noted that where this report refers to statements, descriptions or
      submissions made by Mercadona "in writing", this expression refers to the letter
      registered as received on July 25, 2020 with the number 026455/2020.


      In order to achieve the greatest possible clarity of exposition, the results of
      the investigation are presented in the following sections:


      1. Context and deployment

      2. Interveners, recipients, and international data transfers


      3. Contribution of the image to the judicial procedure and inclusion in the
                Early Detection System (hereinafter SDA)

      4. SDA activation, detection, and alert


      5. Reception and validation of the alert, and communication to the Forces and
                State Security Bodies (hereinafter FCSE)


      6. Terms of conservation of personal data

      7. System architecture, impact assessment, and security measures


      8. Purpose, legality, and proportionality

      9. Compliance with the duty of information



      1. Context and deployment


      Mercadona, as described in its own submission, is a “global company that
      dedicates, among other activities of its corporate purpose, to the exploitation of
      a chain of food supermarkets”. Thus, according to the data it provides, it has
      “1,636 stores and approximately 95,000 workers in Spanish territory". It also
      adds that, "at a generic level, it could be determined that the approximate
      number of people who access a MERCADONA store each day is *** NUM. 1”.


      It also states that “each year, the Company has approximately *** NUM. 2
      judicial processes that can end in more than *** NUM. 3 judicial decisions in its
      favor in which the defendant is firmly condemned with restraining orders on the
      MERCADONA facilities”. In this regard, it states that the following people are
      the object of a complaint and are therefore “susceptible to a request for an order
      prohibiting access to a Company store”:
      people who:

      - “They are repeat offenders in the crime of robbery or theft against MERCADONA.

      - They have stolen a large number of products that can be sold

      - Have been reported and convicted of crimes related to

      MERCADONA facilities, goods or workers

      - They threaten or attack their own workers or security guards

      that provide service in MERCADONA stores

      - They commit illegal acts on MERCADONA's clients ”.

      In line with the foregoing, it states that “the implementation of an early
      detection system using facial recognition technology in their stores […]
      motivated by the risk derived from the commission of criminal acts, with its
      corresponding risk for MERCADONA customers and employees due to the large
      number of crimes that are committed in its more than 1,600 centers distributed
      throughout the Spanish geography, against their employees or goods”.

      Mercadona explains that “a facial recognition process consists of comparing a doubted biometric sample, obtained through one or more images of a person, against a database of biometric samples already undoubtedly associated with the identity of a person, which have been previously registered through one or more photographs”. To do this, it adds, “the doubted biometric samples are transformed into patterns. Subsequently, through facial recognition, the biometric samples are compared to the undoubted template previously saved, through algorithmic calculations that are evaluated based on previously established matching thresholds”.
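
      Purely as an illustration of the threshold-based comparison described above (templates derived from images, compared by algorithmic calculation against a pre-established matching threshold), the following minimal Python sketch may help; the embedding source, vector size and threshold value are assumptions and are not taken from the file:

<pre>
import numpy as np

# Hypothetical matching threshold; the value actually used by the system is redacted in the file.
MATCH_THRESHOLD = 0.80

def to_template(embedding: np.ndarray) -> np.ndarray:
    """Normalise a face embedding so that comparison reduces to a dot product."""
    return embedding / np.linalg.norm(embedding)

def similarity(doubted: np.ndarray, undoubted: np.ndarray) -> float:
    """Cosine similarity between a 'doubted' sample and an enrolled 'undoubted' template."""
    return float(np.dot(to_template(doubted), to_template(undoubted)))

def is_match(doubted: np.ndarray, undoubted: np.ndarray,
             threshold: float = MATCH_THRESHOLD) -> bool:
    """A detection is raised only when the similarity reaches the pre-established threshold."""
    return similarity(doubted, undoubted) >= threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    enrolled = rng.normal(size=128)   # stand-in for a template enrolled from a judicially provided image
    probe = rng.normal(size=128)      # stand-in for a sample captured by a store camera
    print(similarity(probe, enrolled), is_match(probe, enrolled))
</pre>

      The sketch only conveys the general mechanics (template, similarity score, fixed threshold); the actual engine, features and scoring used by the system are not disclosed in the file.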

      Mercadona describes that the procedure consists of the following phases (document number 1 of brief 026457/2020 lists, in addition to these phases, the actions that each of them includes):

      - Contribution of the image to the judicial procedure.


      - Inclusion of the image in the SDA.


      - Activation of the SDA.

      - Detection phase.


      - Alert phase.

      - Reception and validation of the alert.


      - Communication with FCSE.

      The condensed information on the processing can be consulted in the extract of the record of processing activities provided by Mercadona as part of document number 29 of brief 026463/2020, which includes the data processing activities related to the SDA. The following information on it is anticipated here, the details of which are expanded in the following sections:

      - Data processing: management of the early detection system

      - Category of processed data: identification data; image; biometric profile.

      - Category of interested parties: subjects who access Mercadona centers; subjects with a final conviction.

      - Origin of the data:

      o Undoubted image: through the image provided in a final judgment to which Mercadona is a party.

      o Real-time image: data captured through the cameras with a facial recognition system in the centers in which this system is active.

      - Legitimation:

      o Public interest

      o Sensitive data: processing necessary for the formulation, exercise, or defense of claims

      - Recipients: FCSE; Courts and tribunals

      Regarding the deployment of the system, Mercadona states “that on July 1, 2020 the Early Detection System pilot project began in *** NUMBER 4 stores”. It adds, however, that “the system is solely and exclusively active in *** NUMBER 5 stores of *** LOCALIDAD.1, that is, in the stores that are currently affected by a final judicial decision in which a restraining order has been decreed as a measure, MERCADONA having provided the corresponding images in the procedure and the Court having established the possibility, to make it effective, of the use of technological means.”

      In relation to future deployment, Mercadona explains that the purpose of the system is to protect the safety of customers and employees, “so the criteria to be followed in the deployment will be evaluated [sic] according to the most vulnerable areas, where there may be a greater risk to MERCADONA clients or workers, according to the number of ongoing legal proceedings”. Regarding the number of interested parties to be included in the SDA, it states that “within those *** NUM. 3 restraining orders on MERCADONA facilities, the maximum number of interested parties included annually [sic] in the System could be estimated”. However, it clarifies that “these numbers are an approximation and may be increased or decreased as a function of the courts' own knowledge of the technology or of requests that could be made directly by the FCS”.




      2. Interveners, recipients and international data transfers


      In its letter, Mercadona lists the following participants in the project:

      - The Mercadona Security Department.


      Specifically, the following profiles are mentioned:


      o (…)

      Mercadona states that its personnel have signed a specific confidentiality commitment related to this project (in addition to the commitments signed by any Mercadona employee). Thus, it provides, as document number 8 of brief 026464/2020, an example copy of this confidentiality commitment.

      o (…)


      o The provision of the service implies the carrying out by *** COMPANY.2 of the processing of registration, conservation and deletion of personal data, insofar as it is necessary for its execution.

      o Mercadona guarantees and declares that it has a sufficient legitimate basis for the processing of the data of the interested parties covered by this Agreement, in accordance with the provisions of the data protection regulations.

      o In general, the subcontracting to third parties of services that imply the access to and/or processing, partial or total, of personal data is prohibited, unless *** COMPANY.2 has prior, express and written authorization from Mercadona.

      o Mercadona's personal data will be processed by *** COMPANY.2 only to carry out the provision of the service. If *** COMPANY.2 considers it necessary to carry out data processing for a different purpose, it must previously request the written authorization of Mercadona. In the absence of such authorization, *** COMPANY.2 will not be able to carry out said processing.

      o The categories of interested parties whose data will be processed by *** COMPANY.2 by virtue of this agreement are: Mercadona customers, people with a restraining order or analogous judicial measure with respect to Mercadona facilities, and people captured by the facial recognition system.

      o *** COMPANY.2 will only process identifying data (name, surname and image) and the personal data associated with the biometric pattern under this Agreement.

      o *** COMPANY.2 undertakes to guarantee, taking into account the state of the art, implementation costs, and the nature, scope, context and purposes of the processing, as well as the risks of varying probability and severity for the rights and freedoms of natural persons, the application of appropriate technical and organizational measures to ensure a level of security appropriate to the risk, which, if applicable, includes, among others: pseudonymisation and encryption of personal data; the ability to guarantee the permanent confidentiality, integrity, availability and resilience of the systems; the ability to restore availability and access to personal data quickly in the event of a physical or technical incident; and a process of regular verification, evaluation and assessment of the effectiveness of the technical and organizational measures to guarantee the security of the processing.


      o *** COMPANY.2 undertakes to notify Mercadona, without undue delay and within a maximum period of 72 hours, of the personal data security breaches of which it becomes aware, providing support in the notification to the AEPD or other competent supervisory authority and, where appropriate, to the interested parties, of the security breaches that occur, as well as providing support, where necessary, in conducting privacy impact assessments and in prior consultation with the AEPD or another competent supervisory authority, where appropriate.


      o *** COMPANY. 2 undertakes to keep, in writing, a record of all
      the categories of processing activities carried out on behalf of
      Mercadona.


      o *** COMPANY. 2 undertakes to make available to Mercadona all
      the information necessary to demonstrate compliance with the obligations

      established in this Agreement and to allow and contribute to the realization of
      audits, including inspections, by Mercadona or a third party

      authorized by Mercadona.

      o The seventh stipulation of the agreement details the obligations of secrecy and
      confidentiality (as well as the establishment of measures for their protection) to

      those that both parties are subject to even after the relationship ends
      contractual in relation to the information and personal data to which they have

      access.

      o (…)


      o *** COMPANY.2 guarantees that, in relation to the execution of the Agreement, no processing of personal data will take place outside the European Union or in a country that does not have an adequate level of protection.


      The previous agreement also contains an annex dedicated to security measures in relation to: (…)


      - *** COMPANY.3, as a provider of private security and maintenance of the facial recognition systems. Reference is made to the profile of Production Manager, working exclusively for Mercadona, as stated, as the person in charge of directing and coordinating the technicians dedicated exclusively to the service at Mercadona.


      Attached as document number 10 of brief 026459/2020 is the Agreement on confidentiality and the processing of personal data on behalf of third parties, signed on December 29, 2011 between Mercadona and *** COMPANY.4.

      According to the BORME publication (reference number 1), *** COMPANY.4 was absorbed (…) by *** COMPANY.6. Subsequently, on *** DATE.2, the entry of *** COMPANY.5 as sole partner of *** COMPANY.6 was published in the BORME (reference number 2). Likewise, the relationship of coincidence of corporate body and domicile between *** COMPANY.5 and *** COMPANY.3 is recorded (reference number 3).


      The object of the agreement is to regulate the treatment to be given to all the confidential information and personal data to which access is had in the context of the services provided. Given the date of signature, the document refers to the personal data protection regulations formed by Organic Law 15/1999 and its implementing regulations. The following contents are highlighted:


      o (…)


      o The data processor is obliged to:

      (…)


      “Adopt all the technical and organizational measures required by the
      data protection regulations that are necessary to guarantee the
      security and confidentiality of personal data, avoiding the

      unauthorized alteration, loss, treatment, access or assignment. "

      "Once the provision of services is completed, personal data must be

      destroyed or returned to the issuing party (at the latter's choice), the same as
      any support or document containing any personal data
      object of treatment. "


      "All the personal data provided is confidential, and
      under no circumstances may they be disclosed. "


      "The Treatment Manager must communicate and enforce their
      employees the obligations established in this Agreement and, specifically,
      those related to the duty of secrecy and security measures. "


      On the other hand, in relation to the recipients of the information, Mercadona clarifies in its brief that the only data communications made are those derived from reporting breaches of restraining orders to the FCSE and to the courts and tribunals competent in the proceedings. Likewise, document number 29 of brief 026463/2020, which includes the definition of the processing activity related to the management of the SDA, indicates that these disclosures would be made within the framework of a "legal obligation" of the controller.


      Finally, Mercadona points out that, within the framework of this project, no international transfers of personal data are carried out.




      3. Contribution of the image to the judicial procedure and inclusion in the SDA


      In relation to the "undoubted" images against which the
      comparison, Mercadona points out that “it has taken into account that, without
      sharp, reliable images that meet certain technical requirements

      explained later, it would not be possible to carry out the intended activity ”.
      It indicates that, “for this reason, prior to the implementation of the System,

      carried out numerous tests (…) verifying that the system works correctly
      correct ”. He adds that, “all this, tending to avoid errors in the systems
      biometric that, where appropriate, could lead to serious consequences for the

      person and, in particular, the erroneous refusal to authorized persons and the
      erroneous acceptance of unauthorized persons that could lead to
      serious problems at many different levels, as the

      Agency in its Report 010308/2019 ”(reference number 4).

      On this point, Mercadona attaches a document (document number 3 of brief 026457/2020) that details “the technical requirements for the images of the system”. From this document, written in English and entitled “Face Enrollment Best Practices”, the following content is highlighted:


      - (…)

      As to the source from which these images would be obtained, Mercadona firstly states that “regarding the final convictions that are the result of criminal proceedings in which MERCADONA is a party to the procedure, the images are obtained from the video surveillance cameras that it has in its facilities and that were provided in the procedure as evidence, having been validly obtained and admitted by the competent Court or Tribunal”.




      Specifically, Mercadona indicates that when there is a complaint about facts related to Mercadona's facilities, assets or workers, the lawyers responsible for the stores request, via email to the CAS, the images of the facts and of the author or authors. Then, "(…)". Mercadona points out that the people in charge of locating and extracting the images “have the classification of 'Manager' for viewing images, a position that requires specific training in security and video surveillance, as well as specific training on the operation of this system”.

      Next, it states that the images “(…)”.

      Mercadona indicates at this point that it has a registry, which it calls "DAM images request", which consists of "an internal working list, proper and exclusive to the CAS, with the following fields:

      - Zone: number of the shopping area to which the center belongs


      - Center: store number.

      - Denomination: name of the center.

      - Population: municipality where it is located.


      - Province: in which it is located.

      - Image request date: date on which the lawyer requests the images from the CAS.

      - Observations: annotations to be recorded.


      - Delivered: to FCSE, Court or blank if it has not been done.

      - Trial date.


      - Sentence: prohibition of access, restraining order or blank if not
      has been dictated.

      - Settlement of sentence: upon receipt, it is filled in with a yes; otherwise, pending is indicated.

      - Start date: the settlement of the sentence indicates from which day the condemned person cannot enter.


      - End date: end date of the sentence of prohibition of access or
      Restraining order.


      - *** FILE. 1: unique identifier that matches that of the
      Court procedure.


      - Identification date: day and time at which the person condemned not to enter that store has been identified with 100% certainty.

      - Identification store: center where the person has been identified with 100% certainty.

      - CAS Managers: names of the Viewing Managers present at the confirmation of the identification of that person."

      As it explains, at this point the request is registered in the list and the different fields are completed as appropriate throughout the different phases.

      Mercadona attached (as document number 2 of brief 026457/2020) the “DAM images request” document.
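
      Purely for orientation, the fields of the “DAM images request” list described above can be pictured as a single record; the field names below are free translations of the list and the types are assumptions, not taken from the file:

<pre>
from dataclasses import dataclass
from datetime import date, datetime
from typing import Optional

@dataclass
class DamImagesRequest:
    """Illustrative record mirroring the fields described for the 'DAM images request' list."""
    zone: int                        # number of the shopping area the centre belongs to
    centre: int                      # store number
    denomination: str                # name of the centre
    municipality: str
    province: str
    image_request_date: date         # date on which the lawyer requests the images from the CAS
    observations: str = ""
    delivered_to: Optional[str] = None         # "FCSE", "Court", or None if not yet delivered
    trial_date: Optional[date] = None
    sentence: Optional[str] = None             # "prohibition of access", "restraining order", or None
    sentence_settled: bool = False             # pending until the settlement of the sentence arrives
    start_date: Optional[date] = None          # first day the convicted person may not enter
    end_date: Optional[date] = None            # end of the prohibition / restraining order
    file_id: Optional[str] = None              # unique identifier matching the court procedure
    identification_datetime: Optional[datetime] = None  # when the person was identified with certainty
    identification_store: Optional[int] = None
    cas_managers: tuple[str, ...] = ()         # viewing managers present at the confirmation
</pre>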


      It adds that “in the event that the judicial resolution determines a restraining order, the images contributed to the procedure would become the undoubted biometric sample and, consequently, would be transformed into a template”. Regarding the territorial scope, it states that “it will be defined by the final judicial decision, which may be limited to one store, several, or the territory determined by the relevant Court”.


      Secondly, “in relation to those convictions in which MERCADONA is not a party to the procedure (in the case of restraining orders for crimes committed against MERCADONA employees - alleged gender violence, for example -) and the Courts and Tribunals directly request MERCADONA's collaboration, in relation to the scope of the removal from the victim's workplace, in order to enforce the restraining orders, it will be the Courts and Tribunals themselves who will communicate to MERCADONA, through the relevant judicial resolution, the need for its collaboration to guarantee said effectiveness, as well as the terms of said measure, in relation to aspects such as its duration and the stores to which it would be applicable”. As it states, “in these cases, these images will have been provided in the procedure from which the judicial resolution derives, and the justification for their use will be determined by the requirement of the use of technological means for the specific restraining order”. And it adds that in these cases it would need “the Courts and Tribunals directly, or through the FCSE, to provide it with valid images that meet the requirements set forth, which the facial recognition system needs in order to establish a prior undoubted sample”.

      It also sets out the case in which “the requirement comes directly from the FCSE or *** ORGANISM.1, based on an ongoing investigation or on matters related to *** SUBJECT.1”. In this regard, it states that “in order to use the analyzed system, it must likewise be provided with the guarantees set forth (specifically, where applicable, those established by the data protection regulations), namely: a court order based on Law, the photograph on which the biometric pattern is based, the temporal delimitation of the measure, and the stores to which it would be applicable”.

      Regarding the inclusion of the images in the SDA, Mercadona points out that, “once MERCADONA has a firm judicial resolution that determines the imposition of a restraining order or similar judicial measure with respect to one or more MERCADONA stores, the lawyer responsible for the file sends an email to the CAS” indicating the judgment number, the centers it affects and the period of validity, together with the “pdf document with the settlement of the sentence / precautionary measure”. Thus, Mercadona details that “the image is incorporated into the system with the territorial limitation of the area or stores determined in the court resolution, indicating the temporal limitation or expiration of the restraining order, which is determined in the judicial resolution”.

      According to Mercadona, this process involves completing the corresponding information in the "DAM images request" registry. After that, the Security Department, in order to make a new registration in the system, uses a "Tab" with the following information:


      - Number of judicial procedure.

      - Description, including the telephone numbers of the FCSE to call and of the surveillance service if available at the center, the start and end dates of detection, and a brief description of the judicial measure.


      Document number 4 of document 026457/2020 contains the list of telephones
      associated with the different Mercadona centers.

      -       Group: (…).


      In the event that the judicial resolution is an acquittal or the precautionary measure is denied, Mercadona points out that “the lawyer responsible for the case would send an e-mail to the CAS for the elimination of the blocked images”. This would cause the deletion of the images and the updating of the “DAM images request” list.




      4. SDA activation, detection and alert


      As described by Mercadona, (…).


      To follow up on the end dates of the judicial measure, the application "*** APPLICATION.1" is used. (…). It adds that access to the system requires an individual username and password, which are provided by the IT Department.

      Once the system is activated, “through the facial recognition cameras, the images captured will be checked in real time against the undoubted image(s) that have been included. This checking process takes tenths of a second (0.3 seconds today) between the moment an image is captured and the verification against the undoubted image included in the System”.

      In relation to the cameras installed in each center, the following information contained in the brief is highlighted:

      - (…)

      - Mercadona “has proceeded and will proceed to comply with the duty of information (…) in those centers in which such cameras are installed, even if they are not activated, in order to meet the expectation of privacy of customers and employees”.

      In relation to the capture of the image by the camera, Mercadona provides the following documents:

      - Document number 5 of brief 026457/2020, (…), classified as confidential. This document, written by “*** EMPRESA.2” in English and titled “*** TITLE.1”, presents the results obtained after analyzing the potential gender and skin-color bias of the facial recognition system “*** APPLICATION.2” of *** COMPANY.2. The document concludes that the system is not biased on the basis of these attributes.

      - Document number 6 of brief 026459/2020, “description of the system used by *** APPLICATION.2 in the extraction of the biometric pattern and its comparison, in relation to the anonymization process used”. The document, drawn up in English by *** EMPRESA.2, is titled “*** TITULO.2”. Some of the characteristics of the system described in the document are:

      o (…)


      - Document number 7 of brief 026459/2020, “*** DOCUMENT.1”. The document, written in English by *** EMPRESA.2, is titled "*** TITULO.3" and includes an explanation of the facial recognition process, which follows these phases: detection, feature extraction, adjustment, and recognition. It defines the result as the distance between the analyzed pattern and the enrolled comparison pattern, and adds that the probability of a greater distance between different subjects increases as the quality of the images improves.

      Likewise, Mercadona describes in its brief (pages 24-33) the evaluation it has carried out in order to assess the effectiveness of the detection system. As it explains, the tests were carried out with a detection threshold of X,XX, since this would be the one recommended by the manufacturer *** COMPANY.2 to optimize the relationship between detections and false positives. Thus, it states that "a person detected with score X,XX means that they have a similarity of at least YY% to the system reference image."

      It also adds that the tests were carried out using the solution "*** APPLICATION.2 version 2.2 of the manufacturer *** COMPANY.2" on different types of cameras, configurations, reference images (…) and scenarios (…), which allowed it to select the combination offering the best results. As stated in the brief, no false positives occurred in the tests carried out.
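
      As a hedged illustration of the trade-off described above between detections and false positives when fixing the detection threshold (the actual threshold and similarity figures are redacted in the file), test results could be tallied as in the following sketch; the data structures and values are invented for the example:

<pre>
from dataclasses import dataclass

@dataclass
class TestComparison:
    score: float          # similarity score returned by the recognition engine
    same_person: bool     # ground truth for the test pair

def evaluate(results: list[TestComparison], threshold: float) -> dict[str, int]:
    """Count true detections, false positives and missed detections for a given threshold."""
    detections = sum(1 for r in results if r.score >= threshold and r.same_person)
    false_positives = sum(1 for r in results if r.score >= threshold and not r.same_person)
    missed = sum(1 for r in results if r.score < threshold and r.same_person)
    return {"detections": detections, "false_positives": false_positives, "missed": missed}

# Toy example: a lower threshold would admit the 0.78 non-match as a false positive,
# while a higher one would start missing genuine matches.
sample = [TestComparison(0.92, True), TestComparison(0.81, True),
          TestComparison(0.78, False), TestComparison(0.55, False)]
print(evaluate(sample, threshold=0.80))   # {'detections': 2, 'false_positives': 0, 'missed': 0}
</pre>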

      In addition, in relation to the process of detecting a person wearing a mask, it points out that:

              “The provider of the IT solution has developed an improvement aimed at identifying people whose face is semi-hidden by these masks, as can be seen in the images provided throughout the brief.

             In this sense, it is important to point out that facial recognition bases identification on collecting information from the periocular area of the face (…).

             The system loses information since part of this area is hidden, for which reason it has optimized the reading of the visible part without lowering the identification threshold."


      Having made these clarifications regarding the effectiveness tests of the system, it describes the process of generating the alert:

             “Once the Early Detection System is activated in the store(s) covered by the final judgment, and in the event that any of the facial recognition cameras installed in the stores detects the access of a person whose image is included in the *** APPLICATION.2 system, an alert would be generated that would initiate the process of confirmation and notification to the FCSE.

             This alert, which detects the match in the store's cameras, is sent by email to a specific address prepared for this purpose […]




               Only the following profiles have access to this email account:


       - The Project Manager.

       - CAS Coordinator.


       - Managers, shift managers at CAS.

       - Image viewing managers.


               […] If someone else needed to access this account, they would have to expressly request this new access from the person in charge of the Project.

               This alarm email, indicating the coincidence of the images in a specific store, is generated by the equipment of each of the stores."


       Mercadona provides in its brief (page 21) an example of the email sent. According to its indications, the following information is sent in the email (an illustrative sketch of this payload follows the list below):


       -       "Qualification: (…)

       -       Name: (…)


       -       Group: (…)

       -       Center: (…)


       -       Camera: (…)

       - Date and time of detection.


       - Coincidence: (…)

       -       Description: (…)


       -       Reference image: (…)

       - Detection image: (…)
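
       Purely for orientation, since most of the field values above are redacted, the alert email could be pictured as a record like the following; the names are translations of the listed fields and the types are assumptions:

<pre>
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SdaAlertMail:
    """Illustrative representation of the alert email generated on a camera match."""
    qualification: str        # "Qualification: (…)"
    name: str                 # "Name: (…)"
    group: str                # "Group: (…)"
    centre: str               # store where the match occurred
    camera: str               # camera that produced the detection
    detected_at: datetime     # date and time of detection
    coincidence: str          # reported degree of coincidence
    description: str
    reference_image: bytes    # judicially provided ("undoubted") image
    detection_image: bytes    # image captured at the store
</pre>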




       5. Reception and validation of the alert, and communication to the FCSE

       As Mercadona describes, the process involves “a double verification factor for positives, to avoid the risks derived from exclusively automated processing”. Thus, it emphasizes that “once the alert is received, it will be verified by the viewing managers of the CAS present at that time, being confirmed (only in the case that all the viewing managers confirm that it is the same person) or not confirmed (if any of them have doubts at the time of confirming that it is the same person). In the event that it is not confirmed, the image will be destroyed, the technical reasons for the alarm will be studied, and the process will be finished”. As it points out, “the viewing managers of the CAS have sufficient experience and training to carry out this verification”.

       Mercadona points out in its brief that “this verification by the persons responsible in the Security Department is fully mandatory in the process”. Thus, it understands that, “due to the subsequent verification process, in no case would there be processing through an automated decision”.

       To make this statement, it relies on the “Guidelines of the Article 29 Working Party on automated decision-making, published on October 3, 2017” (reference number 5).
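
       A minimal sketch of the "double verification factor" described above, in which an alert is confirmed only if every viewing manager present agrees and is otherwise discarded; the function name and inputs are assumptions for illustration only:

<pre>
def confirm_alert(manager_votes: dict[str, bool]) -> bool:
    """Return True only if every viewing manager present confirms it is the same person.

    Any single doubt (False) means the alert is not confirmed; according to the brief,
    the captured image is then destroyed and the technical causes of the alarm studied.
    """
    return bool(manager_votes) and all(manager_votes.values())

# Example: one doubting manager is enough to discard the alert.
print(confirm_alert({"manager_a": True, "manager_b": True}))   # True  -> proceed to notify the FCSE
print(confirm_alert({"manager_a": True, "manager_b": False}))  # False -> image destroyed
</pre>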


      After confirming the alarm, as described by Mercadona, a viewing Manager will take care of:

      (…)

             “Once this process is closed, the image that was the object of detection will be extracted, to avoid unnecessary processing of it beyond its contribution to the competent authorities."




      6. Terms of conservation of personal data

      Mercadona states in its letter that "(...)"


      Next, Mercadona differentiates between two scenarios. Thus, first of all, it describes the behavior of the system during the detection phase in relation to people whose image does not match any of the images stored in the system:

             "All the necessary technical and organizational measures have been adopted in order to minimize any potential data processing and limit it to mere residual technical storage (strictly necessary for the proper functioning of the system)."

             “The facial recognition system will detect (automatically and during a non-appreciable period) and will individually analyze the images it receives from each center. (…)



             Regarding the scenario of detection of a positive (a match with an image from the database), Mercadona states the following:

             (…)

      All of the above is set out, in summary, in the privacy impact assessment (document 30 of brief 026463/2020). Thus, it states that:

             "The data will be kept:

             (…)

      Finally, it is stated that, as observed in the record of processing activities (extract attached as document number 29 of brief 026463/2020), SDA management and video surveillance are independent processing activities. In the case of the processing of personal data relating to the video surveillance activity, the stated conservation period is thirty days.




      7. System architecture, impact assessment, and security measures


      Document number 29 of brief 026463/2020 includes the risk analysis related to the management of the SDA. It assigns this processing activity a medium inherent risk and a low residual risk after the implementation of mitigating measures. Among other issues, the analysis indicates that the activity involves: "(...)". This leads it to determine the need to carry out a “PIA”.

      Document number 30 of brief 026463/2020 corresponds to the project's privacy impact assessment. This includes the evaluation of the risk inherent to the processing through the analysis of *** NUM. 6 threats. The result obtained is that the level of risk is "tolerable". The content relating to the following threats is highlighted:


       (…)

      Likewise, it is noted in the impact assessment that “the Project has been examined, once operational, to verify that the risks detected have been successfully addressed and that no new ones have appeared”.


      The privacy impact assessment also includes the following content in its fifth section, dedicated to the conclusions:




             "(...)"

      On the other hand, Mercadona describes in its brief (pp. 35-49) the architecture of the SDA and the security measures implemented. As stated, the elements that make up the architecture are:


      - Store equipment.

      (…)


      - Store cameras.

      (…)


      - CAS teams.

      (…)


      - System *** APPLICATION.2 version 2.2.0. of *** COMPANY. 2.

      (…)


      - About the stores:

      o (…)

      - About the CAS:

      o (…) from Mercadona's Security Division or have an authorization from it.

      - About the facial recognition program:

      (…)

      - On Mercadona's own systems on which the SDA relies:

      o (…)





      8. Purpose, legality, and proportionality

      Mercadona points out that “it can be concluded that the purpose pursued with the installation of the Early Detection System is to comply with the judicial decisions in which the defendant has been sentenced with a restraining order, as a result of events related to MERCADONA's facilities, goods or workers, in certain special circumstances and provided that a final court decision so establishes”.

      With regard to the basis of legitimation, Mercadona states that "the data processing carried out by MERCADONA in order to preserve the safety of people and property, as well as its facilities, finds its place in the public interest." Thus, Mercadona also cites the following content of AEPD Report 010308/2019 (reference number 4):

             "In the present case, we have already cited how article 22 of the LOPDGDD regulates processing for video surveillance purposes, whose legitimacy is found, as indicated by the Council of State in its Opinion and as included in the Law's Statement of Motives, in the existence of a public interest purpose falling within article 6.1.e) of the General Regulation, since its purpose is 'preserving the safety of people and property, as well as their facilities'."


      To this end, Mercadona states that “the processing carried out to preserve the safety of people and property, as well as their facilities (the one mentioned by the AEPD in the aforementioned Report as an example of processing covered by the public interest) is the main purpose of the data processing carried out by MERCADONA”.


      On the other hand, Mercadona brings up that “article 8 of Organic Law 3/2018 […] includes the following: 'The processing of personal data may only be considered to be founded on the fulfillment of a mission carried out in the public interest or in the exercise of public powers conferred on the controller, in the terms provided in article 6.1 e) of Regulation (EU) 2016/679, when it derives from a competence attributed by a norm with the force of law'. Based on the above, it is in the interest of this party to mention that the norm with the force of law that enables MERCADONA to adopt mechanisms that detect and mitigate the commission of fraudulent conduct, with regard to the processing carried out to preserve the safety of people and property, as well as their facilities, is Law 5/2014, of April 4, on Private Security (see, for example, article 4 on the purposes of the rule or article 8 on its guiding principles)."

      Included in the file, as reference number 6, is an extract from the aforementioned Law 5/2014 which contains the wording of articles 4 and 8.

      On the other hand, Mercadona states that “there is no doubt that the data processing carried out by a facial recognition system would fall within the category of special data”. On this, it states that it “will only use the System in the event that it is a party to a judicial procedure in which a final resolution determines the use of facial recognition to enforce restraining orders. Therefore, my client considers that the analyzed processing finds its place in article 9.2.f), by virtue of which sensitive data may be processed when 'the processing is necessary for the formulation, exercise or defense of claims'”. In relation to the above, it adds the following:

             "(...)".

             This argument is defended by the Courts and Tribunals when they position themselves in favor of the option defended by MERCADONA, authorizing that said sentence be monitored through electronic means, by way of facial recognition, by virtue of the provisions of article 48.4 of the CP."

      An extract from Organic Law 10/1995, of November 23, of the Penal Code, containing the wording of article 48, has been incorporated into the file (reference number 7).

      In line with the above, it also adds that:

             “It is worth mentioning article 1 of the CC, which states the following:

             "1. The sources of the Spanish legal system are the law, custom and the general principles of law.

             2. Provisions that contradict another of higher rank shall be invalid.

             (…)

             6. Jurisprudence will complement the legal system with the doctrine repeatedly established by the Supreme Court when interpreting and applying the law, custom and the general principles of law.

             7. Judges and Courts have the inexcusable duty to resolve in any case the matters brought before them, adhering to the established system of sources."

             Therefore, it could be concluded that, since Judges and Courts have the inexcusable duty to resolve in any case the matters brought before them, taking into account the established system of sources, the fact that a judge has considered it appropriate to use a facial recognition system to ensure compliance with restraining orders in MERCADONA's facilities would have enough weight to legitimize the processing.



              Moreover, it is worth mentioning Article 24 of the EC, which is raised to the category of a fundamental right and regulates the right of defense, within which is included the right to effective judicial protection, according to which all people have the right of access to jurisdiction, that is, they must have the possibility of going to the jurisdictional bodies and formulating petitions for protection before them. Likewise, the right to effective judicial protection also includes the right for the courts to rule on the claim made and thus issue a resolution on the merits of the matter, reasoned and founded in Law.

              In addition, the Constitutional Court has understood that within the right to effective judicial protection is found, as a necessary manifestation, the right of litigants to have the judgments issued by the ordinary courts for the protection of their legitimate rights and interests enforced. This right to compulsory execution is thus linked to the jurisdictional power that the EC recognizes to the courts in its article 117.

              […] And, furthermore, all legal subjects (of a public or private nature) have the obligation to comply with final judicial decisions and must collaborate with the courts and tribunals in the execution of the resolution, as provided in Article 118 of the EC.


              In any case, the beneficiary of a judicial resolution has an authentic subjective right, which has the character of a fundamental right, as it connects directly with the right to effective judicial protection of article 24.1 of the EC, and is qualifiable as a subjective public right, since it is required of the jurisdictional bodies of the State."

      An extract from the Royal Decree of July 24, 1889 publishing the Civil Code, containing the wording of articles 1 and 3, has been incorporated into the file (reference number 8). Likewise, an extract from the Spanish Constitution including articles 24, 117 and 118 has also been included (reference number 9).

      Regarding the legality of the processing, Mercadona concludes that it “complies with the provisions of the AEPD in its Reports 36/2020 and 010308/2019, based on the fact that 'the existence of a public interest does not legitimize any type of personal data processing, but the latter must be subject, first, to the conditions that the legislator may have established, as provided for in article 6 of the RGPD, in its sections 2 and 3 […]. And in case any of the personal data included in the special categories of data referred to in article 9.1 RGPD are to be processed, one of the circumstances contemplated in its section 2 must concur that lifts the prohibition on the processing of said data, established as a general rule in its section 1', insofar as the processing would be legitimized by article 6.1.e) RGPD based on the public interest derived from the need to preserve the safety of customers, staff and facilities, and by article 9.2.f) in order to respond to the proceedings to which it is a party and in which the use of said technology has been determined as a measure to recognize the subjects who are the object of a restraining order."

      Report 36/2020, issued by the legal office of the AEPD, has been incorporated into the file (reference number 10).

      On the other hand, Mercadona states that the purpose of the system involves the processing of data related to convictions and criminal offenses. It explains, however, that this type of data was already processed prior to the implementation of the system, since it is common practice in the sector to identify those people who may pose a risk, in order to ensure the safety of workers and customers. Consequently, it states that “the system studied in this brief comes to carry out this same processing, not constituting a different activity in relation to the processing of personal data related to criminal sanctions or convictions”.

      To support the legitimacy of the processing of this type of data, Mercadona states in its brief (referencing articles ten of the RGPD and of the LOPDGDD) that “it concerns data related to convictions and offenses under the supervision of public authorities, since the processing carried out by MERCADONA is fully legitimized, because it is only carried out with the support of the Administration of Justice or the FCSE. […] the processing will be carried out only on those judicial decisions to which MERCADONA is a party, so no database of criminal convictions would be generated, the use of biometric data being a specialization within the existing and necessary processing, as MERCADONA is a party to the procedure or has been required by the Courts and Tribunals themselves”.


      In relation to the suitability, necessity and proportionality of the implementation of the system, Mercadona states that:

      - “the fulfillment of a restraining order in a store can only be effectively guaranteed through electronic means, since MERCADONA has 1,636 stores and approximately 95,000 workers in Spanish territory and, each year, the Company has approximately *** NUM. 2 judicial proceedings that can end in more than *** NUM. 3 judicial decisions in its favor in which the defendant is firmly convicted with restraining orders on the MERCADONA facilities”.



      - “A large part of these judicial decisions are against people who act within organized gangs or are particularly dangerous for managers and workers, with respect to whom it is unfeasible to comply with judicial decisions and to enforce sentences without the use of technological mechanisms, since the convicted go to MERCADONA stores with a very different physical appearance (costumes, wigs, etc.), which makes it difficult for security staff to visually recognize those people who have an access prohibition, even more so taking into account that approximately *** NUM. 1 people enter a MERCADONA store each day”.


      - “although the end pursued could be achieved by other means (through security guards who control access to the stores, for example), these do not guarantee the reliability of technological solutions based on biometrics, which allow the goal pursued by MERCADONA to be achieved with greater guarantees and reliability and, therefore, greater legal certainty”.


      - “the requirement that the data processing be 'strictly' necessary is likewise justified insofar as the measure of immediate intervention is necessary in cases of flagrante delicto, such as the breach of a penalty that precisely tries to prevent recidivism and, above all, to protect the safety of MERCADONA's clients and workers”.


      On this point Mercadona adds that "this argument is reinforced by the British data protection authority, the Information Commissioner’s Office, in the document “The use of live facial recognition technology by law enforcement in public places” of 31 October 2019, stating that “the purpose for which the facial recognition system is deployed is of great importance since there is a considerable difference between the use of facial recognition to mitigate certain serious or violent crimes and widespread deployments of facial recognition technology to identify known thieves”.”


      The document entitled "The use of live facial recognition technology by law enforcement in public places", published by the ICO (Information Commissioner's Office), has been incorporated into the file (reference number 11).

      - "the treatment in question only generates benefits and advantages for the
      general interest, as well as for the clients and employees of MERCADONA,

      as for the Courts and Tribunals themselves, since it is the only way
      effective to make effective the measures decreed by them and; for the

      FCSE, by guaranteeing the System a collaboration with them, facilitating
      the performance of their duties ”.


      It concludes that the system “meets the proportionality requirements and is strictly necessary to fulfill the intended purpose, since there are no means less intrusive for user privacy that would allow the pursued objective to be achieved, as it is technically impossible to effectively control the entry of persons convicted with a prohibition of access to the facilities without the use of a technological mechanism”. Thus, it states that “opting for an alternative mechanism would imply, without a doubt, an alteration of the purpose of the processing pursued”.


      In this way, it adds that, “due to MERCADONA's interest in the implementation of the facial recognition system, since March 2019, in the judicial proceedings to which it has been a party, it has requested from the Administration of Justice the establishment of measures against the reported persons in relation to access to MERCADONA establishments in a certain territorial area, according to the facts denounced, for a specified period of time, making the control of said measure effective through electronic means by way of facial recognition”, obtaining as a result that “each and every one of the Courts to which the request has been made have considered the facial recognition system an adequate means to ensure compliance with restraining orders (…) by virtue of the provisions of article 48.4 of the Penal Code”.




      9. Compliance with the duty of information

      In its letter, Mercadona lists the following mechanisms used to comply

      with the duty of information:

      - Informational posters about the facial recognition system placed visibly at the entrances to each of the stores.

      Attached, as document number 18 of brief 026461/2020 and document 18 of brief 026463/2020, is a copy of the signage that has been installed at “the accesses to the sales room” in which the SDA has been implemented. The poster includes, under the title "EARLY DETECTION AREA", information on the controller, the operation of the system, the recipient of the information (FCSE), the legal basis of the processing, and the possibility of exercising data protection rights and of filing a claim with the AEPD. In addition, various ways are provided to consult additional information on the processing (shop interior, telephone, website).

      In this regard, it also states that “the informational signs have a sufficient size so that any user can read their content and they are located in a sufficiently visible place, at the entrance of the store, taking into account that the duty of information must precede the processing of the data, so that this party strictly respects the principle of transparency and the duty of information itself."

      - The Privacy Policy of the Mercadona website.

      Attached, as document number 19 of brief 026461/2020, is a copy of Mercadona's privacy policy published on the internet, whose last update, according to what is stated in the document itself, occurred on July 1, 2020.

      The section on categories of processed data mentions “biometric data (in those stores in Spain where [sic] the early detection system)”.

      In the section corresponding to the purposes, it cites: “carry out the precise actions to protect the vital interests of customers when necessary, or compliance with court decisions and the measures agreed in them."


      In the section dedicated to conservation periods, it states the following:

      “In relation to the protection of the vital interest of people and the execution of the judgments or resolutions that entail restraining orders on the work centers and/or people, the data will be processed and kept for the time essential to comply with the judicial measures [sic] of those people sentenced to said restraining order (in those stores in Spain where the early detection system is implemented).

      However, the data collected incidentally in order to comply with said purpose will remain on the server only during the verification process (this verification takes tenths of a second). Once this verification is performed, it will be definitively destroyed (in those stores in Spain where the early detection system is implemented)."


      Regarding legitimation, the privacy policy states that “in the case of processing of sensitive data, it will be processed for reasons of public interest with the consequent considerations provided by the data protection regulations, which must be proportional to the objective pursued, which is to enforce the law, respecting the remaining principles of the data protection regulations and establishing the appropriate and specific measures to protect the interests and rights of the interested parties, on the basis of the law of the Union or of the Member States (in those stores in Spain where the early detection system is implemented)."



      Likewise, the section entitled "Other data that we process at Mercadona" contains the following paragraphs:

      "In the same way we inform you that, in order to improve the security of customers and employees, MERCADONA, based on the public interest, can process their image or their biometric facial profile to identify subjects with a restraining order (or analogous judicial measure) in force against MERCADONA or against any of its workers (in those stores in Spain where the early detection system is implemented).

      This image will only be used for this purpose and will remain on the central server only during the verification process (this verification lasts tenths of a second). Once this check is done, it will be definitively destroyed (in those stores in Spain where the early detection system is implemented).

      These images will only be processed internally by MERCADONA, being communicated exclusively to the Security Forces and Bodies in order to protect the safety of MERCADONA clients and workers and to comply with the judicially decreed measures (in those stores of Spain where the early detection system is implemented)”.


      The privacy policy published on Mercadona's website, whose latest update, as stated in it, took place on October 5, 2020, is the one incorporated into the file as reference number 12.

      - The customer service telephone line.

      Attached, as document number 20 of brief 026461/2020, is a copy of the telephone script used in connection with the SDA, describing the operation of the system.


      - Information forms made available to interested parties in the stores, to be handed to them on request.

      Attached, as document number 21 of brief 026461/2020, is a copy of the form which describes the operation of the system, sets out the legal basis of the processing, informs of the possibility of exercising data protection rights and of filing claims with the AEPD, and refers to the privacy policy for the purpose of obtaining more information.


      Likewise, Mercadona attached (as document number 28 of brief 026464/2020) a copy of the email that, according to its statements, the "Security Manager" addresses to the "Store Managers". It reports on the documents to be printed and provided to customers and workers who request more information about the SDA.



      - Mercadona's communication plan.

      Attached, as document number 22 of brief 026462/2020, is an extract from the document "Early Detection Communication Plan" whose creation date, as contained therein, is June 1, 2020.


      In addition to the foregoing, Mercadona states in its brief that, prior to the launch of the pilot project, it addressed a press release (a copy is attached as document number 23 of brief 026462/2020) to the news agencies of the affected cities in order for it to be published in the media and thus make the project known to the residents of these areas. Likewise, it indicates that on July 3, 2020 it sent these same agencies "some FAQs about the project" (it provides a copy as document number 24 of brief 026462/2020). Among other issues, this list of questions and answers indicates that "in stores two systems coexist, independent of each other. On the one hand, conventional video surveillance, and on the other, early detection”. This point is also reflected in the record of processing activities (extract attached as document number 29 of brief 026463/2020), in which the management of the SDA and video surveillance are listed as separate processing activities.


      Likewise, Mercadona indicates that it has informed its workers about the processing carried out by the SDA through various actions. Thus, it provides, as document number 25 of brief 026462/2020, the text that, according to its statements, would be available through the “employee portal”. This text includes information about the controller, the purpose, the legal basis, and the possibility of exercising data protection rights as well as of filing a claim with the AEPD. Document number 26 of brief 026462/2020 corresponds to the information addressed to the "Inter-Center Committee". In this document, dated June 30, 2020, the start-up of the system in various stores on July 1, 2020 is reported. Finally, it states that the Communication Department produced a video “so that its workers understood the Project perfectly”. It provides (as document number 27 of brief 026463/2020) the script of the same.


      To conclude, Mercadona mentions that “since the System has been installed, MERCADONA has only received one request to exercise rights, which has been handled accordingly." It then states that “this fact allows us to conclude that the interested parties consider that the information that MERCADONA provides them through the aforementioned channels strictly complies with the provisions of the data protection regulations and that the purpose pursued by MERCADONA with the Project is proportional and adequate."


      On May 28, 2020, the AEPD published a press release entitled "The AEPD
      analyzes in a report the use of facial recognition systems by private
      security companies".

      This communication has also been incorporated into this file through the
      corresponding diligence.



      THIRD: On May 5, 2021, the Director of the Spanish Data Protection
      Agency agreed to initiate a sanctioning procedure against the
      respondent, in accordance with articles 63 and 64 of Law 39/2015, of
      October 1, on the Common Administrative Procedure of Public
      Administrations (hereinafter, LPACAP), for the alleged violation of
      Articles 5.1.c), 6, 9, 12, 13, 25 and 35 of the GDPR, typified in
      Article 83.5 of the GDPR, and to adopt the precautionary measure
      consisting of the suspension of all processing of personal data related
      to facial recognition in its establishments.

      FOURTH: Once the initiation agreement was notified, the respondent
      requested a copy of the file and an extension of the time limit to
      present allegations, which was granted within the legally established
      terms. Subsequently, the respondent submitted, within the time limit, a
      written brief of allegations in which it states, in summary, the
      following regarding substantive aspects:

      1. That its legal basis lies in the public interest (art. 6.1.e) of the
      GDPR) in ensuring compliance with court decisions.

      2. That the GDPR allows the use of biometric data provided that
      appropriate security measures are adopted, focusing not so much on the
      legal basis, which it takes for granted, as on the security measures as
      what matters. It adds that, with adequate security measures, the
      processing can be carried out even if it concerns special categories of
      personal data.

      3. That the processing now analyzed is the only measure capable of
      solving this problem, and that it is necessary, suitable, effective and
      proportionate.

      4. That the rights of the other individuals who enter the supermarket
      are not affected, since there is no data processing because the
      operation takes place in 0.3 seconds. Thus, it considers that only the
      identifiable biometric data of those convicted by final judicial
      decision would be processed, it being impossible for it to identify
      those people who are not in the database of undisputed data.

      5. That the processing now analyzed has previously been validated by
      various court rulings.

      6. That the AEPD has not carried out a detailed analysis of the system
      implemented, and has included innumerable references to "guides,
      articles and guidelines" that are not binding. Consequently, there is a
      breach of the principles of typicity and legality, violating the
      prohibition of arbitrariness of the public powers (art. 9.3 of the
      Spanish Constitution).

      7. That it has informed, in a diligent, sufficient and adequate manner,
      of the putting into operation of the System and its implications, as
      well as of the means to exercise the rights recognized to those
      affected.

      8. That the system now analyzed took into consideration, from the
      design stage, the potential impact on people's privacy.

      Regarding non-substantive or formal aspects, it makes the following
      allegations:

      A. That it was not notified of the two complaints (Facua and
      Apedanica), which is contrary to the usual practice of the AEPD.

      B. That a person's biometric pattern does not constitute personal data,
      so no legal basis is needed for its processing.

      C. That the system implemented does not collect information additional
      to the status of convicted person included in its database.

      D. That the proposal for a Regulation on artificial intelligence
      (COM(2021) 206, Annexes 1 to 9), published on 04/21/2021, shows that the
      system would be possible and in accordance with the measures proposed in
      said proposal.

      E. That there is no subjective element of culpability.

      F. That MERCADONA's main activity is not linked to data processing but
      to the management of a supermarket chain.

      G. That both the AEPD and MERCADONA have been adapting the System and
      adjusting it to the requirements of the Agency.

      Therefore, MERCADONA requests that the sanctioning proceedings be
      closed.


      FIFTH: The respondent did not request the taking of evidence, so the
      prior investigation actions are considered incorporated, as are the
      documents provided by the respondent and the inspection carried out by
      this AEPD. Nor was the "expert opinion on facial recognition" announced
      in the Second Otrosí of the brief of allegations ever provided.


                                    PROVEN FACTS




      FIRST: The processing of personal data implemented on 06/01/2020 and
      continued until 05/06/2021 by MERCADONA in forty of the company's
      establishments, relating to the facial recognition of the people who
      access its stores, constitutes a processing of special category data as
      regulated in art. 9 of the GDPR and art. 9 of the LOPDGDD.

      SECOND: In the processing of biometric personal data now analyzed
      (special category data), none of the circumstances set out in article
      9.2 of the GDPR is present, so that, in accordance with art. 9.1 of the
      GDPR, the processing is prohibited. The inapplicability of the
      exceptions of art. 9.2.f), g) and h) of the GDPR to lift the general
      prohibition set out in article 9.1 of said rule is accredited.

      THIRD: In addition, without prejudice to what is stated in the First and
      Second Proven Facts, the processing of biometric personal data now
      analyzed (special category data) has no legal basis as indicated in
      art. 6 of the GDPR, nor is there any legal rule that permits it as
      provided in art. 8 of the LOPDGDD.

      FOURTH: In the processing of biometric personal data now analyzed
      (special category data), without prejudice to what is stated in the
      First and Second Proven Facts, the information required by art. 13, in
      relation to the general obligation imposed by art. 12 of the GDPR and,
      in particular, the provisions of art. 12.1 regarding "children", has not
      been provided. Nor is compliance accredited with the requirements
      established in art. 7 of the LOPDGDD regarding minors.


      FIFTH: In the processing of biometric personal data now analyzed,
      without prejudice to what is stated in the First and Second Proven
      Facts, there is no evidence of compliance with the minimization
      principle set forth in art. 5.1.c), since the recognition system
      implemented by MERCADONA could very plausibly process data of various
      kinds beyond those strictly necessary, such as those indicated and
      classified as special category in art. 9.1 of the GDPR and art. 9 of the
      LOPDGDD.

      SIXTH: In the processing of biometric personal data now analyzed,
      without prejudice to what is stated in the First and Second Proven
      Facts, it is not accredited that safeguards were established from the
      design stage in order to guarantee the freedoms and rights of all those
      affected, as indicated in art. 25.1 of the GDPR.

      SEVENTH: In the processing of biometric personal data now analyzed,
      without prejudice to what is stated in the First and Second Proven
      Facts, neither a correct risk analysis nor the mandatory impact
      assessment was carried out, since neither of them contemplates all the
      affected subjects (FD V), such as workers and minors.

      EIGHTH: Since this is, therefore, a prohibited processing, said
      prohibition cannot be bypassed by applying proactive security measures,
      given that the prohibition of the processing renders them irrelevant.

      NINTH: In accordance with the First, Second and Eighth Proven Facts,
      the precautionary measure imposed in the initiation agreement is
      upheld.

                               FOUNDATIONS OF LAW

                                                I

      By virtue of the powers that article 58.2 of the GDPR grants to each
      supervisory authority, and as established in articles 47 and 48 of the
      LOPDGDD, the Director of the Spanish Data Protection Agency is competent
      to initiate and resolve this procedure.

                                                II

      In relation to the brief of allegations against the initiation agreement
      presented by the company, the following should be noted, in summary:

      Regarding the substantive allegations included in the FOURTH antecedent
      and numbered 1 to 8, it should be noted that all of them have already
      been rebutted, with reasons given, through the detailed analysis
      resulting from the exhaustive preliminary investigation carried out by
      this Agency, in the Foundations of Law (FD) of the initiation agreement
      of the present sanctioning procedure and in those set out in the present
      Proposed Resolution. However, they are now answered succinctly, without
      prejudice to their further development in subsequent Foundations of Law:

      In answer to the allegations presented by MERCADONA, the following is
      noted:

      Regarding the legal basis: in its allegations in the present procedure,
      Mercadona does not put forward any of the exceptions contemplated in
      art. 9.2 of the GDPR that would legitimize the processing of the
      biometric data of the convicted persons; it limits itself to citing the
      lawfulness of the processing under the pretext that "the data protection
      of the subjects is not harmed at any time".

      The foregoing confirms what is indicated in the Initiation Agreement:
      Mercadona has no legal basis to carry out the processing of personal
      data consisting of facial recognition.

      Likewise, the allegations made by Mercadona corroborate the initial
      evidence appreciated by this Agency, namely that the company was
      pre-constituting the exception of art. 9.2 of the GDPR in order to be
      able to process the biometric data regulated in art. 9 of the GDPR.
      Once it obtained the judicial resolution that generically allows the
      implementation of the security measure, the supermarket chain
      unilaterally interprets the scope of the judicial decision and uses it
      to justify that it has a legal basis within the meaning of art. 9.2.f)
      of the GDPR not only for the convicted persons, but also for the rest of
      the citizens affected by the system when they access the supermarkets,
      whom the company includes under the name of "not convicted".

      The initiation agreement already stressed the lack of a legal basis to
      carry out the processing consisting of facial recognition: it was
      pointed out that, where none of the exceptions indicated in article 9.2
      of the GDPR applies, there is no legal basis to process anyone's
      biometric data, regardless of the grounds of lawfulness indicated in
      art. 6 of the GDPR, since art. 9.1 prohibits it. We did understand,
      however, that there was a legal basis for the processing of the
      convicted person's biometric data because, in the case examined and
      raised by Mercadona, it relied on the corresponding security measure
      adopted in a judicial resolution. The AEPD respects judicial resolutions
      and cannot oppose what is stated in them. However, the extensive and
      unilateral interpretation by Mercadona of the terms set out in the
      judicial resolution is contrary to the principles of necessity,
      proportionality and minimization indicated by the GDPR (arts. 5.1.c),
      25, 35.7.a) and recitals 4, 156 and 170, among others).


      At this point we must refer to the Order of the Provincial Court of
      Barcelona of 02/11/2021, Appeal No. 840/2020, Resolution No. 72/2021.
      The aforementioned Order examines the adoption of the security measure
      consisting of the facial recognition requested by Mercadona with respect
      to the convicted person. It concludes that the provisions of article 48
      of the Penal Code must be complemented with the consent of the convicted
      person so that such processing of facial recognition personal data can
      be carried out with a sufficient legal basis: "Although article 48 of
      the Penal Code establishes that 'the deprivation of the right to reside
      in certain places or to go to them prevents the sentenced person from
      residing in or going to the place where the crime was committed' and
      that 'the judge or court may agree that the control of these measures be
      carried out through those electronic means that allow it', this would
      occur provided that the fundamental rights of the convicted person are
      ensured, that is, as long as he has given his consent. We must remember
      that convicted persons enjoy all the fundamental rights recognized in
      the Constitution, with the exception of those that are expressly limited
      by the content of the conviction, the meaning of the penalty and prison
      law".

      In addition, the Order considers that the processing does not protect
      the public interest but rather the private or particular interests of
      the company.



      Need for the measure: it should also be noted that the company focuses
      on the utility of the measure because it is effective, confusing
      "utility" with the objective "necessity" of the measure. The measure
      implemented may be effective, but it is in no way necessary.

      From the foregoing, and from the following legal foundations, the entire
      legal support invoked by MERCADONA to carry out the data processing it
      seeks collapses, as that processing is prohibited under art. 9.1 of the
      GDPR and there is no exception that lifts the prohibition.

      Regarding the rest of the allegations presented by MERCADONA (lettered A
      to G), concerning non-substantive or formal aspects, the following
      should be noted:

      A) << Lack of notification of the two complaints (Facua and Apedanica),
      which is contrary to the usual practice of the AEPD. >>

      In this regard, it should be noted that the AEPD initiated preliminary
      investigations in order to verify the alleged infringements of the GDPR,
      as indicated in Title VIII of the LOPDGDD, and that a series of
      complaints arrived later, motivated by general procedural aspects rather
      than by individual claims of specific affected persons. It must be added
      that, following the Initiation Agreement, the respondent has had access
      to the entirety of the documentation in the administrative file.

      In view of the company's claims, it should be remembered that the
      transfer of a complaint is an optional, non-mandatory step derived from
      the filing of a complaint, and lies outside the sanctioning procedure.
      Furthermore, the respondent does not specify in what way its right of
      defense, which must be material and not merely formal, has been
      impaired.


      B) << A person's biometric pattern does not constitute personal data,
      so no legal basis is needed for its processing >>.

      The generation of the biometric pattern starts from the collection of
      physical characteristics of the subject (a photograph, which is in
      itself personal data as it is subsequently processed and, consequently,
      renders the subject identifiable) in such a way that it characterizes
      him unequivocally, so that, by the very definition of personal data, as
      the subject is identifiable, both the photograph and the biometric
      pattern constitute personal data and their processing is subject to the
      GDPR.

      The fact that Mercadona processes the image of any person who enters its
      establishments, captures it, obtains a pattern from it, compares it with
      that of the convicted person and then deletes it is a processing of
      personal data (facial recognition). The pattern thus obtained from the
      personal image constitutes, in itself, personal data. No two patterns
      are alike (Doc 6 of the brief with registry no. 026459/2020).

      For the sake of completeness, and in view of the allegations made by the
      company, we must remember that the image of a person is personal data,
      as the AEPD has continually reiterated; the image of a person's face,
      from which the biometric pattern is extracted, fully identifies that
      person without further action. Within the framework of data processing
      consisting of facial recognition, the fact that the company does not
      have the names of the people whose biometric data it processes, as it
      does for the convicted persons, does not mean that these are not
      personal data. Nor does the fact that the image of a person other than
      the convicted person is not stored beforehand, but is checked against a
      database through a pattern, mean that we are not dealing with a
      processing of personal data.


      C) << The system implemented does not collect information additional to
      the status of convicted person included in its database. >>

      In this regard, it should be noted that the information collected on the
      convicted persons, from the undisputed database that MERCADONA holds and
      processes, is contrasted with additional information from third parties
      in order to "match" the biometric characteristics of both and,
      subsequently, based on algorithms and quality criteria, an identity
      match is either accepted or rejected. In both cases, additional
      information is always collected, based on personal characteristics and
      data, which enriches the system and which lacks a legal basis for its
      processing.
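
      As a purely illustrative aid to the matching process described above (a
      minimal sketch, not a description of Mercadona's actual system; the
      pattern extractor, the similarity measure and the threshold value are
      all assumptions introduced here), the one-to-many comparison works
      roughly as follows: a pattern is derived from each captured image,
      compared against every enrolled template, and the best match is accepted
      or rejected against a previously established threshold.

      import numpy as np
      from typing import Dict, Optional

      def extract_pattern(face_image: np.ndarray) -> np.ndarray:
          # Toy stand-in for a trained face-recognition model: flatten and
          # normalise the image. A real system would use a learned embedding.
          flat = face_image.astype(float).ravel()
          return flat / (np.linalg.norm(flat) + 1e-9)

      def similarity(a: np.ndarray, b: np.ndarray) -> float:
          # Cosine similarity between two patterns (1.0 = identical direction).
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

      def screen_visitor(face_image: np.ndarray,
                         enrolled: Dict[str, np.ndarray],
                         threshold: float = 0.6) -> Optional[str]:
          # Derive the "dubious" pattern from the captured image, compare it
          # against every enrolled template and keep only the decision.
          probe = extract_pattern(face_image)
          best_id, best_score = None, -1.0
          for subject_id, template in enrolled.items():
              score = similarity(probe, template)
              if score > best_score:
                  best_id, best_score = subject_id, score
          del probe  # the probe pattern is discarded after the comparison
          return best_id if best_score >= threshold else None

      The point the resolution draws from this kind of pipeline is that a
      pattern has to be derived from every captured face before any comparison
      can take place, so the biometric data of every visitor is processed,
      however briefly.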

      D) << The proposal for a Regulation on artificial intelligence
      (COM(2021) 206, Annexes 1 to 9), published on 04/21/2021, shows that the
      system would be possible and in accordance with the measures proposed in
      said proposal >>.

      In the Initiation Agreement, mention was already made of the aspects now
      alleged with regard to the aforementioned draft regulation on artificial
      intelligence. In this sense, Article 5 of the aforementioned proposed
      Regulation states:

      "The following artificial intelligence practices are prohibited:

      (…)

      (a) the use of 'real-time' remote biometric identification systems in
      publicly accessible spaces for law enforcement purposes, unless and
      insofar as such use is strictly necessary for one of the following
      objectives:

      (i) the targeted search for potential victims of crime, including
      missing children;

      (ii) the prevention of a specific, substantial and imminent threat to
      the life or physical safety of natural persons or of a terrorist attack;

      (iii) the detection, localisation, identification or prosecution of a
      perpetrator or suspect of a criminal offence referred to in Article 2(2)
      of Council Framework Decision 2002/584/JHA, punishable in the Member
      State concerned by a custodial sentence or a detention order for a
      maximum period of at least three years, as determined by the law of that
      Member State."

      In the present case, there is no evidence that exceptions (i) to (iii) have been met.


      Furthermore, apart from the fact that the aforementioned regulation is
      still going through the legislative process, data protection law always
      requires a detailed analysis of the specific case in question in order
      to verify whether there is a legal basis for a specific processing of
      personal data, such an analysis always being far removed from any
      automatism.

      E) << There is no subjective element of culpability. >>

      Although it is not possible to impute an offense in the absence of the
      volitional element of liability (there is no strict liability), in the
      present case the company responsible was aware of the activity it was
      about to start, having hired specialized entities for its
      implementation. The fact of having performed a poor risk analysis,
      omitting not only all the affected subjects but also failing to assess
      as a risk the prohibition of the processing contemplated in article 9.1
      of the GDPR, already constitutes the volitional element of culpability.
      Had the risk of the planned processing been assessed, the outcome would
      have been that this is a prohibited processing and, consequently,
      inadmissible, which in that case would have led to the application of
      article 36 of the GDPR (prior consultation), which was at no time taken
      into account and which would have led to a pronouncement by this AEPD on
      the processing of personal data now analyzed.

      Furthermore, to the unacceptable deficiency committed in preparing the
      risk analysis prior to the processing must be added the equally
      deficient subsequent impact assessment, which did not cover all affected
      subjects; this also constitutes a serious deficiency, since it fails to
      determine the serious consequences for the rights and freedoms of the
      interested parties. All citizens who access a Mercadona store with
      facial recognition installed are treated as if they were convicted
      persons.

      The above establishes the presence of the volitional element of
      culpability required by art. 28 of Law 40/2015, of October 1, on the
      Legal Regime of the Public Sector (LRJSP).


      F) << The main activity of MERCADONA is not linked to
      data processing but to the management of a supermarket chain >>.









      Although MERCADONA's main activity is the management of supermarkets, it
      is also true that such management entails, as a daily and continuous
      parallel activity, the processing of personal data of both its online
      and in-store customers and of its workers, the latter numbering more
      than one hundred thousand.

      G) << Both the AEPD and MERCADONA have been adapting the System and
      adjusting it to the requirements of the Agency >>.

      This claim must be rejected since at no time has the AEPD taken any
      position on the establishment of the processing now analyzed and, as
      already mentioned, Mercadona has not used the regulatory mechanism
      established for this purpose in the GDPR (art. 36 GDPR).

      H) << The amount of the sanction is disproportionate >>.


      In this regard, the amount of the penalty is reasoned in the initiation
      agreement. It should be noted that the GDPR itself, in art. 83.1, states
      that: "1. Each supervisory authority shall ensure that the imposition of
      administrative fines pursuant to this Article in respect of
      infringements of this Regulation referred to in paragraphs 4, 5 and 6
      shall in each individual case be effective, proportionate and
      dissuasive."

      In the present case, effectiveness, proportionality and dissuasiveness
      are guaranteed. The amount of the administrative fine is set at levels
      much lower than the maximum allowed (as the case may be, 10 or 20
      million euros, or 2% or 4% of the total worldwide annual turnover of the
      preceding financial year, whichever is higher).
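
      Purely as an illustration of the cap rule just described (the turnover
      figure below is a hypothetical assumption, not Mercadona's, and this is
      not the AEPD's own calculation), the statutory ceiling of art. 83.4 and
      83.5 of the GDPR is simply the higher of the fixed amount and the
      percentage of total worldwide annual turnover:

      def art83_ceiling(fixed_cap_eur: float, pct: float,
                        annual_turnover_eur: float) -> float:
          # Statutory maximum: whichever is higher, the fixed amount or the
          # percentage of total worldwide annual turnover of the preceding
          # financial year.
          return max(fixed_cap_eur, pct * annual_turnover_eur)

      # Hypothetical example for an art. 83.5 infringement (EUR 20 million or
      # 4 %), with an assumed turnover of EUR 25 billion:
      ceiling = art83_ceiling(20_000_000, 0.04, 25_000_000_000)
      print(f"maximum fine: EUR {ceiling:,.0f}")  # EUR 1,000,000,000 here

      Any fine actually imposed must then sit below that ceiling and still be
      effective, proportionate and dissuasive in the individual case, which is
      the point the resolution makes here.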

      Consequently, the allegations must be rejected in their entirety.


                                               III

      In order to facilitate reading and comprehension from the beginning of
      the present Proposed Resolution, the doctrine of this AEPD regarding the
      processing now under analysis is set out below; it will be referred to,
      among other places, throughout the Proposed Resolution.

      Regulation (EU) 2016/679 of the European Parliament and of the Council
      of 27 April 2016 on the protection of natural persons with regard to the
      processing of personal data and on the free movement of such data, and
      repealing Directive 95/46/EC (General Data Protection Regulation, GDPR),
      defines biometric data in its article 4.14 as "personal data resulting
      from specific technical processing relating to the physical,
      physiological or behavioural characteristics of a natural person, which
      allow or confirm the unique identification of that natural person, such
      as facial images or dactyloscopic data".










      Article 9 of that regulation governs the processing of special
      categories of data, including biometric data, establishing a general
      prohibition of their processing in the following terms:

      "Processing of personal data revealing racial or ethnic origin,
      political opinions, religious or philosophical beliefs, or trade union
      membership, and the processing of genetic data, biometric data for the
      purpose of uniquely identifying a natural person, data concerning health
      or data concerning a natural person's sex life or sexual orientation
      shall be prohibited."

      In relation to the processing of facial recognition data, in our Report
      36/2020, analyzing article 9.1 in relation to Recital 51 of the GDPR, as
      well as the Protocol amending the Convention for the Protection of
      Individuals with regard to the Processing of Personal Data, approved by
      the Committee of Ministers at its 128th session in Elsinore on May 18,
      2018 (Convention 108+), we indicated that:

      "In order to clarify the interpretative doubts that arise regarding the
      consideration of biometric data as special categories of data, it is
      appropriate to recall the distinction between biometric identification
      and biometric verification/authentication established by the Article 29
      Working Party in its Opinion 3/2012 on developments in biometric
      technologies:

      Biometric identification: the identification of an individual by a
      biometric system is normally the process of comparing their biometric
      data (acquired at the moment of identification) with a number of
      biometric templates stored in a database (i.e. a one-to-many matching
      process).

      Biometric verification/authentication: the verification of an individual
      by a biometric system is normally the process of comparing their
      biometric data (acquired at the moment of verification) with a single
      biometric template stored in a device (i.e. a one-to-one matching
      process).

      This same differentiation is included in the European Commission's White
      Paper on artificial intelligence:

      "As regards facial recognition, 'identification' means that the template
      of a person's facial image is compared with many other templates stored
      in a database in order to find out whether his or her image is stored
      there. 'Authentication' (or 'verification'), for its part, usually
      refers to the search for matches between two specific templates. It
      allows the comparison of two biometric templates that are, in principle,
      assumed to belong to the same person; thus, the two templates are
      compared to determine whether the person in the two images is the same.
      This procedure is used, for example, at the automated gates used for
      border checks at airports."

      Taking into account the aforementioned distinction, it can be
      interpreted that, under Article 4 of the GDPR, the concept of biometric
      data includes both cases, identification as well as
      verification/authentication. However, as a general rule, biometric data
      will only be considered a special category of data where they are
      subjected to technical processing aimed at biometric identification
      (one-to-many) and not in the case of biometric verification/
      authentication (one-to-one)."


      In the present case, biometric data are processed for identification
      purposes, that is, to single out an individual among several, making
      this a processing of special categories of data subject to the general
      rule prohibiting such processing (art. 9.1 GDPR).
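
      To make the one-to-many versus one-to-one distinction quoted above
      concrete, the following minimal sketch (illustrative only; the
      similarity function and threshold are assumptions, not any particular
      product's algorithm) contrasts the two operations:

      from typing import Callable, Dict, Optional

      Pattern = list                                 # a biometric template
      Score = Callable[[Pattern, Pattern], float]    # similarity in [0, 1]

      def verify(probe: Pattern, claimed_template: Pattern,
                 score: Score, threshold: float) -> bool:
          # One-to-one: compare the probe only against the template of the
          # identity the person claims (verification / authentication).
          return score(probe, claimed_template) >= threshold

      def identify(probe: Pattern, database: Dict[str, Pattern],
                   score: Score, threshold: float) -> Optional[str]:
          # One-to-many: search the whole database for the best-scoring
          # template and report it only if it clears the threshold
          # (identification).
          if not database:
              return None
          best_id = max(database, key=lambda sid: score(probe, database[sid]))
          return best_id if score(probe, database[best_id]) >= threshold else None

      Only the second operation singles out an individual from among many,
      which is why the Report quoted above, and this resolution, treat
      identification, but not mere verification, as processing of a special
      category of data.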

      However, article 9.2 of the GDPR regulates exceptions to this general
      prohibition by stating that:

      "Paragraph 1 shall not apply if one of the following circumstances
      applies:

      a) the data subject has given explicit consent to the processing of
      those personal data for one or more specified purposes, except where
      Union or Member State law provides that the prohibition referred to in
      paragraph 1 may not be lifted by the data subject;

      (…)

      f) the processing is necessary for the establishment, exercise or
      defence of legal claims or whenever courts are acting in their judicial
      capacity;

      g) the processing is necessary for reasons of essential public interest,
      on the basis of Union or Member State law, which must be proportionate
      to the objective pursued, respect the essence of the right to data
      protection and establish adequate and specific measures to protect the
      interests and fundamental rights of the interested party;"

      (…)


      In relation to letter g), it should be highlighted that the processing
      must be necessary for reasons of public interest, which must be
      essential, on the basis of Union or Member State law, proportionate to
      the objective pursued, respect the essence of the right to data
      protection and establish adequate and specific measures to protect the
      interests and fundamental rights of the interested party.

      Therefore, we will proceed to analyze whether, in the present case, the
      conditions established in article 9.2 to lift the prohibition on the
      processing of biometric data are met.

      This Agency has had occasion to pronounce, on various occasions, on the
      requirements necessary to lift the prohibition established in art. 9.1
      of the GDPR, especially the requirements established by article 9.2.g)
      of the GDPR, in order for processing of personal data based on facial
      recognition to be permissible, given the proliferation of proposals
      received in this respect from different spheres, which shows the growing
      interest in using these systems and the constant concern of this
      supervisory authority, as these are identification systems that are
      highly intrusive on the fundamental rights and freedoms of natural
      persons. This concern has been shared by the other supervisory
      authorities for years, as shown by the Working Document on Biometrics
      adopted on August 1, 2003 by the Article 29 Working Party, or the
      subsequent Opinion 3/2012 on developments in biometric technologies,
      adopted on April 27, 2012, and it has led the Community legislator
      itself to include these data among the special categories of data in the
      GDPR. Thus, their processing being prohibited in general, any exception
      to that prohibition must be interpreted restrictively.


      In this regard, mention should be made, in addition to the
      aforementioned Report 36/2020 on the use of facial recognition
      techniques in conducting online evaluation tests, which is commented on
      below, of Report 31/2019 on the incorporation of facial recognition
      systems into video surveillance services under article 42 of the Private
      Security Law, and of Report 97/2020 on the Draft Order of the Ministry
      of Economic Affairs and Digital Transformation on non-face-to-face
      identification methods for the issuance of qualified electronic
      certificates. In all of these cases, it was concluded that there is no
      legal norm in the Spanish legal system that meets the requirements of
      article 9.2.g) of the GDPR, so the processing could only be based on the
      consent of those affected, provided that it is guaranteed to be freely
      given.

      Analyzing and developing the requirements of article 9.2.g), in our
      Report 36/2020 we indicated (FD V) the following:

      << The next question raised in the consultation is whether the
      processing of biometric data by facial recognition systems in online
      evaluation processes could be based on the existence of an essential
      public interest under article 9.2.g) of the GDPR:









      g) the processing is necessary for reasons of essential public interest,
      on the basis of Union or Member State law, which must be proportionate
      to the objective pursued, respect the essence of the right to data
      protection and establish adequate and specific measures to protect the
      interests and fundamental rights of the interested party.

      As we pointed out previously, the processing of personal data necessary
      for the provision of the public service of higher education is based, in
      general, on the existence of a public interest under article 6.1.e) of
      the GDPR. However, when it comes to special categories of data, the case
      contemplated in letter g) of article 9.2 does not refer merely to the
      existence of a public interest, as the GDPR does in many of its other
      provisions, but is the only provision of the GDPR that requires it to be
      "essential", an adjective that qualifies that public interest in view of
      the importance of, and need for, greater protection of the data
      processed.

      This provision finds its precedent in article 8.4 of Directive 95/46/EC
      of the European Parliament and of the Council, of October 24, 1995, on
      the protection of individuals with regard to the processing of personal
      data and on the free movement of such data: "4. Subject to the provision
      of adequate safeguards, Member States may, for reasons of important
      public interest, lay down exemptions in addition to those provided for
      in paragraph 2, either by national law or by decision of the supervisory
      authority." However, its reading shows the greater rigor of the new
      regulation in the GDPR, since the adjective changes from "important" to
      "essential" and the exception may no longer be established by the
      supervisory authorities.

      In relation to what should be understood by essential public interest,
      account must also be taken of the case law of the European Court of
      Human Rights, which, under article 8 of the European Convention on Human
      Rights, has held that the processing of personal data constitutes an
      interference with the right to respect for private life and can only be
      carried out if it is in accordance with the law, serves a legitimate
      purpose, respects the essence of fundamental rights and freedoms and is
      necessary and proportionate in a democratic society to achieve a
      legitimate aim (D.L. v. Bulgaria, No. 7472/14, May 19, 2016; Dragojević
      v. Croatia, No. 68955/11, January 15, 2015; Peck v. United Kingdom, No.
      44647/98, January 28, 2003; Leander v. Sweden, No. 9248/81, March 26,
      1987, among others). As the last judgment cited points out, "the notion
      of necessity implies that the interference corresponds to a pressing
      social need and, in particular, that it is proportionate to the
      legitimate aim pursued".










      Likewise, account must be taken of the doctrine of the Constitutional
      Court regarding restrictions on the fundamental right to data
      protection, which is synthesized in its judgment 292/2000, of November
      30. After configuring the fundamental right to the protection of
      personal data as an autonomous and independent right consisting of a
      power of disposal and control over personal data that empowers the
      person to decide which of these data to provide to a third party, be it
      the State or a private individual, or which third parties may collect
      them, and that also allows the individual to know who holds those
      personal data and for what purpose, and to oppose such possession or
      use, the judgment analyzes its limits, noting the following:

      More specifically, in the aforementioned judgments on data protection,
      this Court has declared that the right to data protection is not
      unlimited and that, although the Constitution does not expressly impose
      limits on it or refer to the public powers for their determination, as
      it has done with other fundamental rights, there is no doubt that such
      limits must be found in the remaining fundamental rights and
      constitutionally protected legal interests, as required by the principle
      of the unity of the Constitution (SSTC 11/1981, of April 8, F. 7;
      196/1987, of December 11 [RTC 1987, 196], F. 6; and, regarding art. 18,
      STC 110/1984, F. 5). Those limits may be direct restrictions of the
      fundamental right itself, as referred to above, or they may be
      restrictions on the manner, time or place of exercise of the fundamental
      right. In the first case, regulating those limits is a form of
      development of the fundamental right. In the second, the limits that are
      set bear on the specific way in which the bundle of powers making up the
      content of the fundamental right in question is exercised, constituting
      a way of regulating its exercise, which may be done by the ordinary
      legislator in accordance with art. 53.1 CE. The first observation to be
      made, which is no less fundamental, is that the Constitution has
      intended that the Law, and only the Law, may set the limits to a
      fundamental right. Fundamental rights may, of course, yield to
      constitutionally relevant goods and even interests, provided that the
      curtailment they experience is necessary to achieve the legitimate
      purpose envisaged, is proportionate to achieving it and, in any case,
      respects the essential content of the restricted fundamental right (SSTC
      57/1994, of February 28 [RTC 1994, 57], F. 6; 18/1999, of February 22
      [RTC 1999, 18], F. 2).

      Precisely, if the Law is the only instrument authorized by the
      Constitution to set the limits to fundamental rights and, in the present
      case, to the fundamental right to data protection, and those limits
      cannot be other than those constitutionally provided for, which in this
      case are none other than those derived from the coexistence of this
      fundamental right with other rights and legal interests of
      constitutional rank, then the legal empowerment that allows a public
      power to collect, store, process, use and, where appropriate, transfer
      personal data is only justified if it responds to the protection of
      other fundamental rights or constitutionally protected goods. Therefore,
      if those operations on a person's personal data are not carried out in
      strict observance of the rules that regulate them, the right to data
      protection is violated, since constitutionally illegitimate limits are
      imposed, either on its content or on the exercise of the bundle of
      powers that compose it. A limiting Law will likewise violate that right
      if it regulates the limits in such a way as to render the affected
      fundamental right impracticable or the guarantee that the Constitution
      grants it ineffective. And this will be the case when the Law, which
      must regulate the limits to fundamental rights with scrupulous respect
      for their essential content, limits itself to empowering another public
      power to set in each case the restrictions that may be imposed on
      fundamental rights, the singular determination and application of which
      will be left to the discretion of that public power, which will be able
      to decide, in what now concerns us, on the obtaining, storage,
      processing, use and transfer of personal data in whichever cases it
      deems appropriate, invoking even interests or goods that are not
      protected with constitutional rank [...]". (Legal Ground 11)


      "On the one hand, because although this Court has declared that the
      Constitution does not prevent the State from protecting legal rights or
      goods at the cost of sacrificing others equally recognized and that,
      therefore, the legislator may impose limitations on the content of
      fundamental rights or on their exercise, we have also specified that, in
      such cases, those limitations must be justified by the protection of
      other constitutional rights or goods (SSTC 104/2000, of April 13 [RTC
      2000, 104], F. 8 and those cited therein) and, in addition, must be
      proportionate to the end pursued by them (SSTC 11/1981, F. 5, and
      196/1987, F. 6). Otherwise, they would incur the arbitrariness
      prohibited by art. 9.3 CE.

      On the other hand, even where the limitations on the fundamental right
      established by a Law have a constitutional foundation and are
      proportionate (STC 178/1985 [RTC 1985, 178]), they may violate the
      Constitution if they suffer from a lack of certainty and foreseeability
      as to the very limits they impose and their mode of application. This
      conclusion is corroborated by the case law of the European Court of
      Human Rights cited in F. 8, which must be considered reproduced here. It
      should also be noted that such a Law would not only harm the principle
      of legal certainty (art. 9.3 CE), conceived as certainty about the
      applicable law and the reasonably well-founded expectation of the person
      as to what the action of the power applying the Law should be (STC
      104/2000, F. 7, among others), but would at the same time damage the
      essential content of the fundamental right thus restricted, given that
      the way its limits have been set renders it unrecognizable and makes it
      impossible, in practice, to exercise it (SSTC 11/1981, F. 15; 142/1993,
      of April 22 [RTC 1993, 142], F. 4, and 341/1993, of November 18 [RTC
      1993, 341], F. 7). Thus, the lack of precision of the Law regarding the
      material conditions for limiting a fundamental right is likely to
      generate indeterminacy as to the cases to which such a restriction
      applies. And if it produces this result, beyond all reasonable
      interpretation, the Law no longer fulfils its function of guaranteeing
      the fundamental right that it restricts, since it lets the will of
      whoever has to apply it operate in its place, thus undermining both the
      effectiveness of the fundamental right and legal certainty [...]". (FJ
      15).

      "More specifically, in relation to the fundamental right to privacy we
      have emphasized not only that its possible limitations must be based on
      a legal provision that has constitutional justification and be
      proportionate (SSTC 110/1984, F. 3, and 254/1993, F. 7), but also that
      the Law restricting this right must precisely set out each and every one
      of the material conditions of the limiting measure. Otherwise, it is
      difficult to accept that the judicial resolution or the administrative
      act applying it is founded on the Law, since what that Law has done,
      abandoning its functions, is to empower other public powers to be the
      ones to set the limits to the fundamental right (SSTC 37/1989, of
      February 15 [RTC 1989, 37], and 49/1999, of April 5 [RTC 1999, 49]).

      Similarly, with regard to the right to the protection of personal data,
      it must be considered that the constitutional legitimacy of the
      restriction of this right cannot be based, by itself, on the activity of
      the Public Administration. Nor is it enough for the Law to empower the
      Administration to specify its limits in each case, limiting itself to
      indicating that it must make such specification when some
      constitutionally protected right or good is at stake. It is the
      legislator who must determine when the good or right that justifies the
      restriction of the right to the protection of personal data applies and
      under what circumstances the right may be limited and, furthermore, it
      is the legislator who must do so by means of precise rules that make
      foreseeable to the interested party the imposition of such a limitation
      and its consequences. Otherwise, the legislator would have transferred
      to the Administration the performance of a function that corresponds to
      it alone in matters of fundamental rights by virtue of the reservation
      of Law of art. 53.1 CE, that is, to clearly establish the limit and its
      regulation. [...] (FJ 16)".

      Likewise, our Constitutional Court has already had the opportunity to
      pronounce specifically on article 9.2.g) of the GDPR, as a consequence
      of the challenge to article 58 bis of Organic Law 5/1985, of June 19, on
      the General Electoral Regime, introduced by the third final provision of
      Organic Law 3/2018, of December 5, on the Protection of Personal Data
      and the guarantee of digital rights, concerning the legal basis for the
      collection by political parties of personal data relating to people's
      political opinions in the framework of their electoral activities, a
      provision that was declared unconstitutional by Judgment no. 76/2019 of
      May 22.










      Said judgment analyzes, first of all, the legal regime to which the
      processing of special categories of data is subject under the GDPR:

      In accordance with paragraph 1 of art. 9 GDPR, the processing of
      personal data revealing political opinions is prohibited, in the same
      way as the processing of personal data revealing ethnic or racial
      origin, religious or philosophical convictions or trade union
      membership, and the processing of genetic data, biometric data aimed at
      uniquely identifying a natural person, data concerning health or data
      concerning a natural person's sex life or sexual orientation. However,
      paragraph 2 of the same provision authorizes the processing of all these
      data where any of the ten circumstances provided for there [letters a)
      to j)] applies. Some of those circumstances have a limited scope of
      application (labor, social, associative, health, judicial, etc.) or
      respond to a specific purpose, so that they themselves delimit the
      specific processing operations that they authorize as an exception to
      the general rule. Furthermore, the enabling effect of several of the
      cases provided for there is conditional on Union law or the law of the
      Member States specifying the circumstances set out in letters a), b),
      g), h), i) and j).

      The processing of special categories of personal data is one of the
      areas in which the General Data Protection Regulation has recognized the
      Member States "room for manoeuvre" to "specify its rules", as its
      recital 10 puts it. This margin of legislative configuration extends
      both to the determination of the grounds enabling the processing of
      specially protected personal data (that is, to the identification of the
      purposes of essential public interest and the assessment of the
      proportionality of the processing to the end pursued, respecting the
      essence of the right to data protection) and to the establishment of
      "adequate and specific measures to protect the interests and fundamental
      rights of the interested party" [art. 9.2.g) GDPR]. The Regulation
      therefore contains a specific obligation on the Member States to
      establish such guarantees in the event that they enable the processing
      of specially protected personal data.

      In relation to the first of the requirements demanded by article
      9.2.g), the invocation of an essential public interest and its necessary
      specification, the High Court recalls what it stated in its judgment
      292/2000, in which it rejected that the identification of the legitimate
      purposes of the restriction could be carried out through generic
      concepts or vague formulas, considering that the restriction of the
      fundamental right to the protection of personal data cannot be based, by
      itself, on the generic invocation of an indeterminate "public interest":











      In the aforementioned STC 292/2000 (RTC 2000, 292), which also examined
      a legislative interference with the right to the protection of personal
      data, we rejected that the identification of the legitimate purposes of
      the restriction could be carried out by means of generic concepts or
      vague formulas:

      "16. [...] Similarly, with regard to the right to the protection of
      personal data, it must be considered that the constitutional legitimacy
      of the restriction of this right cannot be based, by itself, on the
      activity of the Public Administration. Nor is it enough for the Law to
      empower the Administration to specify its limits in each case, limiting
      itself to indicating that it must make such specification when some
      constitutionally protected right or good is at stake. It is the
      legislator who must determine when the good or right that justifies the
      restriction of the right to the protection of personal data applies and
      under what circumstances the right may be limited and, furthermore, it
      is the legislator who must do so by means of precise rules that make
      foreseeable to the interested party the imposition of such a limitation
      and its consequences. Otherwise, the legislator would have transferred
      to the Administration the performance of a function that corresponds to
      it alone in matters of fundamental rights by virtue of the reservation
      of Law of art. 53.1 CE, that is, to clearly establish the limit and its
      regulation.

      17. In the present case, the use by the LOPD (RCL 2018, 1629), in its
      art. 24.1, of the expression "control and verification functions" opens
      up a space of uncertainty so wide that it produces a twofold and
      perverse consequence. On the one hand, by enabling the Administration to
      restrict rights by invoking such an expression, the LOPD renounces
      setting those limits itself and empowers the Administration to do so.
      And it does so in such a way that, as the Ombudsman indicated,
      practically all administrative activity can be brought within it, since
      any administrative activity that involves entering into a legal
      relationship with a member of the public, which will be the case in
      practically all the cases in which the Administration needs someone's
      personal data, will ordinarily entail the Administration's power to
      verify and control that the person has acted in accordance with the
      legal-administrative regime of the relationship established with the
      Administration. This, in view of the ground for restricting the right to
      be informed of art. 5 LOPD, leaves the citizen in the most absolute
      uncertainty as to the cases in which that circumstance will apply (if
      not in all of them) and renders ineffective any mechanism of judicial
      protection that would have to review such a restriction of fundamental
      rights without any complementary criterion to assist its control of
      administrative action in this matter.

      The use in art. 24.2 LOPD of the expression "public interest" as the
      basis for the imposition of limits on the fundamental rights of art.
      18.1 and 4 CE deserves the same reproaches, since it contains an even
      greater degree of uncertainty. It is enough to note that all
      administrative activity ultimately pursues the safeguarding of general
      interests, whose attainment constitutes the purpose that the
      Administration must serve pursuant to art. 103.1 CE."

      This argument is fully transferable to the present review. Similarly,
      therefore, we must conclude that the constitutional legitimacy of the
      restriction of the fundamental right to the protection of personal data
      cannot be based, by itself, on the generic invocation of an
      indeterminate "public interest". Otherwise, the legislator would have
      transferred to the political parties, whom the contested provision
      enables to collect personal data regarding people's political opinions
      in the framework of their electoral activities, the performance of a
      function that corresponds to it alone in matters of fundamental rights
      by virtue of the reservation of Law of art. 53.1 CE, that is, to clearly
      establish their limits and their regulation.

      Nor can the purpose adduced by the State Attorney, which refers to the
      functioning of the democratic system, be considered sufficient, as it
      also contains a high degree of uncertainty and can amount to circular
      reasoning. On the one hand, political parties are themselves "channels
      necessary for the functioning of the democratic system" (for all, STC
      48/2003, of March 12 (RTC 2003, 48), FJ 5); and, on the other hand, the
      entire functioning of the democratic system ultimately pursues the
      safeguarding of constitutional purposes, values and goods, but this is
      not enough to identify the reason why the affected fundamental right
      should be restricted.


      Finally, it should be specified that it is not necessary to suspect,
      with greater or lesser foundation, that the restriction pursues an
      unconstitutional purpose, or that the data collected and processed will
      be harmful to the private sphere and to the exercise of individuals'
      rights. It is enough to verify that, since the purpose of the data
      processing cannot be identified with sufficient precision, neither the
      constitutionally legitimate character of that purpose nor, where
      appropriate, the proportionality of the planned measure can be assessed
      in accordance with the principles of suitability, necessity and
      proportionality in the strict sense.

      On the other hand, regarding the guarantees that the legislator must
      adopt, the aforementioned judgment no. 76/2019 of May 22, after
      recalling that "in view of the potentially intrusive effects on the
      affected fundamental right that result from the processing of personal
      data, the case law of this Court requires the legislator, in addition to
      meeting the aforementioned requirements, also to establish adequate
      guarantees of a technical, organizational and procedural nature, which
      prevent risks of varying probability and severity and mitigate their
      effects, because only in this way can respect for the essential content
      of the fundamental right itself be ensured", analyzes which norm must
      contain the aforementioned guarantees:

"Therefore, the resolution of this challenge requires that we clarify a doubt raised regarding the scope of our doctrine on adequate guarantees, which consists in determining whether the adequate guarantees against the use of information technology must be contained in the law that authorizes and regulates that use or may also be found in other normative sources.

The question can only have one constitutional answer. The provision of adequate guarantees cannot be deferred to a moment subsequent to the legal regulation of the processing of personal data in question. The adequate guarantees must be incorporated into the legal regulation of the processing itself, either directly or by express and perfectly delimited reference to external sources of the appropriate normative rank. Only that understanding is compatible with the double requirement that art. 53.1 CE (RCL 1978, 2836) imposes on the legislator of fundamental rights: the reservation of law for the regulation of the exercise of the fundamental rights recognized in the second chapter of the first title of the Constitution, and respect for the essential content of those fundamental rights.


According to reiterated constitutional doctrine, the reservation of law is not limited to requiring that a law enable the measure restricting fundamental rights; it is also necessary, in accordance both with the requirements sometimes referred to as the normative quality or predetermination of the law and with respect for the essential content of the right, that in that regulation the legislator, who is primarily obliged to weigh the competing rights or interests, predetermine the circumstances, conditions and guarantees under which measures restricting fundamental rights may be adopted. That mandate of predetermination with respect to essential elements, ultimately linked also to the judgment of proportionality of the limitation of the fundamental right, cannot be deferred to subsequent legal or regulatory development, nor can it be left in the hands of private individuals themselves" (FJ 8).

Consequently, the processing of biometric data under article 9.2.g) requires that it be provided for in a norm of European Union or national law; in the latter case, that norm must, according to the constitutional doctrine cited and the provisions of article 9.2 of the LOPDGDD, have the rank of law. Said law must also specify the essential public interest that justifies the restriction of the right to the protection of personal data and the circumstances in which it may be limited, establishing precise rules that make the imposition of such a limitation and its consequences foreseeable for the data subject, the generic invocation of a public interest not being sufficient for these purposes. In addition, said law must establish the appropriate technical, organizational and procedural guarantees, preventing risks of varying probability and severity and mitigating their effects.


In addition, said law must in all cases respect the principle of proportionality, as recalled in Constitutional Court Judgment 14/2003, of 28 January:

"In other words, in accordance with the reiterated doctrine of this Court, the constitutionality of any measure restricting fundamental rights is determined by strict observance of the principle of proportionality. For the purposes that matter here, it is enough to recall that, in order to check whether a measure restricting a fundamental right passes the proportionality test, it is necessary to verify whether it meets the following three requirements or conditions: whether the measure is suitable to achieve the proposed objective (suitability judgment); whether, in addition, it is necessary, in the sense that there is no other more moderate measure to achieve that purpose with equal efficacy (necessity judgment); and, finally, whether it is weighted or balanced, in that it yields more benefits or advantages for the general interest than harm to other goods or values in conflict (judgment of proportionality in the strict sense; SSTC 66/1995, of May 8 [RTC 1995, 66], F. 5; 55/1996, of March 28 [RTC 1996, 55], FF. 7, 8 and 9; 270/1996, of December 16 [RTC 1996, 270], F. 4.e; 37/1998, of February 17 [RTC 1998, 37], F. 8; 186/2000, of July 10 [RTC 2000, 186], F. 6)."

From the regulation transcribed above, which transposes the EU rules, it can easily be inferred that it does not meet the requirements established in article 9.2.g), since the legislator has not provided for the use of biometric data as a proportionate measure for the identification of natural persons, nor established the specific and adequate guarantees called for by the greater risks involved in the processing of such data.

Therefore, since the project intends to process personal data included in the special categories of data referred to in article 9.1 of the RGPD, as it involves biometric data aimed at the identification of natural persons, it is a prerequisite that one of the circumstances contemplated in its section 2, which lift the prohibition on the processing of such data established in general in section 1, be present; article 9.2 of the LOPDGDD requires that "the processing of data referred to in letters g), h) and i) of article 9.2 of Regulation (EU) 2016/679 based on Spanish law must be covered by a rule with the rank of law, which may establish additional requirements relating to its security and confidentiality", and, as indicated, no legal norm exists that enables said processing under article 9.2.g) of the RGPD.


Therefore, said prohibition may only be lifted in those cases in which the affected person gives his or her express consent, under letter a) of article 9.2 of the RGPD, and all the other requirements for valid consent set out in the definition of article 4.11 of the RGPD must be met: "any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her".

Although the absence of any circumstance lifting the prohibition on the processing of special categories of data determines, by itself, the prohibition of the processing carried out by Mercadona, it should also be noted that there is no legal basis that would legitimize that processing, where appropriate, under article 6.1 GDPR on the basis of the public interest.

The concept of public interest, or that of general interest, which is the one more frequently used in our constitutional text, is an indeterminate legal concept with a double function: it gives legitimizing cover to the action of the Administration and, at the same time, it constitutes one of the ways of limiting administrative powers. Thus, the public interest which, as Parejo Alfonso points out, has a clear directive role in the regulatory development (parliamentary or otherwise) of the constitutional order, acts as a criterion delimiting the actions of the public powers, so it must, first of all, be identified by the legislator, in order to mark out the area in which the actions of the Administration are to unfold, subject to the principle of legality and charged with serving the general interests objectively (article 103 CE) and, in any case, under the control of the courts since, as the Judgment of the Constitutional Court of June 11, 1984 recalls, "it cannot be ignored that the power attributed by the Constitution to the State to define the general interest, an open and indeterminate concept called to be applied to the respective matters, can be controlled, against possible abuses and a posteriori, by this Court…".

In the first place, it must be borne in mind that the existence of a public interest does not legitimize any type of personal data processing; the processing must, first of all, conform to the conditions that the legislator may have established, as provided for in article 6 of the RGPD, sections 2 and 3, and in article 8 of Organic Law 3/2018, of December 5, on the Protection of Personal Data and guarantee of digital rights (LOPDGDD), which regulates data processing based on a legal obligation and on a mission carried out in the public interest or in the exercise of public powers in its article 8, in the following terms:

      "1. The processing of personal data can only be considered based on the

      compliance with a legal obligation enforceable by the person responsible, in the terms
      provided for in article 6.1.c) of Regulation (EU) 2016/679, when so

      foresee a rule of European Union law or a rule with the rank of
      law, which may determine the general conditions of the treatment and the types
      of data object of the same as well as the assignments that proceed as

      consequence of compliance with the legal obligation. Said rule may
      also impose special conditions on the treatment, such as the

C / Jorge Juan, 6 www.aepd.es
28001 - Madrid sedeagpd.gob.es 51/113








      adoption of additional security measures or others established in the
      Chapter IV of Regulation (EU) 2016/679.


      2. The processing of personal data can only be considered based on the
      fulfillment of a mission carried out in the public interest or in the exercise of
      public powers conferred on the person responsible, under the terms provided in the

      Article 6.1 e) of Regulation (EU) 2016/679, when it derives from a
      competence attributed by a norm with the force of law. "

Consequently, the public interest requires, in the first place, that it be given concrete form by the legislator, taking into consideration all the interests affected, for the purpose of determining the restrictions that particular interests may suffer as a consequence of the presence of those general interests, which must be done through a rule with the rank of law.

On the other hand, the other principles of article 5 of the RGPD must also be respected, especially those of purpose limitation and data minimization.

Especially in relation to the principle of data minimization, which requires that data be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed" (article 5.1.c) of the RGPD), it must be pointed out that the processing of facial recognition data entails the large-scale processing of special categories of data, subject to a reinforced scheme of guarantees. This is so because of the high number of affected persons and clients of the company, and because said processing could be generalized to all merchants in the same or in other commercial sectors.


Finally, quite apart from the evident lack of legitimation for the processing of personal data consisting of facial recognition, the system implemented by the company would not comply with the proportionality requirements demanded by the Constitutional Court: within the threefold proportionality test, while it may well be considered suitable for the proposed purpose, it is not necessary, since less intrusive alternative measures exist, nor is it proportionate in the strict sense, in that it does not yield more benefits for the public interest than harm to other goods or values in conflict, bearing in mind that its massive and indiscriminate application is aimed at all clients and the rest of those affected and that, if it became generalized, it would imply a massive processing of special categories of data reaching practically the entire population, regardless of the level of risk involved, turning the exceptional possibility of processing biometric data into the general rule, contrary to what the RGPD intends.

Precisely, the inadmissibility of using these techniques in a generalized manner, as well as the absence of any connection between the security measure and the public interest, the measure pursuing instead the private or particular interests of the company, is reflected in the Order of the Provincial Court of Barcelona of 02/15/2021:

"Having stated the foregoing in the preceding paragraphs, this Chamber considers that the measure requested by the commercial entity, MERCADONA S.A., is in no way proportionate, necessary or even suitable. The persons convicted in the present enforcement proceedings, Messrs. A.A.A. and B.B.B., were banned from accessing a specific supermarket of the Mercadona entity, namely the one located on Calle Frederic Mompou s/n in the town of San Boi de Llobregat; there is no record, or at least none has been provided to this Chamber, that they breached the corresponding prohibition on access to the shopping center, nor that they are repeat offenders in such conduct. But what is more, this Chamber cannot accept that the measure in question protects the public interest; rather, it protects the private or particular interests of the company in question since, as has already been explained in the previous paragraphs, the adequate guarantees for the protection of the rights and freedoms of the interested parties would be violated, not only those of the persons who have been punished and to whom the access prohibition applies, but also those of the rest of the people who access the aforementioned supermarket".

In its allegations to the initiation agreement, Mercadona claims that a public interest underlies the judicial decisions in which the security measures consisting of facial recognition of the convicted person were adopted. The party complained against affirms that "Consequently, in view of the establishment by judges and courts of facial recognition methods as a security measure in criminal sentences, the public interest invoked and accepted as a legal basis with respect to convicted persons would logically extend, for these purposes, to non-convicted persons".

Now, it is one thing that the adoption of a security measure may have beneficial effects on society and that a criminal judge or court weigh proportionally what the adoption of the security measure implies (between the restriction of the rights of the convicted person and the public interest, that social benefit, obtained from the imposition of the security measure). It is quite another that the preponderance of the public interest (the reason for the security measure) legitimizes the processing of the personal data of the rest of the citizens, so that all citizens are treated as if they were convicted, being subjected to the same processing as the subject on whom the security measure has been imposed.


In any case, the existence of that public interest is by no means undisputed. The aforementioned Order of the Provincial Court of Barcelona, examining specifically the security measure consisting of facial recognition, considers that there is no public interest and that, as already pointed out, the measure strictly pursues particular and private interests of the company.


Consequently, and in light of the allegations made by Mercadona at this procedural stage, it must be definitively concluded that data processing based on facial recognition for identification purposes is not authorized under article 9.2.g) of the RGPD, that it furthermore lacks a legal basis under article 6.1 thereof, and that it is contrary to the principles of necessity, proportionality and minimization.

                                             IV

On the other hand, and as already indicated, it is appropriate to summarize the content of the recent Order of the Provincial Court of Barcelona dated 02/15/2021, Appeal No. 840/2020, Resolution No. 72/2021, in which the company (MERCADONA) was an interested party, an order arising from facts related to the processing now under analysis. The references to it that appear throughout this Proposed Resolution are made for these purposes.

The aforementioned Order states the following (underlining by the AEPD):

<< LEGAL GROUNDS

FIRST.- The company MERCADONA requests the adoption of the measure, on the understanding that biometric data are obtained through the security cameras when a person enters the premises. To that end it identifies as the applicable rules Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data. The appellant takes the view that the fact that the category of biometric data is recognized in said Regulation as specially protected data does not exclude its use, provided that it takes place with all the relevant security measures. The company considers that, with the security measures proposed, the data protection of the individuals concerned is at no time harmed since, although the biometric data of every user who enters one of the establishments are processed, the system instantly detects (in 0.3 seconds) those individuals on whom an entry ban on the establishment in question has been imposed by final judgment in judicial proceedings; consequently, no biometric data of a person who has not been convicted are kept, as they are immediately erased and never used.

The appellant argues that the purpose of the legislator in developing the General Data Protection Regulation is not only to protect the rights of natural persons but also to ensure the free movement of data in step with technological progress. That is why, it contends, it would be utterly ineffective to try to solve a problem such as the control of individuals convicted by final judgment with an entry ban by showing their image to dozens of employees of the establishments so that they could identify and report them. It is argued that failing to take advantage of the benefits that progress offers, when it is possible to do so while ensuring the protection of natural persons, is to condemn the human being, as well as the Spanish legislative development of recent decades.

The appellant invokes the suitability, necessity and proportionality of the requested measure. It is, in the first place, suitable, as it addresses the problem presented in order to achieve its objective, which is to identify any individual who, despite a final judgment preventing him from entering one of its establishments, may violate the decision of the judicial body and also the rights of the company itself. It is necessary, because it is the only way to confront the problem and solve it, since the measures previously adopted have been completely ineffective owing to the impossibility of exercising control in all establishments through all the employees. And, finally, it is proportionate, since it brings more benefits for the general interest than harm for the particular individual, inasmuch as it does not imply any processing of the biometric data of individuals in general, but only the processing of the data of those individuals who have been convicted by final judgment ...

SECOND.- Turning to the substance of the request made, it is true that this is an issue that raises many doubts at the legal level. It must be remembered that, following the approval and entry into force of the General Data Protection Regulation, directly applicable since May 2018, processing is only lawful if at least one of the following conditions is met:


* the data subject has given consent to the processing of his or her personal data for one or more specific purposes;

* processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;

* processing is necessary for compliance with a legal obligation to which the controller is subject;

* processing is necessary in order to protect the vital interests of the data subject or of another natural person;

* processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;

* processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child."

In other words, the Regulation contemplates the requirement that the user give his or her consent for the processing of his or her personal data. When we speak of facial recognition, the reference must be understood as made to biometric data. The Regulation defines them as "personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that person, such as facial images or dactyloscopic data". Should there be any doubt, section 1 of article 9 of the aforementioned legal text provides that the "processing of personal data revealing ethnic or racial origin, political opinions, religious or philosophical convictions, or trade union membership, and the processing of genetic data, biometric data aimed at uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation" is prohibited.

According to the company MERCADONA S.A., the system "detects, solely and exclusively, the entry of people with final sentences and precautionary restraining orders in force against Mercadona or against any of its workers". But, faced with the measure invoked, one must ask where the images for facial recognition are obtained from and with what consent; it is no less true that people with a final sentence retain their right to privacy, and one must also ask why a database of photographs of people is maintained.

The system used "performs the identification in real time and immediately deletes all the information, only using the positive results to contact the authorities in case of detection".

Mercadona alleges that there is no data processing, and that is why it refers to the 0.3 seconds. It is, to say the least, surprising that it should take refuge in "speed". No matter how fast, there is an intrusion into privacy. Both the speed argument and the claim that there is no data processing collapse under their own weight.

We are clearly dealing with what the European Union has called "authentication". The White Paper on Artificial Intelligence of the European Commission of 19 February 2020 states that, "with regard to facial recognition, 'identification' means that the template of a person's facial image is compared to many other templates stored in a database to find out whether his or her image is stored there. 'Authentication' (or 'verification'), meanwhile, usually refers to the search for matches between two specific templates. It allows the comparison of two biometric templates that, in principle, are assumed to belong to the same person; thus, the two templates are compared to determine whether the person in the two images is the same. This procedure is used, for example, in the automated border control gates used for border checks at airports."

This is a complex issue. In the words of the AEPD itself in its report 36/2020, "taking into account the aforementioned distinction, it can be interpreted that, in accordance with article 4 of the RGPD, the concept of biometric data would include both scenarios, identification as well as verification/authentication. However, and in general, biometric data will only qualify as a special category of data in those cases in which they undergo technical processing aimed at biometric identification (one-to-many) and not in the case of biometric verification/authentication (one-to-one). Nevertheless, this Agency considers that this is a complex issue, subject to interpretation, from which no general conclusions can be drawn, and that each specific case must be assessed according to the data processed, the techniques used for their processing and the consequent interference with the right to data protection, adopting, in case of doubt and as long as the European Data Protection Board or the courts have not ruled on the matter, the interpretation most favorable to the protection of the rights of those affected." In the present case, there is no doubt that the use of facial recognition in video surveillance systems employed in the field of private security would imply the processing of biometric data aimed at uniquely identifying a natural person, in a one-to-many correspondence search process, thus constituting a special category of data whose processing is, in principle, prohibited by article 9.1 of the RGPD.

The Spanish Data Protection Agency, in a report of May 28, 2020, made the matter quite clear, concluding that:

* Facial recognition techniques for biometric identification purposes involve a processing of special categories of data for which the Regulation requires reinforced guarantees.

* To process special categories of data for these purposes, the regulations require an "essential public interest" contained in a norm with the rank of law, which does not currently exist in the legal system.

* The Agency rejects the idea that the legitimation recognized for video surveillance systems that only capture and record images and sounds can cover technologies such as facial, gait or voice recognition.

As the Spanish Data Protection Agency rightly stated in the aforementioned report, for facial recognition to enjoy better legal protection a specific law would be needed. There is today no rule in our legal system governing facial recognition.

The existence of a public interest does not legitimize any type of processing of personal data; the processing must, first, conform to the conditions that the legislator may have established, as provided for in Article 6 of the RGPD, sections 2 and 3, as well as to the aforementioned principles of article 5 of the RGPD, especially those of purpose limitation and data minimization. And, in the event that personal data included in the special categories of data referred to in article 9.1 of the RGPD are processed, one of the circumstances contemplated in its section 2, which lift the prohibition on processing such data established in general in its section 1, must also be present. Consequently, the use of facial recognition technologies in video surveillance systems involves the processing of biometric data as defined in article 4.14 of the RGPD and involves the processing of the special categories of data regulated in article 9 of the RGPD, being "biometric data aimed at uniquely identifying a natural person". We are not facing a simple authentication, but rather an identification, so a double legitimation is required.

Although article 48 of the Penal Code establishes that "the deprivation of the right to reside in certain places or to go to them prevents the convicted person from residing in or going to the place where the crime was committed" and that "the judge or court may order that these measures be monitored by such means as allow it", this would only be possible while safeguarding the fundamental rights of the convicted person, that is, provided that he had given his consent. It must be remembered that convicted persons enjoy all the fundamental rights recognized in the Constitution, except those expressly limited by the content of the conviction, the meaning of the sentence and prison law.

THIRD.- Beyond data protection, other issues relating to the restraining order could be examined. Behind the formality of a restraining order there are many matters to consider in order for the offence to be committed, such as the prior and express notification to and warning of the convicted person, and the validity of the restraining order at that time. These are issues that would certainly be very complex for a third party to assess.

Not everything goes when it comes to fundamental rights. These technologies can be truly intrusive and require a calm ethical and legal debate, since they can have very adverse effects on core values and human integrity.

This analysis is necessary in order to determine whether or not this processing is lawful, especially considering the particularities of the category of data being processed, biometric data, which are therefore specially protected. This is so because the images of the faces of the interested parties make possible the direct, unique and unequivocal identification of all the persons recorded. The collection of images for subsequent recognition must comply with the criteria and standards contained in the General Data Protection Regulation, under which we are faced with an intensive processing of biometric data, which thus gives rise to situations of deep intrusion into the private sphere and into the fundamental right to the protection of the personal data of the interested parties. So much so that, in order for this type of processing to be authorized and confirmed as lawful, the correct assessment of aspects such as the nature and origin of the data, the manner in which the processing is carried out and, above all, its purpose is required. These elements must be studied together with the informing principles of the applicable regulations, in order to determine whether the measures implemented are proportionate to the intrusion into the private sphere of those involved.

In accordance with the personal data protection regulations, processing operations must always respect a minimum level of proportionality between the intrusion that they may entail into people's private sphere and the conditions and guarantees that accompany them in order to correct the possible adverse effects they involve. Thus, it is established that, for processing operations involving data of special categories, as is the case with biometric data, the explicit consent of the data subject is required as the basis legitimizing the uses and actions to be carried out with his or her information. In the case at hand, and for the moment, the express consent of the interested parties is not being sought, and there is moreover a situation in which the two parties, company and customer, can hardly be regarded as having the same ability to negotiate the effects of granting consent or not, since refusing translates directly into the impossibility for the customer of continuing to shop at that supermarket.

The level of intrusion into the private life of the interested parties must be taken into account in the aforementioned proportionality judgment, which according to the regulations therefore requires the expression of the explicit consent of the interested parties. If this consent is not explicitly collected, and is not collected through some means of proof such as a written record, as is the case in this facial recognition processing, this should be remedied by relying on another basis of legitimation strong enough to justify the need for this processing in order to achieve the desired purposes, such as maintaining the proper operation of the business and preventing robbery, theft and situations of insecurity for the workers of the business. This basis of legitimation is, Mercadona asserts in its petition, the "public interest" provided for as an exceptional ground of legitimation in the personal data protection regulations. However, doubts arise when interpreting whether or not it is valid in this case, since the implementation of this technology in fact serves, to a greater extent, a private purpose of the company, namely guaranteeing the security of its facilities.

Regarding the implementation of facial recognition technologies and their appropriate use for guaranteeing and maintaining the security of physical places, the AEPD ruled, in response to a query from a private security company, in Report 010308/2019, that the regulatory framework dedicated to regulating this type of processing remains insufficient to this day and that it will be necessary to approve "a norm with the rank of law that would specifically justify to what extent and in what circumstances the use of these systems would respond to an essential public interest" in order to correctly define the lawfulness requirements of this type of processing.

… But what is more, this Chamber cannot accept that the measure in question protects the public interest; rather, it protects the private or particular interests of the company in question, since, as has already been stated in the previous paragraphs, the adequate guarantees for the protection of the rights and freedoms of the interested parties would be violated, not only those of the persons who have been punished and to whom the access prohibition applies, but also those of the rest of the people who access the aforementioned supermarket.

      (…) >>

                                              V


Having set out the legal doctrine to be applied in the present case, it is appropriate to turn to the questions raised in these proceedings.

From the preliminary investigation actions, it is concluded that Mercadona carries out a processing of personal data consisting of biometric data (art. 4.14 of the RGPD) in order to uniquely identify a specific person among several (hereinafter, one-to-many), thereby becoming subject to the guarantees provided for in art. 9 of the GDPR.

The processing does not occur only in relation to the identification of persons convicted of a crime on whom a security measure has been imposed, as a consequence of the restraining order imposed on them in a criminal sentence; rather, it affects anyone who walks into one of its supermarkets (including minors) as well as its employees.

The data processing implemented by Mercadona includes the collection, collation, conservation and destruction, in the case of a negative identification (0.3 seconds after its collection), of the captured biometric image of any person entering the supermarket (capture, collation, conservation and destruction are four forms of processing according to the definition of art. 4 of the RGPD).

Mercadona expressly recognizes that personal data of a biometric nature are processed; thus, for example, in the data protection impact assessment (EIPD) provided, it states the following:

"The data will be kept:

• Relating to the sentence and the image provided: during the period of validity of the final judgment imposing the restraining order.

• Relating to the camera negatives: the processing will last 0.3 seconds (time between capture and deletion after comparison).

• Relating to the camera positives: for the time necessary to make them available to the State Security Forces and Bodies".

It should be noted that the preservation of facial images even for the brief time span of 0.3 seconds constitutes a processing of biometric personal data for "one-to-many" identification purposes, without any of the exceptions for processing indicated in article 9.2 of the RGPD having been shown to apply, so that it is not even appropriate to consider the legal bases indicated in article 6 of the RGPD.
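
The cycle described in the preceding paragraphs (capture of a facial image, collation against a watchlist, retention of positives, near-immediate destruction of negatives) can be pictured with the following minimal Python sketch. It is purely illustrative: every function and parameter name is hypothetical and it does not describe Mercadona's actual system or any specific vendor's software; it only shows why each step, including the roughly 0.3-second handling of a negative, is a processing operation.

<pre>
# Illustrative sketch only; all names are hypothetical.
import time

def handle_frame(frame, extract_template, match_watchlist, report_to_police):
    """Each step below is a form of processing in the sense of art. 4 RGPD:
    capture, collation, conservation and destruction."""
    start = time.monotonic()
    probe = extract_template(frame)      # capture: a biometric template is created
    hit = match_watchlist(probe)         # collation: one-to-many comparison
    if hit is not None:
        report_to_police(hit, frame)     # positive: conserved and disclosed
    del probe                            # negative: destroyed almost at once
    elapsed = time.monotonic() - start   # e.g. ~0.3 s; brevity does not undo the processing
    return hit, elapsed
</pre>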

The data processed are biometric data, defined in article 4.14 of the RGPD as "personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data".

In this specific case, it involves the processing of special categories of data regulated in article 9 of the RGPD, as they are "biometric data aimed at uniquely identifying a natural person". Similarly, Recital 51 of the RGPD also reasons that such images are "only included in the definition of biometric data when the fact of being processed through specific technical means allows the unique identification or authentication of a natural person".

Report 36/2020 of the AEPD's Legal Office asserts, without prejudice to acknowledging the complexity of the issue and the impossibility of drawing general conclusions, that "biometric data will only be regarded as a special category of data in those cases in which they undergo technical processing aimed at biometric identification (one-to-many) and not in the case of biometric verification/authentication (one-to-one)", the former being the situation in the present case.
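
The one-to-one versus one-to-many distinction on which report 36/2020 turns can be made concrete with the short Python sketch below. It is a purely illustrative comparison of the two operations; the function names, the toy similarity measure and the threshold are hypothetical and are not taken from the report or from any real system.

<pre>
# Illustrative sketch only; names, similarity measure and threshold are hypothetical.
def similarity(a, b):
    """Toy similarity score between two biometric templates (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def verify(probe, enrolled, threshold=0.8):
    """One-to-one (verification/authentication): compare the probe against the
    single template of the person someone claims to be."""
    return similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.8):
    """One-to-many (identification): compare the probe against every template
    in a database and return the best match above the threshold, if any."""
    best_id, best_score = None, 0.0
    for person_id, template in gallery.items():
        score = similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None
</pre>

On this reading, it is the identify-style search of every entrant against a stored gallery, not a verify-style check of a single claimed identity, that the decision treats as processing of a special category of data.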

In the same sense, the European Data Protection Board (hereinafter CEPD) regards the use of video surveillance with facial recognition as involving a special category of data under article 9 of the RGPD in its "Guidelines 3/2019 on processing of personal data through video devices".

                                              VI

According to Mercadona, the purpose of the facial processing, carried out remotely and of the "one-to-many" type, is to monitor compliance with a security measure imposed by sentence on a person convicted in criminal proceedings to which Mercadona has been a party.

The establishment of this surveillance system with facial recognition is linked to the issuance of several sentences imposing a security measure consisting of a restraining order on a person convicted of a minor offence.

Said security measure consists in barring the sentenced person from one or several specific Mercadona supermarkets, or from the stores in a certain territory, during a period specified in the judgment which in no case exceeds six months (art. 57.3 of the CP).

Likewise, as a consequence of Mercadona's express request for this security measure in the criminal proceedings, the judicial resolution allows the use of electronic means to monitor such security measures, as provided in art. 48.4 of the CP.

In some judgments it is specified that such electronic means may consist of facial recognition involving the processing of biometric data (one-to-many). That happens because Mercadona, when asked about the security measure in the court proceedings to which it is a party, requests that the security measure be enforced through electronic means, specifying electronic means consisting of facial recognition.

From the sample of judgments previously provided by Mercadona in relation to security measures and the use of electronic means, the following can be extracted:

      (…)

In view of the sample of judgments available, it must be concluded that:

• The security measure agreed by the judicial body affects only the convicted person and his legal sphere of rights.

• The security measure includes electronic means with facial recognition. But not all rulings authorize Mercadona to install that "one-to-many" (identification) system; some make only a generic mention of electronic means allowing the monitoring of this security measure, without specifying facial recognition and, as already discussed above, electronic means of facial recognition do not have to be of the "massive and remote" type.

The use of remote biometric identification systems in a massive, indiscriminate manner and at a distance in publicly accessible spaces for the purposes of applying a judicial decision must take into account the nature of the situation giving rise to their possible use, in particular the severity, probability and magnitude of the damage that would be caused in the absence of the system, and also the consequences of using the system for the rights, guarantees and freedoms of all affected persons, including the convicted person.

In addition to requiring a ground for lifting the general prohibition set out in Article 9.1 of the RGPD, the massive ("one-to-many"), indiscriminate and remote use of biometric identification systems in publicly accessible spaces for the purposes of applying a judicial decision must also comply with the necessary safeguards and conditions laid down in relation to their use, including with regard to temporal and geographical limitations and to the personal data of those affected.

In the present case, the judicial decisions previously provided by Mercadona do not specify how access to the supermarkets is to be controlled, and the guarantees, rights and freedoms of those affected cannot be left to the unilateral interpretation and decision by the responsible company (Mercadona) of the scope of the court decisions as regards the impact of such processing on those affected (convicted persons, employees and clients, including minors).

Regarding massive and remote facial recognition ("one-to-many"), the White Paper on Artificial Intelligence explains what remote biometric identification is, in the following terms:

"Remote biometric identification must be distinguished from biometric authentication (the latter being a security procedure that relies on a person's unique biological characteristics to verify that the person is who he or she claims to be). Remote biometric identification consists in determining the identity of multiple persons with the help of biometric identifiers (fingerprints, facial images, iris, vascular patterns, etc.) at a distance, in a public space and in a continuous or sustained manner, by checking them against data stored in a database".

The processing now analyzed is characterized by the following:

• It uses biometric data, which are special categories of data under art. 9 of the RGPD (one-to-many), on which a general prohibition of use is imposed, save for the exceptions provided for in the rule itself. This processing is, therefore, exceptional.

• It takes place remotely, in a space accessible to the general public.

• It is a continuous processing that checks the data collected against other data stored in a database.

• It is an automatic processing.

• It is of extremely high (unacceptable) risk, as it can lead to massive and indiscriminate surveillance.

As can be seen, data processing using remote biometric identification is automatic and the biometric data are captured (processed) automatically; this data processing is therefore considered to be of extremely high (unacceptable) risk.

Furthermore, it cannot be ignored that the implementation of remote biometric "identification" of the "one-to-many" type (a special category of personal data, art. 9 RGPD) collects much more information than other types of processing and, moreover, does so involuntarily and without the knowledge or consent of those affected, by setting parameters and using predefined algorithms that determine the creation of a certain pattern (matrix) characteristic of the processed image of each affected person.

The processing now analyzed uses an indiscriminate and massive facial recognition system, since "depending on the biometric data collected, data about the subject such as race or gender (including from fingerprints), emotional state, illnesses, defects and genetic characteristics, substance use, etc. can be derived. Being implicit, the user cannot prevent the collection of said supplementary information" (AEPD note on the "14 misunderstandings with regard to biometric identification and authentication"). This excess of processed data also violates the minimization principle provided for in art. 5.1.c) of the RGPD.

It is Mercadona (as data controller) that decided to implement a system of these characteristics, which was not previously in place, as a consequence of its participation as a party in criminal judicial proceedings in which it requested that the specific security measure consisting of the use of a facial recognition system be authorized.

This shows that Mercadona requested in the judicial proceedings the security measure linked to facial recognition before carrying out an EIPD, before assessing whether it could carry out the processing in accordance with the data protection regulations and before evaluating the risks of such data processing. In this regard, it must be stressed that there is no record before this AEPD of the prior consultation referred to in art. 36 of the GDPR having been carried out, given that the processing implemented not only poses an extremely high (unacceptable) risk of impairment of the rights and freedoms of Mercadona's customers and workers, but is prohibited by art. 9.1 of the RGPD. In this sense, it should also be noted that, in a risk analysis carried out beforehand, the processing should have been identified as an unacceptable risk and, consequently, avoided.

Mercadona requested the adoption of the security measure in the criminal proceedings and, once it was granted, relies on it to justify the exception of art. 9.2 of the GDPR; that is, it has pre-constructed the legitimation necessary to carry out the massive and remote "one-to-many" processing of biometric data. It should be remembered that this security measure is imposed only with respect to the convicted person and only entails the limitation of his rights in the terms of the judicial resolution, without affecting third parties such as Mercadona's clients and workers. The proportionality assessment should have been made before requesting this measure from the judicial body, as will be seen later.

                                              VII

We begin by examining whether Mercadona has the legitimation to carry out this type of processing under the aforementioned conditions.

Mercadona asserts that it holds a legitimation based on the public interest (art. 6.1.e) of the RGPD) for video surveillance purposes, and that the exception of art. 9.2.f) of the RGPD, which allows the processing of biometric data of a special category, applies, that is, the circumstance that the processing is necessary for the establishment, exercise or defense of legal claims.

The legal basis for the processing alleged by the company starts from the prior lifting of the general prohibition imposed by art. 9.1 of the RGPD through the application of art. 9.2.f) of the RGPD and, subsequently, reference is made to art. 6.1.e) RGPD. First, the exception of art. 9.2.f) of the RGPD does not apply to potential clients in the processing now analyzed (nor to workers) according to the AEPD report 010308/2019 already mentioned and, second, the legal basis provided in art. 6.1.b) GDPR is not valid for employees either, since this is a processing outside the scope of the video surveillance system.

As pointed out before, in terms of legitimation it can be observed that in the processing examined there are three types of data subjects affected by it. On the one hand, the processing of the biometric data of a person convicted and subject to a restraining security measure imposed in a criminal sentence; on the other, the processing of the biometric data of Mercadona's potential clients; and finally, the processing of the biometric data of Mercadona's own employees.

• Legitimation regarding the data of a convicted person.


Mercadona relies on the exception provided in art. 9.2.f) of the RGPD to consider that it is entitled to carry out the biometric data processing. Art. 9.2.f) of the GDPR lifts the general prohibition provided for in art. 9.1 of the RGPD when "processing is necessary for the establishment, exercise or defense of legal claims or whenever courts are acting in their judicial capacity".


According to this AEPD's Report reference 0098/2020, it is concluded that:

(i) the RGPD refers separately, on the one hand, to extrajudicial and administrative claims of various kinds and, on the other, to claims pursued before judicial bodies;

(ii) the lifting of the prohibition on the processing of special categories of data must be understood as exceptional and subsidiary, and the interpretation of its application must be restrictive, in accordance with the special protection to which this type of data is entitled by virtue of its legal nature;

(iii) the national or European Union law regulating these processing operations must offer sufficient guarantees to protect the rights of those affected;

(iv) although the RGPD establishes certain circumstances that lift the prohibition on the processing of special categories of data, Member States may, through their law, introduce ad hoc rules in order to adapt them to the reality of the sectors involved and guarantee the effective protection of the rights of the citizens of the Union.


The aforementioned report adds that, in general, the circumstances that lift the general prohibition on processing provided for in article 9.2 RGPD serve only that purpose, that is, they act as exceptions to the provisions of section 1, which does not mean that, whenever one of them is present, the processing may or must be carried out, since the remaining obligations deriving from the GDPR itself must also be complied with. In other words, the mere existence of a claim under article 9.2 f) RGPD does not, by itself, legitimize the processing of special categories of data; it must be accompanied by other elements, not present here, that make the processing compliant with the RGPD.

The processing of biometric data ("one-to-many") could, in this case, take place if necessary for the establishment, exercise or defense of legal claims or when the courts act in the exercise of their judicial function. However, in strict terms, in accordance with the wording of the legal norm and for the scenario now examined, the establishment, exercise or defense of claims has already taken place, since it is the complaint filed by Mercadona that gave rise to the situation in which we now find ourselves.

However, it could be understood that the imposition of a security measure in a final sentence is a consequence and continuation of the claim filed, and that this measure derived from the claim may thus fall within the framework of the precept transcribed. Even so, in any case, the processing of biometric data for the establishment, exercise or defense of claims would be restricted to the biometric data of the convicted person, within the strict terms and scope of the judicial resolution, and would not extend to third parties entirely unconnected with the proceedings, still less on the basis of the company's free unilateral interpretation of the scope of the judicial resolution.

Recital 52 of the RGPD, regarding the prohibition on the processing of special categories of personal data, authorizes exceptions "where appropriate safeguards are provided", stating that the processing of such personal data "should also be allowed, by way of exception, where it is necessary for the establishment, exercise or defense of legal claims, whether in court proceedings or in an administrative or out-of-court procedure".

As this is an exceptional authorization, which additionally requires, where it can be applied at all, the establishment of adequate guarantees, it must be interpreted restrictively. This follows from recital 51 of the RGPD, which reflects the restrictive character with which the processing of these data must be approached when it states that "Such personal data should not be processed, unless processing is allowed in specific cases set out in this Regulation, taking into account that Member States law may lay down specific provisions on data protection in order to adapt the application of the rules of this Regulation for compliance with a legal obligation or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. In addition to the specific requirements for such processing, the general principles and other rules of this Regulation should apply, in particular as regards the conditions for lawful processing"; this interpretation has been systematically upheld by the AEPD in its resolutions (for all, PS/00145/2019).

Let us now turn to Article 10 GDPR. This provision allows the processing of personal data relating to criminal convictions and offences or related security measures, with respect to the personal data concerned by such convictions, offences or security measures. In our case, and on the wording of the Article, it would cover only the personal data of the convicted person and, in connection with the exception under Article 9 GDPR, only the biometric data of the convicted person.

Moreover, it requires that the processing be carried out either under the supervision of public authorities or when authorised by Union or Member State law providing appropriate safeguards for the rights and freedoms of data subjects.

In this case, the Judicial Authority's supervision comes into play only if the convicted person breaches the security measures. The Judicial Authority neither reviews nor has reviewed the facial recognition system implemented in general terms, nor the impact of its implementation on the rights and freedoms of everyone else (Mercadona's customers and employees).

Indeed, if the security measure were applied directly by the judicial body, it could not be extended to subjects other than the convicted person or third parties summoned to the proceedings and directly affected by the measure. Consequently, what a judge cannot do in enforcing his or her own measures, a private party assisting in that enforcement can do even less.

As regards the massive and remote ("one-to-many") processing of the biometric data of a person convicted and made subject to a restraining measure in a criminal judgment, the company maintains that the legal basis for the processing would be Article 6(1)(e) GDPR, thereby overlooking the need first to lift the general prohibition imposed by Article 9(1) GDPR.


As to the security measure, Mercadona asserts that "this legitimation, although it does not require legal authorisation or a specific normative determination, must be framed within the Spanish procedural system".

Against that statement, however, the fact is that Article 8 LOPDGDD is categorical: "The processing of personal data may only be regarded as based on the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller, within the terms of Article 6(1)(e) of Regulation (EU) 2016/679, where it derives from a competence conferred by a provision with the force of law." Consequently, legal authorisation is required for that legal basis to take effect.

In reality, the legal basis in Article 6(1)(e) GDPR could legitimise the processing of the convicted person's data in connection with a specific security measure (provided one of the authorisations in Article 9(2) GDPR applied), on the understanding that a task in the public interest is being performed by mandate of the judicial body empowered to order it (Article 17 of Organic Law 6/1985 of 1 July on the Judiciary). However, as already noted, there is likewise no evidence that the security measure serves an essential public interest, since what it would protect is the company's private interest.

In this regard WP29, in its Opinion 06/2014 (on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC), examines what is meant by a task carried out in the public interest, stating that "Article 7(e) covers two situations and is relevant to both the public and the private sector. First, it covers situations where the controller itself has public authority or a public-interest task (but not necessarily also a legal obligation to process the data) and the processing is necessary for exercising that authority or performing that task."

"In either case, the processing must be 'necessary for the performance of a task carried out in the public interest', or official authority must have been vested either in the controller or in the third party to whom the data are disclosed, and the processing must be necessary for the exercise of that authority. It is also important to emphasise that such official authority or public-interest task would typically be conferred or attributed by ordinary laws or other legal regulations. Where the processing involves an intrusion into privacy, or where this is otherwise required under national law in order to guarantee the protection of the persons concerned, the legal basis must be sufficiently specific and precise in defining the type of data processing that may be allowed."

In support of the above, one need only look at Article 10 GDPR, cited by the company: "Processing of personal data relating to criminal convictions and offences or related security measures based on Article 6(1) shall be carried out only under the control of official authority or when the processing is authorised by Union or Member State law providing for appropriate safeguards for the rights and freedoms of data subjects. Any comprehensive register of criminal convictions shall be kept only under the control of official authority."

In our case, the legitimation now invoked, based on a public-interest task and collaboration with the administration of justice, would be distinct from the public interest relied on by the company to legitimise video surveillance via Article 6(1)(e) GDPR and Article 22 LOPDGDD, especially because, as already indicated, some of the judgments examined refer only generically to the use of electronic means to monitor compliance with the security measure, without being "sufficiently specific and precise in defining the type of data processing that may be allowed".

Legitimation regarding the data of potential Mercadona customers.

The company relies on the exception in Article 9(2)(f) GDPR to process the biometric data ("one-to-many") of Mercadona's customers.

As indicated above, the exception in Article 9(2)(f) GDPR, relating to the establishment, exercise or defence of legal claims, must be construed restrictively and on its own terms, given its exceptional nature in the light of the prohibition laid down in Article 9(1) GDPR.

We have also stated that, on a literal, systematic and teleological interpretation of the rule, a proper understanding of Article 9(2)(f) GDPR limits the use of special categories of personal data to cases in which their processing is necessary for the establishment, exercise or defence of legal claims. The concepts of "establishment", "exercise" and "defence" could thus be understood to cover not only the establishment, exercise or defence of a claim as such, but also the enforcement of the decision obtained as a result, within the framework of effective judicial protection.

Let us apply this to domestic law and to the specific "claim" at issue, since the exception is not indifferent to how the Spanish procedural system operates.

In the case under examination, the facial recognition processing, which, it should be recalled, was chosen by the company, derives from the imposition of a security measure on a specific person under a favourable court judgment obtained by Mercadona. Since this is a criminal judicial procedure, and confining ourselves to its defining characteristics and elements as established in the legal system, it would affect only the parties to the proceedings (including, where appropriate, a third party summoned by the judicial body to defend its own rights), and its effects cannot be extended to third parties outside those proceedings.

When adopting the security measure, the judicial body weighs, as it can only do, the impact of that measure on the fundamental rights of the convicted person. It does not examine the impact of the security measure on third parties unrelated to the proceedings, nor does it assess or weigh the effect of such a measure on their fundamental rights (privacy and protection of personal data, among others), because such a decision simply does not concern them.

A criminal judgment between parties does not in itself enable the massive, remote and indiscriminate ("one-to-many") processing of biometric data affecting a large and indeterminate group of the population, including minors, quite apart from the complete disproportionality involved in implementing such a system, which will be discussed later. Taken to its logical conclusion, we would reach the absurd result that the imposition of a security measure on one or more specific individuals in a court ruling, or even in an administrative decision, could enable the deployment of mass facial recognition, which would violate both the letter and the spirit of the GDPR.

Given its impact on sensitive categories of data and the risks inherent in the processing, the exception in Article 9(2)(f) GDPR must be interpreted restrictively and with extreme care where it affects an indeterminate and massive plurality of people who are entirely unconnected with the judicial decision issued.

It enables only the parties to the claim to use the biometric data necessary to pursue the claim itself, restricted to the specific person to whom the proceedings and the subsequent judicial decision refer. The biometric data of any given potential Mercadona customer will not have been necessary to file the complaint. Yet the facial recognition processing implemented by Mercadona, viewed as a whole, directly affects all potential Mercadona customers, who are strictly unrelated to the claim itself.

In conclusion, Article 9(2)(f) GDPR could lift the prohibition, but only by restricting that legitimation to a specific judgment, within the scope expressly set out in it, in relation to the specific security measures imposed, with respect to the persons named in it, and for a limited territorial scope (a territory, or one or more supermarkets) and a limited time. And this only as regards the convicted person.

By contrast, the facial recognition system implemented by Mercadona, which lacks legitimation under Article 9(1) GDPR, is highly intrusive and indiscriminately affects an undetermined number of citizens, on whom it indirectly imposes a security measure of a criminal nature.

It also generates a perverse effect: with the *** NUM. 2 judicial proceedings that the company says it files annually across Spanish territory, practically every supermarket would end up with an active facial recognition system, monitoring all Mercadona customers, regular or not. In practice this would amount to the large-scale deployment of a facial recognition system that is highly intrusive upon the rights and freedoms of those affected. It entails an extremely high and unacceptable risk.


Along these lines, the "Guidelines on Facial Recognition" of January 2021 of the Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108) state that private entities may not deploy facial recognition systems in uncontrolled environments such as shopping malls, particularly to identify persons of interest for security purposes: "Private entities shall not deploy facial recognition technologies in uncontrolled environments such as shopping malls, especially to identify persons of interest, for marketing purposes or for private security purposes."

(Translation into Spanish provided by the AEPD.)

As regards rights, the same Guidelines make clear that they may be restricted only where a law so provides, which means that in the present case the data subjects' rights cannot be restricted: "These rights can be restricted but only when such restriction is provided for by law, respects the essence of the fundamental rights and freedoms and constitutes a necessary and proportionate measure in a democratic society for specific legitimate purposes (such as law enforcement purposes), according to Article 11 of Convention 108+."

(Translation into Spanish provided by the AEPD.)

On the other hand, we must examine whether the company has a legitimate basis for processing special-category biometric data ("one-to-many") of Mercadona's potential customers.

Leaving aside the general prohibition in Article 9(1) GDPR affecting special-category biometric data, let us return to Article 6(1)(e) GDPR as cited by the company. The legal basis would be the same, the public interest, were it not for the special-category biometric data; but in this case it does not rest on the competence of a judicial body which, for the enforcement of a security measure, allows one of the parties to the criminal proceedings to process the personal data of the convicted person (a task in the public interest). It is obvious that citizens in general, Mercadona's potential customers, have not been parties to the proceedings, are not mentioned in the judgment, were not considered for the purposes of implementing any electronic means, and are not covered by it.


A public interest could, in this case, seemingly be found in processing for video surveillance purposes. Article 22 LOPDGDD regulates processing for video surveillance, whose legitimation lies, as stated in the Explanatory Memorandum to that law, in the existence of a public-interest purpose falling under Article 6(1)(e) GDPR, since its aim is "to preserve the safety of persons and property, as well as their facilities", an objective that goes beyond the mere legitimate interests of an individual.

In the field of private security, that regulation must be supplemented by its sector-specific legislation, namely the Private Security Law (LSP), Article 42 of which regulates video surveillance services. It provides that "video surveillance services consist of the exercise of surveillance through fixed or mobile camera or video camera systems capable of capturing and recording images and sounds, including any technical means or system allowing equivalent processing.

Where the purpose of these services is to prevent offences and avoid harm to the persons or property under protection, or to prevent unauthorised access, they shall necessarily be provided by security guards or, where appropriate, by rural guards."

In the case under examination, the video surveillance is carried out by a private security company.

However, as reasoned in Report 31/2019 of the AEPD's Legal Office (entry: 010308/2019), "the video surveillance processing operations regulated in the LOPDGDD and the LSP refer exclusively to processing aimed at capturing and recording images and sounds, and do not include facial recognition, which is a radically different form of processing in that it incorporates biometric data, as the GDPR itself recalls in recital 51 when it states that 'the processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person'.

Consequently, the incorporation into video surveillance systems, aimed at capturing and recording images and sounds, of facial recognition applications will involve the processing of biometric data, about whose risks for people's rights the data protection authorities have long been warning."

The report cites several documents of the Article 29 Working Party, such as Opinion 4/2004 on the processing of personal data by means of video surveillance, the working document on biometrics adopted on 1 August 2003, and Opinion 3/2012 on developments in biometric technologies adopted on 27 April 2012, which set out the difference between conventional video surveillance systems and facial recognition and identify a range of important and significant risks, such as discrimination, the fact that the processing can be carried out without the data subject's knowledge, the possible generalisation of its use, and the errors that may occur.

In line with the above, the legal basis in Article 6(1)(e) GDPR in conjunction with Article 22 LOPDGDD would suffice for ordinary video surveillance processing (not involving special categories of data). It would not, however, suffice for a facial recognition system in the terms described, that is, a radically different form of processing which uses biometric data in a massive and remote "one-to-many" fashion, without the prohibition in Article 9(1) GDPR first having been lifted. It would therefore be necessary to determine the precise legal basis for carrying out "one-to-many" facial recognition processing, as well as the precise legal requirements for doing so.

Report 31/2019 of the Legal Office (entry: 010308/2019) takes the view that "the current regulation is considered insufficient to allow the use of facial recognition techniques in video surveillance systems used by private security (...), it being necessary to adopt a norm with the rank of law that specifically justifies to what extent and in which cases the use of these systems would respond to an essential public interest; that legal norm should define, after the legislator has weighed the competing interests in accordance with the principle of proportionality, each and every one of the material preconditions of the limiting measure through rules that make the imposition of such a limitation and its consequences foreseeable, and should establish adequate technical, organisational and procedural safeguards that prevent risks of differing probability and severity and mitigate their effects".

The report concludes that the use of facial recognition in video surveillance systems operated by private security is disproportionate, in view of the intrusion and the unacceptably high risks it entails for citizens' fundamental rights, at least as regards framing the Article 9(2)(g) GDPR exception as an essential public interest, which calls for specific legal regulation (Article 8(2) LOPDGDD). Legal Report 010308/2019 of the AEPD states: "... in the case of special categories of data, the case contemplated in Article 9(2)(g) does not refer merely to the existence of a public interest, as the GDPR does in many of its other provisions; it is the only GDPR provision that requires that interest to be 'essential', an adjective that qualifies that public interest in view of the importance of, and need for, greater protection of the data processed."

It can therefore be seen that, given the special characteristics of the data processing carried out (with an extremely high, unacceptable risk), we are not dealing with what could be described as an ordinary video surveillance system. The system deployed, which incorporates facial recognition applications, has its own distinct character, since it processes biometric data aimed at uniquely identifying a natural person through facial recognition, in a "one-to-many" matching process (the convicted person against everyone else entering the supermarkets, whether potential customers or employees), carried out massively and remotely. The EDPB has said as much.

Legitimation regarding the data of Mercadona's employees.

Furthermore, it must be pointed out that there is another group affected by the deployment of facial recognition: the company's employees, who are also biometrically identified when they enter the supermarkets.

The processing of the biometric data of Mercadona's employees through a facial recognition system such as the one analysed is not covered by the exception in Article 9(2)(f) GDPR.

Article 20(3) of the Workers' Statute and the exceptions in Article 9(2)(f) and 9(2)(h) GDPR do not provide a legitimate basis for the processing for its intended purpose, namely enforcing a security measure arising from judicial proceedings between Mercadona and a person who has stolen products or caused damage to its facilities (Mercadona has no standing to act against assaults and personal or property damage suffered by its employees, which is a matter for the employees themselves).

What was said in the previous section about reliance on the legal basis of Article 6(1)(e) GDPR applies fully to Mercadona's employees. That legal basis, absent the exception in Article 9(2)(f), cannot legitimise the processing of the biometric data of Mercadona's employees.

It must be noted that the group of supermarket employees was not taken into account by the controller when assessing and choosing a processing operation consisting of a facial recognition system, in a way that respects and weighs the risks to this group's rights and freedoms.

This can be verified from the administrative file: in the DPIA, the categories of data subjects are "Subjects who access MERCADONA centres; Subjects with a final judgment" (page 6).

It can also be seen that the DPIA examines the threat that "processing is carried out which involves systematic monitoring of data subjects without their being aware of the activity and/or its scope [...] The facial recognition system can systematically evaluate (although always with human intervention) the images of the people who access MERCADONA centres" (page 16).

Employees do not appear as a differentiated category; they are not considered as a specific group affected by risks of its own. Yet they are detected by the facial recognition system every time they pass through the supermarket door, whether arriving for work or in the course of their duties.



Clearly, employees cannot be subsumed under the "subjects who access MERCADONA centres": that category refers to all potential customers, as is obvious from the fact that their risks, together with any risks to the convicted person, are the only ones examined throughout the DPIA. The specific risks particular to employees are not examined. In this sense, the DPIA provided is flawed, and the provisions of WP29's opinion WP248 on data protection impact assessments are relevant: "... Under the GDPR, non-compliance with DPIA requirements can lead to fines imposed by the competent supervisory authority. Failure to carry out a DPIA when the processing is subject to a DPIA (Article 35(1), (3) and (4)), carrying out a DPIA in an incorrect way (Article 35(2), (7), (8) and (9)), or failing to consult the competent supervisory authority where required (Article 36(3)(e)), can result in an administrative fine of up to EUR 10 million, or in the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher."

Thus, WP29's Opinion 2/2017 on data processing at work (adopted on 8 June 2017) states that "while these technologies can be helpful in detecting or preventing the loss of intellectual and material company property, improving the productivity of employees and protecting the personal data for which the data controller is responsible, they also create significant privacy and data protection challenges. As a result, a new assessment is required concerning the balance between the legitimate interest of the employer to protect its business and the reasonable expectation of privacy of the data subjects: the employees."

Hence, "regardless of the legal basis of such processing, a proportionality test should be undertaken prior to its commencement to consider whether the processing is necessary to achieve a legitimate purpose, as well as the measures that have to be taken to ensure that infringements of the rights to private life and secrecy of communications are limited to a minimum. This can form part of a Data Protection Impact Assessment (DPIA)."

In the case under examination, no proportionality test has been carried out with regard to the risks to, and impact on, the rights and freedoms of the employees. This follows clearly from the undisputed fact that they are not even mentioned in the DPIA in the administrative file as a specific group to be assessed.

As WP29 states in that Opinion, "data processing at work must be a proportionate response to the risks faced by an employer". In the case under examination it is not proportionate, given that the group was not even considered when the risks were determined.

It is unavoidable to consider whether the processing (of the employees' biometric data) is proportionate and what the risks are, and in any event to address them in the DPIA. WP29's Opinion 2/2017 on data processing at work highlights the need to carry one out "in particular where new technologies are used that are, by virtue of their nature, scope, context or purposes, likely to result in a high risk to the rights and freedoms of natural persons". This is because "modern technologies enable employees to be tracked over time, across workplaces and their homes, through many different devices such as smartphones, desktops, tablets, vehicles and wearables. If there are no limits to the processing, and if it is not transparent, there is a high risk that the legitimate interest of employers in the improvement of efficiency and the protection of company assets turns into unjustifiable and intrusive monitoring."

In any event, the processing of the supermarket employees' biometric data amounts to an indirect monitoring of them (in the sense that the purpose of the processing is to unambiguously identify the convicted person). In practice, it is full-scale monitoring.

If regard must be had to Article 89 LOPDGDD in order to respect workers' privacy when video surveillance devices are used, this applies all the more where we are faced with processing distinct from video surveillance, more invasive, with more specific and higher risks, involving the use of biometric data. If that provision requires prior information to be given to employees and their representatives, the same must be done in the case examined, for the sake of transparency. In any event, information must be provided to the workers' representatives and to the workers themselves under Article 13 GDPR.

In the case of Mercadona, given its number of employees, the representative body is the works council, since Article 63 of the Workers' Statute provides that "the works council is the collegiate body representing all the workers of the company or workplace for the defence of their interests, and shall be constituted in each workplace with a headcount of fifty or more workers".

For information, it is worth noting the recent amendment to Article 64(4)(d) of the Workers' Statute Law, approved by Royal Legislative Decree 2/2015 of 23 October (Workers' Statute), which, in line with Article 13(2)(f) GDPR, now reads as follows:

<< d) To be informed by the company of the parameters, rules and instructions on which the algorithms or artificial intelligence systems are based that affect decision-making which may have an impact on working conditions and on access to and maintenance of employment, including profiling. >>

In addition to the information and transparency obligations arising from data protection law, employee representatives have the right to be informed and consulted in certain cases provided for by law.

On this point, Article 64 of the Workers' Statute (as in force at the time of the events) provides that "the works council shall have the right to be informed and consulted by the employer on matters that may affect the workers, as well as on the situation of the company and the evolution of employment within it, under the terms laid down in this Article.

Information means the transmission of data by the employer to the works council so that it is aware of a specific matter and can examine it. Consultation means the exchange of views and the opening of a dialogue between the employer and the works council on a specific matter, including, where appropriate, the issuing of a prior report by the council."

The same Article goes on to state that the works council shall also carry out, under Article 64(7)(a), a task "1.º of monitoring compliance with the rules in force on labour, social security and employment matters, as well as with the other agreements, conditions and practices in force in the company, bringing, where appropriate, the relevant legal actions before the employer and the competent bodies or courts", for which it will need information about the company's actions.

This last provision can be linked to Article 5(1)(a) and Articles 12, 13 and 14 GDPR and to Article 89 LOPDGDD.

The administrative file contains a communication to Mercadona's inter-centre committee on this matter. The inter-centre committee is a second-level representative body, established by collective agreement and with the functions provided for therein (Article 63 of the Workers' Statute), which cannot take over the functions of the works council, the body which, for the reasons stated, should have been informed of the implementation of a facial recognition system. However, in line with the submission made by the company, it should be noted that in the present case the works council's competence has indeed been lawfully assumed by the inter-centre committee.








In any case, the communication that was made shows that, although the company regarded this group as affected by the facial recognition processing, there is no reference to the risks to workers' rights in the DPIA (Article 35 GDPR and the list of types of processing operations requiring a data protection impact assessment). In this sense, as already noted, an incorrect impact assessment is grounds for sanction in accordance with guideline WP248 rev. 01, endorsed by the EDPB, section I in fine.


Monitoring by the facial recognition system in the terms described also exerts coercive pressure on workers and can give rise to an extremely high, unacceptable risk that restricts employees' freedom, both personally and professionally. It is a risk of their activities being tracked without sufficiently justified cause and, above all, a risk that was not taken into account when the DPIA was drawn up.

As WP29's Opinion 2/2017 on data processing at work puts it, "systems that enable employers to control who can enter their premises, and/or certain areas of their premises, can also enable the tracking of employees' activities". On video surveillance, it goes on to state that "video surveillance continues to pose the same issues for employee privacy as before: the capability of continuously capturing the behaviour of the employee".


Nor should we overlook other risks that follow from all this, since the same Opinion goes on to note that "although such systems have existed for some time, newer technologies for tracking employees' time and attendance are becoming more widespread, including those that process biometric data and others such as mobile device tracking", and that "whilst these systems can be an important part of an employer's audit trail, they also present the risk of providing an invasive level of knowledge and control over the employee's activities in the workplace".


There is thus a highly plausible risk that data obtained from the video surveillance and biometric system could be combined to continuously "follow" a worker's behaviour, even though the facial recognition processing was not originally set up for that purpose.

As WP29 concludes, "employers should refrain from the use of facial recognition technologies. There may be some fringe exceptions to this rule, but such scenarios cannot be used to invoke a general legitimisation of the use of such technology."

Paraphrasing WP29: compliance with a security measure aimed at a single specific person cannot be used to invoke a general legitimisation of the use of this technology in the terms described, whether with respect to employees or to any other citizen.

In view of all the foregoing, we can conclude that the processing as a whole lacks a legitimate basis, and therefore infringes Articles 9 and 6 GDPR, infringements defined in Article 83(5)(a) of that Regulation and classified as very serious, for limitation purposes, in Article 72(1)(e) and (a) LOPDGDD respectively.

                                             VIII

The proportionality assessment must be carried out before any processing begins.

In this regard the Constitutional Court has held (see, for all, Constitutional Court Judgment 14/2003 of 28 January) that "in order to determine whether a measure restricting a fundamental right passes the proportionality test, it is necessary to verify that it meets the following three requirements or conditions: whether the measure is capable of achieving the objective pursued (suitability test); whether, in addition, it is necessary, in the sense that no more moderate measure exists to achieve that purpose with equal effectiveness (necessity test); and, finally, whether it is weighted or balanced, in that more benefits or advantages for the general interest derive from it than harm to other goods or values in conflict (proportionality test in the strict sense)".

This is based on the case law of the European Court of Human Rights, that is, the passing of a threefold test to determine whether the interference that the measure causes to the holder of the restricted right is the minimum required to achieve the legitimate aim pursued.

The first point to make is that, as regards Mercadona's facial recognition processing, which affects the data not only of the convicted person but of all potential customers and employees, the proportionality assessment in the broad sense must duly be carried out.

Notwithstanding the above, even where the judicial body authorises an electronic means, whether generically or a specific one such as facial recognition, without indicating how it is to be implemented (see the judgments), it is still necessary to carry out the proportionality assessment before starting the processing, in order to evaluate which means is most suitable, whether it is necessary to achieve the purpose permitted by the judgment, and whether the measure is proportionate.

Secondly, where the proportionality assessment concerns the processing of biometric data, it requires a particularly careful and detailed examination.



In its Opinion 3/2012 on developments in biometric technologies, WP29 notes that "in analysing the proportionality of a proposed biometric system, a prior consideration is whether the system is necessary to meet the identified need, i.e. is essential for satisfying that need, rather than being the most convenient or cost effective. A second factor to take into account is the likelihood that the system will be effective in meeting that need in the light of the specific characteristics of the biometric technology planned to be used. A third aspect to consider is whether the resulting loss of privacy is proportional to any anticipated benefit. If the benefit is relatively minor, such as an increase in convenience or a slight cost saving, then the loss of privacy is not appropriate. The fourth aspect in assessing the adequacy of a biometric system is to consider whether a less privacy intrusive means could achieve the desired end."

Thirdly, turning now to the proportionality assessment itself: as regards suitability, the facial recognition system may be suitable for enforcing the restraining measure with respect to the convicted person, but it is not necessary, since less intrusive alternative measures exist, nor is it proportionate in the strict sense, insofar as it does not produce more benefits for the public interest than harm to other goods or values in conflict, bearing in mind that its massive and indiscriminate application targets all potential customers regardless of the level of risk they pose, turning the exception allowing the processing of biometric data into the general rule, contrary to what the GDPR intends.

Thus, the judgments cited above hold that the security measure requested by the company can be applied, without ruling on the safeguards for the rights and freedoms of those affected that must accompany its implementation or justifying the application of any of the exemptions in Article 9(2) GDPR. Naturally, the judicial body does not pronounce on the restriction of fundamental rights, whether of the convicted person or of everyone else, that the implementation of a generalised facial recognition system entails, as that exceeds the scope of its jurisdiction. And, as has already been pointed out and will be stressed again below, such processing is prohibited under Article 9(1) GDPR.

Take, for example, the aforementioned Santander judgment, which states: "It is also requested that the establishment be authorised to monitor this measure through the electronic means available to the entity Mercadona for facial recognition purposes, pursuant to Article 58.4 of the Criminal Code, which provides: 'The judge or court may order that compliance with these measures be monitored through such electronic means as allow it.' There is no obstacle whatsoever to granting what is requested, since the impact on the sphere of rights or interests of the convicted person is minimal, this being merely a means or instrument available to the establishment to enforce what has been ordered more effectively."

The judgment thus authorises the implementation of the security measure with respect to the convicted person, weighing the conflicting interests, without even examining the impact on Mercadona's customers and employees (since none of them is a party to the criminal proceedings). It may therefore be a suitable measure with respect to the convicted person, but it is not suitable with respect to everyone else, specifically Mercadona's customers and employees, who are affected indiscriminately.

Therefore, the facial recognition processing as a whole, encompassing the processing of the biometric data of Mercadona's potential customers and employees, is not suitable. Other systems, or other ways of implementing the measure, exist that would not affect their rights and public freedoms.

It should be remembered that, even accepting that the biometric data processing deployed by Mercadona is the one authorised by the judicial body, it would be so only for the purpose of implementing a security measure in relation to the convicted person and, even then, respecting his fundamental rights, unless a judicial decision provided otherwise.

In any event, there are means less invasive of the rights and freedoms of prospective customers and supermarket employees by which the intended purpose could be achieved. Some would fall directly on the convicted person (for instance, alongside the prohibition on entering certain places, imposing a minor penalty of permanent localisation or an electronic location system, which the judicial body would assess at the request of the interested party), without ever affecting anyone else's rights and freedoms. Others are more traditional, such as displaying the photograph of the convicted person in the restricted and controlled area where the ordinary video surveillance images are viewed, or having the photograph of the convicted person, stored on an electronic device, compared manually "one-to-one" at the entrance to the establishment.


Fourthly, once the decision to install the system has been taken, it must be necessary "in the sense that no more moderate measure exists to achieve that purpose with equal effectiveness".

It must be examined whether the processing has to be carried out in a particular pre-established way or whether, of all the options available, the one chosen is the most moderate, with the least impact on the rights and freedoms of the citizens concerned, and in accordance with the GDPR and the LOPDGDD.

We start from the concept of the necessity of the processing, which must not be confused with its usefulness. A facial recognition system may be useful, but that does not make it objectively necessary (and it is the latter that must actually be present). As WP29 established in Opinion 3/2012 on developments in biometric technologies, it must be examined whether the system "is essential for satisfying that need, rather than being the most convenient or cost effective".

Along these lines, the AEPD, analysing the necessity of a processing operation, has concluded that what matters is "whether or not it is necessary, in the sense that no more moderate measure exists to achieve that purpose with equal effectiveness, given that the activity could be carried out manually. The term necessity must not be confused with usefulness; the question is whether the processing is objectively necessary for the purpose" (see, for all, PS/00052/2020).

If there is no objective necessity for the processing now under analysis, if it is not essential to satisfy that need, the processing is neither proportionate nor lawful. Consequently, it is prohibited.

In the case examined, the facial recognition system may be useful, but it is not necessary: it is not the only means of achieving the intended purpose, as multiple alternatives exist, yet it is the only one capable of producing such a devastating interference with citizens' rights and freedoms. Consequently, it bears repeating, it is prohibited.

In the same vein, the EDPS, in an article of 28 October 2019 entitled "Facial recognition: A solution in search of a problem?", addresses this type of processing. It requires that facial recognition processing be "demonstrably necessary", that is, objectively necessary, with no other less intrusive alternative means available to achieve the same objective, and expressly states that "efficiency and convenience are not sufficient justification".

(Retrieved on 22 February 2020 from https://edps.europa.eu/press-publications/press-news/blog/facial-recognition-solution-search-problem_en.)

Moreover, for the sake of completeness and purely by way of illustration, we cannot ignore the fact that the convicted person could easily evade the facial recognition system with a simple mask, as explained in the AEPD's note on the "14 misunderstandings with regard to biometric identification and authentication"; it could therefore turn out that, once implemented, the system was not even useful or effective for the purpose claimed by the supermarket.

Here the principle of minimal intervention comes into play (Article 5(1)(c) and Article 25(1) GDPR), because it must also be demonstrated, within the framework of the controller's accountability, that no more moderate measure exists to achieve the intended purpose with equal effectiveness.

Although the court generically authorises the facial recognition system, it neither obliges Mercadona to install it nor prevents the use of another, less intrusive, system capable of achieving the same purpose. None of this would arise if, instead of installing a facial recognition system like the one now analysed, Mercadona chose another means of giving effect to the security measure (for example, an ordinary video surveillance system, with or without a security guard, that is, a non-remote, "one-to-one" approach).

Furthermore, the judicial body's authorisation is by no means a carte blanche, nor does it confer an unlimited right on Mercadona; it must comply with data protection law, especially because the deployment of this facial recognition system may de facto amount to the improper imposition of a security measure on all of Mercadona's customers and employees, as has in fact happened.

In the same vein, Report 36/2020 of the AEPD's Legal Office, on the use of facial recognition techniques in online examinations, pointed out that "the existence of a public interest does not legitimise any and every processing of personal data; regard must be had, first, to the conditions laid down by the legislator, as provided for in Article 6(2) and (3) GDPR, as well as to the aforementioned principles of Article 5 GDPR, in particular purpose limitation and data minimisation. And where any of the personal data falling within the special categories referred to in Article 9(1) GDPR are to be processed, one of the circumstances set out in Article 9(2) lifting the prohibition on processing such data, laid down as a general rule in Article 9(1), must apply."

Finally, with regard to proportionality in the strict sense, we must examine how many convictions the company has obtained, what measure was ordered in each of them, how many people and how many supermarkets those judgments concern, and whether all of this is proportionate in relation to the number of customers entering its stores each day and the total number of supermarkets it operates in Spanish territory.

We must therefore consider whether the adoption of such processing is weighted and balanced, that is, whether more benefits or advantages for the general interest derive from it than harm to other goods or values in conflict. Against Mercadona's interest in enforcing a restraining order (with respect to someone who has committed a minor offence on its premises) stand the rights to privacy and data protection of all its customers and employees.

At first glance, the processing is plainly excessive. In order to give effect to a security measure for an average of *** NUM. 3 people per year across the whole of Spain - according to the company's own calculations, from an average of *** NUMBER 2 judicial proceedings - for a limited period set in the judgment (a maximum of six months, as these are minor offences), once implemented in all its stores the system would end up monitoring an average of *** NUM. 7 customers per year (...). The measure would also affect its workforce, which numbers more than 100,000 employees.

Mercadona has 1,624 establishments in Spanish territory.

Or, to put it another way, in order to monitor the access of a single person to Mercadona, an average of *** NUM. 1 potential customers per day per store would be monitored (a figure to be multiplied by the number of stores affected by the security measure).
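
By way of editorial illustration of the scale reasoning above, the following minimal sketch uses purely hypothetical figures in place of the redacted numbers; none of these values comes from the decision.

<pre>
# Editorial sketch, not part of the decision: all figures are hypothetical
# placeholders for the redacted numbers, used only to show the scale reasoning.
people_under_measure = 50            # hypothetical: persons subject to a restraining measure per year
customers_per_store_per_day = 1000   # hypothetical: potential customers entering one store daily
stores_with_system = 48              # hypothetical: stores where the system operates
days_per_year = 365

faces_scanned_per_year = customers_per_store_per_day * stores_with_system * days_per_year
print(f"Faces scanned per year: {faces_scanned_per_year:,}")                                          # 17,520,000
print(f"Scans per person actually subject to a measure: {faces_scanned_per_year // people_under_measure:,}")  # 350,400
</pre>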

Mercadona argues that the system has been installed in only *** NUMBER 8 stores and that the above figures are therefore incorrect. In this regard, it should be noted that those *** NUMBER 8 establishments correspond to a "test" phase, and the highly plausible intention is to extend the system to all of its commercial establishments.

If, in order to enforce a security measure against one citizen, the personal data of everyone else must be processed massively and indiscriminately, the processing is clearly disproportionate. Add to this that we are dealing with biometric data processing intended to uniquely identify a person: a system would be installed in the private sphere that is not even used by the State Security Forces and Corps, which do pursue general-interest objectives.

Regarding the immense amount of data collected, it should also be added that there is no evidence that adequate technical measures were taken to prevent a possible transfer of such data to third parties, including third countries outside the EEA. The measure adopted is limited to a formal contractual prohibition between the company and the processor that owns the software used (*** COMPANY.2), subject to prior authorisation by the controller, without prior studies reliably demonstrating that the aforementioned transfer to third countries is technically impossible, given the extremely high (unacceptable) risk it would entail of diminishing the rights, guarantees and freedoms of those affected.

As regards the disproportionality of the processing, it must also be recalled that the personal data of anyone entering the supermarket are processed, whether or not they make a purchase, including minors below the age of criminal responsibility, who can in no case be affected by a conviction. The company argues that it is not possible to detect the age of the people concerned; all the more reason not to carry out this type of processing. The extremely high risk assumed by the processing is unacceptable.

      Also for these reasons there would be a violation of the principle
      data minimization (art. 5.1.c) RGPD).



Indeed, it can be seen from a simple examination of the administrative file that, in the DPIA, under the threat that "inadequate, irrelevant, excessive or unnecessary data for the intended purpose are processed", no mention is made of these data, which are entirely excessive (page 13). With regard to the principle of minimisation, the DPIA considers only the data of the convicted person, stating that "Only data derived from final judgments to which MERCADONA is a party, in which images have been provided as evidence during the proceedings and which determine that the restraining order be made effective through the possible use of new technologies [are processed]".

As for the principle of minimisation, which Article 5(1)(c) GDPR imposes on all processing of personal data, in view of the documentation submitted and the description of the processing carried out, the facial recognition system implemented by Mercadona in forty (*** NUM. 8) of its stores processes biometric data aimed at uniquely "identifying" a natural person through a "one-to-many" matching search, subject to Article 9 GDPR. This processing is also described in the literature as "remote mass" processing, to distinguish it from other automated facial biometric processing of a comparative "one-to-one" kind, aimed at "authenticating" a person against a database (which could also consist of facial images), whether automated or with human intervention in each check, and which has less intrusive features. That would be the case, for instance, of keeping the database of reference images (known persons) on an electronic device and limiting the operation to a manual "one-to-one" comparison in order to "authenticate", what the literature calls "non-remote mass" processing. There is no doubt that this latter type of processing would considerably minimise the risks to the rights, guarantees and freedoms of the persons entering the establishment, by limiting the processing to what is necessary and pertinent (principle of minimisation, Article 5(1)(c) GDPR).
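
Purely by way of illustration of the distinction drawn above between "one-to-one" authentication and "one-to-many" identification, the following minimal sketch (in Python, using hypothetical face-embedding vectors, a cosine-similarity comparison and an arbitrary matching threshold; it does not reproduce the system actually deployed) shows how the two comparison modes differ:

 import numpy as np
 
 def similarity(a, b):
     # Cosine similarity between two biometric templates (illustrative face embeddings).
     return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
 
 def verify_one_to_one(probe, enrolled, threshold=0.8):
     # "One-to-one" authentication: the probe is compared against a single enrolled template.
     return similarity(probe, enrolled) >= threshold
 
 def identify_one_to_many(probe, database, threshold=0.8):
     # "One-to-many" identification: the probe is compared against every template in a
     # watch-list; the best match above the threshold is returned, or None if nobody matches.
     best_id, best_score = None, threshold
     for person_id, template in database.items():
         score = similarity(probe, template)
         if score >= best_score:
             best_id, best_score = person_id, score
     return best_id
 
 # Illustrative use, with random vectors standing in for enrolled facial templates.
 rng = np.random.default_rng(0)
 watch_list = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
 probe = watch_list["person_A"] + rng.normal(scale=0.05, size=128)  # noisy capture of person_A
 print(verify_one_to_one(probe, watch_list["person_A"]))  # True: one targeted comparison
 print(identify_one_to_many(probe, watch_list))           # "person_A": every entry is scanned

The point relevant to the minimisation analysis above is visible in the loop: in the "one-to-many" mode every captured face is compared against the whole watch-list, whereas the "one-to-one" mode involves a single, targeted comparison.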

Consequently, this processing operation, in the terms set out above, infringes Article 5(1)(c) GDPR, an infringement defined in Article 83(5)(a) of that Regulation and classified as very serious for the purposes of prescription in Article 72(1)(a) LOPDGDD, in that it processes personal data that are excessive in relation to the purpose pursued.

                                              IX

An impact assessment must be carried out before commencing any high-risk processing, in order to be able to detect, where appropriate, unacceptable risks that would preclude the processing.

In the case under analysis, moreover, a DPIA had to be carried out.

It is mandatory when it is "likely that the processing operations will result in a high risk to the rights and freedoms of natural persons" (recital 84 GDPR), must be performed "prior to the processing" (recital 90 GDPR), and must be carried out under the terms of Article 35 GDPR. The processing intended by Mercadona is included in the list of types of processing operations that require a data protection impact assessment (Article 35(4)). The DPIA must include the proportionality assessment referred to above.

Before implementing a "one-to-many" facial recognition system, the controller should first assess whether there is another, less intrusive system with which the same purpose can be achieved. Paragraph 72 of the EDPB Guidelines 3/2019 on processing of personal data through video devices clarifies in this regard that "The use of biometric data and in particular facial recognition entail heightened risks for data subjects' rights. It is crucial that recourse to such technologies takes place with due respect to the principles of lawfulness, necessity, proportionality and data minimization as set forth in the GDPR. Whereas the use of these technologies can be perceived as particularly effective, controllers should first of all assess the impact on fundamental rights and freedoms and consider less intrusive means to achieve their legitimate purpose of the processing."


However, Mercadona requested the adoption of a security measure before the courts consisting of facial recognition processing before assessing the risks involved and the need to carry out a DPIA, something that does not appear from the administrative file - as evidenced by the fact that the DPIA post-dates the request for that security measure in a number of criminal proceedings. Even though the DPIA pre-dates the actual implementation of the processing, a proper understanding of proactive responsibility and of privacy by design implies assessing, from the very moment a personal data processing operation is first outlined, whether it can be carried out at all. Thus, the moment at which the idea arose of requesting from the courts a security measure consisting of facial recognition processing should have been the occasion to assess and identify the risks to the rights and freedoms of citizens.

It should be added that the risks derived from such automation are high in themselves and, indeed, unacceptable, because the initial inherent risk cannot be reduced
to adequate levels (residual risk), since there is a legal prohibition as laid down in Article 9(1) GDPR. Such processing takes place without human intervention as soon as the corresponding system is installed and activated, so that the person concerned cannot prevent the processing of their personal data, in particular as regards the exercise of the rights to erasure and to object, which may amount to an infringement of Article 35 GDPR, defined in Article 83(4)(a) of that Regulation and classified as serious for the purposes of prescription in Article 73(t) LOPDGDD (in this regard, see the WP29 Guidelines WP248 already cited).


                                              X

This approach ignores and fails to consider the possibility that the data of all potential customers entering the supermarket are being processed in a way that is inadequate, irrelevant, excessive or unnecessary for the intended purpose. It has not been considered for a moment that this is precisely the situation of minors below the age of criminal responsibility.

Although, in principle, the personal data of minors are not specially safeguarded merely on account of their age, it is also true that the legal system affords minors special protection because of their particular vulnerability. In the field of personal data protection, this protection runs from Convention 108 of the Council of Europe - "specific attention shall be given to the data protection rights of children and other vulnerable individuals" - through the GDPR and the LOPDGDD, to Organic Law 1/1996 of 15 January on the Legal Protection of Minors and the partial amendment of the Civil Code and the Civil Procedure Act.

The latter establishes in Article 2 that "Every minor has the right to have their best interests assessed and considered as paramount in all actions and decisions concerning them, both in the public and in the private sphere", and makes further provision in Article 4 on the right to honour, personal and family privacy and one's own image, and in Article 22 quater on the processing of personal data.

Article 28(2) LOPDGDD identifies, as one of the risks of greater impact that the controller and the processor must address: "(e) when data of groups of affected persons in a situation of special vulnerability are processed and, in particular, data of minors and persons with disabilities".

In this regard, recital 38 GDPR should be highlighted, which states that "Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child."

For this reason, the AEPD has set out in its guidance specific recommendations for the protection of minors, as is the case with video surveillance in relation to the capture of images in school environments.


                                               XI

With regard to transparency, in relation to the information provided to data subjects, there are several aspects to review.

As a preliminary point, it should be noted that in the present case the processing under analysis does not comply with the rules of the GDPR, as indicated above, and is therefore a prohibited processing operation. Nevertheless, the information posters are briefly analysed below.

First, as regards the posters, they state that the system is intended "to detect only those persons with a restraining order or analogous judicial measure in force who may pose a risk to your safety".

Those convicted persons pose a risk to the property and facilities of the supermarket, which is why they were convicted. Any risk to the safety of customers is clearly indirect and very tangential, and the safety of customers is already covered by the ordinary video surveillance system. There is no transparency in the information.

In the administrative file, in the DPIA, the processing is described in context as follows (quoted literally): "Facial recognition system to identify external agents with a restraining order in force, issued in the framework of a final judgment enabling the use of technological means to make it effective, who are harmful to MERCADONA's employees and centres" (page 4).

The same appears where that document sets out the purpose of the processing, which is again confined to the security of the company's employees and its assets (Mercadona centres): "Facial recognition system to identify external agents with a restraining order in force, issued in the framework of a final judgment allowing the use of technological means to make it effective, who are harmful to MERCADONA's employees and centres" (page 6) (private interests).

The supermarket chain's customers are not mentioned as potential beneficiaries of "your safety". Surprisingly, they are mentioned in the posters referred to above and in the information shown to employees so that they can give explanations to potential customers.



The information provided is not correct, nor does it fit the stated purpose (to make a security measure effective), since the system is not put in place to protect customers but to protect Mercadona, as a result of its having obtained a judgment favourable to its interests (containing a penalty for the convicted person). In any event, the ordinary security system is sufficient to guarantee the safety of customers (Article 22(1) LOPDGDD). It is not necessary to set up a facial recognition system such as the one under analysis in order to guarantee the safety of customers; if it were necessary for that purpose, it would be the system ordinarily installed in all kinds of facilities. This facial recognition system is, however, an extraordinary security system, since biometric data are processed for the purpose of uniquely identifying a person, "one-to-many" and remotely, and therefore fall within the special categories of personal data (Article 9 GDPR).

As pointed out above, the information provided on the posters is the same in all supermarkets, without specifically indicating in which of them the system is activated, whether the mere display of the poster means that it is activated, for how long it is activated (duration of the security measure), or what the specific purpose is.

Customers are given the impression that the system is installed in all supermarkets, and permanently. Potential customers are deprived of the possibility of not entering a specific supermarket and choosing another in which the facial recognition system is not installed. This de facto limits the right to informational self-determination, freedom and privacy. The risks derived from this incorrect information are clear: the impairment of the fundamental rights and freedoms of those affected.

The information should indicate whether or not the system is installed, especially if, as Mercadona asserts, it will only use the system "in the event that it is a party to judicial proceedings in which a final decision determines the use of facial recognition to enforce restraining orders".

Second, in the case of such invasive technologies and, for the reasons set out above concerning minors and other vulnerable groups deserving special protection, the information provided should be specific to them.

Recital 58 GDPR, on the principle of transparency (information), states that "(…) Given that children merit specific protection, any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand". And Article 12 GDPR provides that "The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in
a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child (…)".

Although the DPIA states that "The use of innovative technologies such as facial recognition poses a risk to data subjects owing to their novelty and the lack of knowledge about how they operate. MEASURES: clear and transparent information is provided about the processing and the technology used" (page 17), there is no additional, specific measure to convey the information adequately to minors and other vulnerable groups. The information provided is the same for everyone.

Third, with regard to transparency and possible international transfers, which the company asserts will not take place, the truth is that the data processing agreement itself contemplates the possibility of international transfers in certain cases: "8.2. In the event of a transfer of personal data to a third country outside the European Union, to a country that does not have an adequate level of protection, or to an international organisation, the Data Processor must obtain the prior written authorisation of the Data Controller and cooperate in guaranteeing an adequate framework of protection under the regulations in force, through the application of binding corporate rules, the conclusion of standard contractual clauses adopted by the European Commission or, where appropriate, obtaining authorisation for the transfer from the competent authority". Customers are not informed of that possibility, nor is it established how they would be informed if that scenario ultimately occurred. The absence of technical measures to prevent possible undue international transfers has already been noted above.

The lack of transparency in the information, which prevents those affected from being warned that the processing implemented is not permissible but is, rather, prohibited, constitutes another of the volitional elements of liability.

Consequently, the information provided by the company both to the general public and to its employees infringes Article 12 GDPR, as it fails to meet the requirements laid down in Article 13 of that Regulation, an infringement defined in Article 83(5)(b) and classified as very serious for the purposes of prescription in Article 72(1)(h) LOPDGDD.

                                               XII

The foregoing also applies to the information provided in the "privacy policy", which is limited to stating generically, with regard to the facial recognition or early detection system, the following:

Data categories: "biometric data (in those stores in Spain where the early detection system is implemented)".

      Purpose: "Carry out the necessary actions to protect the interests
      vital of the clients when it is necessary, or the fulfillment of the

      judicial decisions and the measures agreed upon therein ”.

      Data maintenance time: "In relation to the protection of the
      vital interest of the people and the execution of the judgments or resolutions that
      carry restraining orders on the work centers and / or people, the

      data will be processed and guarded for the necessary time to give
      compliance with the judicial measures of those people sentenced to

      said restraining order (in those stores in Spain where it is
      early detection system implemented.

      However, the data collected accessory to comply with said
      purpose will remain on the server only in the process of

      check (this check takes tenths of a second). One time
      Once this check is performed, it will proceed to be definitively destroyed (in
      those stores in Spain where the detection system is implemented

      anticipated) ”.

      International transfers: “In those cases in which Mercadona
      have service providers or suppliers that are outside the

      European Union, international transfers made with them are
      fully guaranteed according to the standards established by the
      Regulation (EU) 2016/679 of the European Parliament and of the council of April 27

      of 2016, and criteria of the Spanish Agency for Data Protection ”.

      Legitimation: "In the case of the treatment of sensitive data
      will be treated for reasons of public interest with the consequent

      considerations provided by data protection regulations, which must be
      proportional to the objective pursued, which is to enforce the law, respecting the
      remaining principles of the data protection regulations and establishing the

      adequate and specific measures to protect the interests and rights of
      interested, on the basis of the law of the Union or of the member states
      (in those stores in Spain where the detection system is implemented

      anticipated) ”.

      Data communication: “The State Security Forces and Bodies in
      by virtue of what is established in the law ”.

      Other information: "In the same way we inform you that, in order to improve the

      security of customers and employees, Mercadona, based on the public interest can
      process your image or your biometric facial profile to identify subjects with a
      restraining order 8th analogous judicial measure) in force against Mercadona or

      against any of its workers (in those stores in Spain where
      the early detection system is in place).



These images will be processed only internally by Mercadona and will be communicated exclusively to the Security Forces and Bodies in order to protect the safety of Mercadona's customers and workers and to comply with judicially ordered measures (in those stores in Spain where the early detection system is implemented)".

Rights: (…) with regard to objection, "In certain circumstances and for reasons related to their particular situation, data subjects may object to the processing of their data. Mercadona will cease to process the data, except for compelling legitimate grounds or for the exercise or defence of possible claims".

                                             XIII

There are, on the other hand, the risks derived from errors in identification, that is, the misidentification of a person who is not barred from entry by the security measure, risks intrinsically linked to the data protection by design and by default referred to in Article 25(1) GDPR.

In these facial recognition systems, a template (pattern) is used to perform the facial recognition - itself the result of an initial processing of personal data and likewise constituting personal data, prepared and falling within the scope of the right of access that may be exercised. It is, however, known that "the stored biometric information (e.g. the template) allows the original biometric information (e.g. the face) to be partially reconstructed. Such partial reconstruction is sometimes faithful enough for another biometric system to recognise it as the original" (AEPD, "14 misunderstandings with regard to biometric identification and authentication"). This in turn points to the need to carry out regular assessments to verify the relevance and sufficiency of the guarantees provided (section 4 of the EDPB Guidelines 3/2019 on processing of personal data through video devices).

There are several studies in the field of facial recognition, both of the "one-to-one" type (biometric data) and of the "one-to-many" type (special category biometric data), which point to high error rates in certain scenarios, inherent to a still-incipient technology and to the limited data on which the artificial intelligence systems applied have been trained. In this respect, the enormous global demand for "data" to feed this type of software makes it necessary to take measures, at least of a technical nature, to prevent undue transfers and, in particular, possible international transfers that would make it possible in the future to identify the person concerned in environments and for purposes very different from the original ones.

In this regard, the studies carried out by C.C.C. showed that the highest error rates in the identification of individuals through facial recognition occur with people of colour and with women (in the latter case, whatever their skin colour).

In this second case, the misidentifications stem from the very small number of images of women contained in the training sets and the test sets (which mostly use images of white men). The study also finds that facial recognition does not work well for children or for the elderly. C.C.C. identifies the existence of what is known as algorithmic bias.

(*** URL.1)

*** URL.2

In addition, mention must be made of the identification errors that can currently occur as a result of the pandemic, which requires everyone to wear masks. The United States National Institute of Standards and Technology (NIST) has, since 2002, conducted various independent evaluations of commercial facial recognition systems: the Face Recognition Vendor Test. One of its evaluations focused on the widespread use of masks, concluding that the error rate of today's most widely used facial recognition algorithms shoots up to between 5% and 50%.

(Retrieved 22 February 2021 from https://www.nist.gov/programs-projects/face-recognition-vendor-test-frvt
https://pages.nist.gov/frvt/html/frvt_facemask.html
https://www.nist.gov/news-events/news/2020/07/nist-launches-studies-masks-effect-face-recognition-software)

Identification errors also occur in relation to relatives and siblings, as stated by the AEPD in its note on the "14 misunderstandings with regard to biometric identification and authentication".

It is true that the error rate foreseeable from the design stage is a controversial issue, since further technological development in the more or less near future will improve accuracy rates.

(Retrieved 22 February 2021 from https://itif.org/publications/2020/01/27/critics-were-wrong-nist-data-shows-best-facial-recognition-algorithms)

But, as matters stand today, it is one more risk that cannot be afforded, since inaccuracy is foreseeable from the design stage of this type of information system when it comes to identifying the convicted person, and confusing that person with someone else may generate an unacceptable risk of discrimination and social exclusion. And this is in addition to all the considerations set out above regarding the absence of legislation that would legitimise the processing (a prohibited processing operation) and
ensuring the appropriate level of proportionality vis-à-vis the rights and freedoms of those affected.

The breach of data protection by design infringes Article 25(1) GDPR, an infringement defined in Article 83(4)(a) and classified as serious for the purposes of prescription in Article 73(d) LOPDGDD.

                                              XIV

With regard to the risks derived from the processing, it must be borne in mind that facial recognition constitutes a method of involuntary identification through the use of biometric data, as stated in the Ethics Guidelines for Trustworthy AI, a document presented in 2019 and produced by the High-Level Expert Group on Artificial Intelligence set up under the auspices of the European Commission.

The risks derived from such automation are very high in themselves, since a person cannot prevent the processing of their personal data: such processing (the capture and subsequent processing of the biometric data of their face, in the case of facial recognition) takes place automatically, without human intervention, as soon as the corresponding system is installed and activated.

Indeed, that document lists among its first and foremost concerns the identification and tracking of individuals by means of artificial intelligence techniques and, for what is relevant here, notes that automatic identification raises serious concerns from both a legal and an ethical point of view, since it can have unexpected effects at many psychological and sociocultural levels; it therefore distinguishes between the identification of a person and their tracking and tracing, and between targeted and mass surveillance.

It likewise asserts that the application of this type of technology must be clearly justified under existing legislation, which is not the case here.

Furthermore, it cannot be ignored that the implementation of a facial recognition system such as the one under analysis collects far more information about the individual than other types of processing, and that the person affected cannot prevent this, as a consequence of the automation and the algorithms applied, since "depending on the biometric data collected, data about the individual may be derived such as race or gender (including from fingerprints), their emotional state, illnesses, defects and genetic characteristics, substance use, etc. Being implicit, the user cannot prevent the collection of this supplementary information" (AEPD note on the "14 misunderstandings with regard to biometric identification and authentication").

With regard to the risks of social exclusion, the risks of discrimination and the principle of accuracy, it should be noted that two major risks of
social exclusion deriving from a possible malfunction of the system implemented by the company can be identified.

In this regard, the Guidelines 3/2019 on processing of personal data through video devices (version for public consultation, adopted on 10 July 2019) state that "In addition to privacy issues, there are also risks related to possible malfunctions of these devices and the biases they may induce. Researchers report that software used for facial identification, recognition, or analysis performs differently based on the age, gender, and ethnicity of the person it's identifying. Algorithms would perform based on different demographics, thus, bias in facial recognition threatens to reinforce the prejudices of society. That is why, data controllers must also ensure that biometric data processing deriving from video surveillance be subject to regular assessment of its relevance and sufficiency of guarantees provided."



On the one hand, there is a long-term risk of discrimination against a person with a criminal conviction (even after having served the sentence and having had the criminal record expunged) who continues to be identified as being subject to a restraining order in relation to the supermarkets.

The DPIA must assess all matters relating to the principle of accuracy; there is no evidence that the one carried out by the company specifically assessed and considered these risks beforehand, which has led to undue processing operations with impairment of the guarantees, rights and freedoms of those affected. It must be added that the DPIA provided by the company likewise contains no assessment of any impact on the minors who access the premises or on the company's employees, and that it empties of substance the exercise of certain rights contained in Articles 12, 13 and 15 to 22 GDPR.

These deficiencies in the preparation of the DPIA, with the consequences described, must be regarded as a substantial defect that de facto invalidates the DPIA carried out. Consequently, the lack of awareness of the possible impacts of the
data processing implemented on the freedoms and rights of those affected and, as a result, the absence of corrective measures that would minimise that impact or, as is the case here, rule the processing out, constitutes an infringement of Article 35 GDPR, defined in Article 83(4)(a) of that Regulation and classified as serious for the purposes of prescription in Article 73(t) LOPDGDD.

For purely illustrative purposes, it should be noted that some companies have abandoned their facial recognition businesses and programmes because of the interference with privacy and the clear risks of racial discrimination.

There is also a general risk in the use of biometric data obtained from facial recognition, in that it turns everyone who enters the supermarket into a potential suspect, subject to indiscriminate biometric surveillance (it does not distinguish by group, age, vulnerability, etc.), which amounts to an abuse of the use of biometric data and a clear interference with the fundamental rights and public freedoms of citizens. This is how it has been understood by the European Citizens' Initiative (ECI) entitled "Civil society initiative for a ban on biometric mass surveillance practices", submitted to the European Commission in January 2021.

With regard to the specific risks for vulnerable individuals, the European Union Agency for Fundamental Rights (known by its acronym FRA) produced in 2019 a document entitled "Facial recognition technology: fundamental rights considerations in the context of law enforcement". In it, the Agency examines, in addition to the risks to privacy, personal data protection and non-discrimination posed by processing with a facial recognition system, other rights, freedoms and legal interests affected.

It makes specific mention of certain particularly vulnerable groups, namely minors, the elderly and persons with disabilities.

With regard to minors, it states that "Facial recognition systems affect the rights of children in different ways. […] The child's best interests must also be given a primary consideration in the context of using facial recognition technology for law enforcement and border management purposes. […] Due to the particular vulnerability of children, the processing of their biometric data, including facial images, must be subject to a stricter necessity and proportionality test, compared to adults. […] Software tests clearly indicate that images of younger people result in considerably more false negatives (misses) compared to other age groups, most probably due to rapid growth and change in facial appearance."


Therefore, given the special protection that the legal system affords to children, the assessment of the proportionality of the processing of minors' personal data through biometric systems must be subject to a much stricter necessity and proportionality test than the one that would apply to adults. This is not reflected in the DPIA carried out by Mercadona. Its examination is entirely generic and omits groups at high risk, a circumstance which, had it been taken into account, would have revealed an extremely high, unacceptable and therefore prohibited risk.

As for the risks to the rights and freedoms of Mercadona's employees, they were not even considered in the DPIA submitted.

The right to informational self-determination was mentioned earlier. Linked to it, together with the right to privacy, arises the real risk of a loss of freedom and privacy. Judgment 600/2019 of the First (Civil) Chamber of the Supreme Court of 7 November 2019 (appeal 5187/2017) examined the impact on the right to privacy of the installation of a dummy camera; it thus recognised, as part of the right to privacy, the right not to have to endure permanent uncertainty as to whether a camera, real or dummy, may or may not be switched on. It is true that that case concerned a camera directed at a private property and not at a space open to the public, but it serves to illustrate the impact on privacy. The undeniable fact is that nobody behaves in the same way when they are being recorded, or believe they are; if a dummy camera can have a more than significant impact on privacy, whether located in a private or a public space, one can imagine the repercussions of a fully operational camera and, in addition, the impact of the massive and indiscriminate use of "one-to-many" facial recognition. The risk is heightened by the lack of adequate information on the posters, as explained in the previous sections.

WP29 Opinion 3/2012 on developments in biometric technologies considers that "However, these systems used on a large scale can produce serious side effects. In the case of facial recognition, where biometric data can be easily captured without the knowledge of the data subject, widespread use could put an end to anonymity in public spaces and allow continuous monitoring of individuals."


It should be added, with regard to the risks related to the exercise of rights, that the DPIA submitted by the company (page 17) identifies as one of the threats to the group of persons accessing the supermarkets that "No means have been made available, or the data subject has not been informed, regarding the option to object to automated decision-making", explaining that "Although the data subjects are informed of the possibility of exercising their right to object (on the basis of the legal ground in Article 6 GDPR), this may present certain risks".

Subsequently, among the measures to be adopted, it states that "pursuant to Article 21.1, MERCADONA must stop processing the data unless it demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject, or for the establishment, exercise or defence of claims".

Since the data processing performed by the facial recognition system is automatic, massive and remote, and the image is captured and processed automatically, this measure (giving effect to the right to object or to erasure) is impossible to implement other than by uninstalling the system in all the supermarkets where it has been deployed. If a data subject exercises their right to object or to erasure and is entitled to it, their objection affects the processing of data by the supermarket from the very capture of the facial image, regardless of which of the company's supermarkets the data subject enters.

In the documentation provided by the company (doc. 7.1 and doc. 7.2), the refusal of the right to object that was exercised is not justified, relying generically on the existence of "… an overriding public interest…". Recital 69 GDPR states: "Where personal data might lawfully be processed because processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller, or on grounds of the legitimate interests of a controller or of a third party, a data subject should, nevertheless, be entitled to object to the processing of any personal data relating to his or her particular situation. It should be for the controller to demonstrate that its compelling legitimate interest overrides the interests or the fundamental rights and freedoms of the data subject." To the same effect, Article 21(1) GDPR: "… The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims."

This would de facto empty of content the right to object and the right to erasure, it being recalled that a limitation of these rights can only be established by
EU or Member State legislative measures, under the terms of recital 73 and Articles 23 and 89 GDPR.

                                             XV


Moreover, this approach is not unique at European level, since other supervisory authorities follow it.

In this regard, the supervisory authority of the Netherlands issued a formal warning to a supermarket for its use of facial recognition.

The system implemented, the purpose for which it was set up and the question of its lack of a legal basis in relation to the facial recognition processing used by a Dutch supermarket chain are almost identical to the Mercadona case.

There, too, the processing was implemented to prevent certain persons from entering the supermarkets, in response to a ban issued for that purpose. The supermarket claimed that the facial recognition system had been implemented in order to protect its customers and staff and to prevent theft in its shops. The cameras were likewise located at the entrance to the stores and, in the same way as in the Mercadona case, every person entering the store was scanned and compared against the database of persons banned from entering, the processed data being deleted after a few seconds if the person was ruled out.

The Vice-President of the Dutch supervisory authority stated that "It is unacceptable that this supermarket, or any other store in the Netherlands, should start using facial recognition technology", noting that the use of this technology is prohibited in almost all cases, and went on to explain that "Facial recognition turns us all into walking barcodes", and that "Your face is scanned every time you enter a store, stadium or sports arena that uses this technology. And it is done without your consent. By putting your face through a search engine, there is the possibility that your face could be linked to your name and other personal information. This could be done by matching your face to your social media profile, for instance."

The Dutch supervisory authority also considers that, with the installation of facial recognition cameras, we can be monitored continuously, and that there is an extremely high (unacceptable) risk of subsequent use of the information to flag us as suspicious or of interest, or to profile us.

That supervisory authority goes on to state that there are two cases in which the use of facial recognition is permitted. The first is based on the customer's explicit consent to the processing of their data;
informing the customer that the technology is used in the stores does not constitute explicit consent, and entering a supermarket cannot be understood as giving consent.

In the case under examination here, Mercadona intends to process the biometric data of potential customers without asking for their consent, relying on one of the exceptions set out in Article 9(2) GDPR which, as explained above, is not applicable.

The second exception applies where facial recognition technology is necessary for security purposes, but only where there is a substantial public interest. The supermarket claimed that this was the case, but the Dutch supervisory authority did not see it that way. Its Vice-President indicated that the only example in that country is security at a nuclear power plant.

(Retrieved 19 February 2021 from https://edpb.europa.eu/news/national-news/2021/dutch-dpa-issues-formal-warning-supermarket-its-use-facial-recognition_es)

For its part, the European Data Protection Supervisor, as stated above, published an article on 28 October 2019 entitled "Facial Recognition: A solution in search of a problem?" addressing this type of processing.

(Retrieved 22 February 2020 from https://edps.europa.eu/press-publications/press-news/blog/facial-recognition-solution-search-problem_en)

That article states that "The purposes that triggered the introduction of facial recognition may seem uncontroversial at a first sight: it seems unobjectionable to use it to verify a person's identity against a presented facial image, such as at national borders including in the EU. It is another level of intrusion to use it to determine the identity of an unknown person by comparing her image against an extensive database of images of known individuals."



In other words, it raises more than reasonable doubts because of the intrusion involved in "using it to determine the identity of an unknown person by comparing her image against an extensive database of images of known individuals" (one-to-many).

And it adds that "any interference in fundamental rights under the Article 52 of the Charter must be demonstrably necessary. The bar for this test becomes higher the deeper the interference. Is there any evidence yet that we need the technology at all? Are there really no other less intrusive means to achieve the same goal? Obviously, 'efficiency' and 'convenience' could not stand as sufficient."


Another point worth highlighting in that article is the reference to respect for the principles of data minimisation and accuracy, where it notes that "Facial recognition technology has never been fully accurate, and this has serious consequences for individuals being falsely identified whether as criminals or otherwise. The goal of 'accuracy' implies a logic that irresistibly leads towards an endless collection of (sensitive) data to perfect an ultimately unperfectible algorithm. In fact, there will never be enough data to eliminate bias and the risk of false positives or false negatives."


                                             XVI

In the present case, it must be concluded that the processing of personal data through facial recognition, in the terms in which the company has implemented it in its supermarkets, does not qualify for the exemption in Article 9(2)(f) GDPR from the general prohibition imposed by Article 9(1) of that Regulation. Consequently, from that point onwards it is not possible to legitimise the processing on the basis of the lawfulness grounds of Article 6 GDPR. The processing implemented is prohibited in accordance with Article 9(1) GDPR, irrespective of the security measures and the lawfulness conditions set out in Article 6 GDPR.

Notwithstanding the foregoing, it would not be lawful to rely directly on Article 6(1)(e), since it cannot be accepted that the identification measure implemented protects the public interest; rather, it protects
the private or particular interests of the company in question, whereas the public interest would in any event have to be essential. Likewise, the legal basis provided for in Article 6(1)(b) GDPR is not valid for employees either, since this is processing that falls outside the video surveillance system. Moreover, there is no legal rule that permits it for the purposes of Article 8 LOPDGDD. It must be stressed that the processing under analysis is prohibited from the outset, as indicated in Article 9(1) GDPR.

Furthermore, the company does not comply with the right to information required by Articles 12 and 13 GDPR. In this respect, no meaningful information is provided on the logic applied in the facial recognition processing, nor are those affected able to exercise their rights, given the immediacy of the processing. It must be stressed that the processing under analysis is prohibited from the outset, as indicated in Article 9(1) GDPR.

Nor does it appear that the principle of minimisation laid down in Article 5(1)(c) GDPR is complied with. Processing operations carried out by means of facial recognition technology are extremely high-risk (unacceptable) operations, with a high probability of occurrence and a high severity, which makes the inherent risk very high and its reduction to an acceptable residual risk very difficult; with a high probability, they would allow processing of various kinds to be carried out (including processing covered by Article 9(1) GDPR) with a major impact going beyond what is strictly necessary. Faced with an "unacceptable" level of risk, recourse must be had to the prior consultation provided for in Article 36 GDPR, which was not carried out. Account must also be taken of the incorrect assessment of the impact on the rights and freedoms of those affected, since it does not cover all of the data subjects involved. It must be insisted that the processing under analysis is prohibited from the outset, as indicated in Article 9(1) GDPR.

In addition, and without prejudice to the fact that the processing under analysis is prohibited from the outset as indicated in Article 9(1) GDPR, irrespective of the security measures implemented, the processing does not incorporate appropriate safeguards by design, since the system implemented carries out a systematic and extensive evaluation of personal aspects of natural persons on a large scale using special category data. Indeed, it is on record that the entity responsible for the logic applied to the processing merely undertakes to guarantee a level of security appropriate to the risk which, where appropriate, includes, among other measures, pseudonymisation. Consequently, the design admits the possibility that the data processing is carried out on persons identified remotely, massively and indiscriminately.

Finally, and taking into account all of the above, in particular the high risk of infringement of the rights and freedoms of those affected by the processing under analysis, it is considered proportionate to maintain the
precautionary measure imposed, since this is a processing operation prohibited from the outset in accordance with Article 9(1) GDPR.


                                              XVII

      The facts analyzed could constitute an infringement, attributable to the
      claimed, for violation:


       of art. 9 of the RGPD (treatment of special categories of data),
      typified in article 83.5.a) of said rule and considered very serious for the purposes of
      prescription in art. 72.1.e) of the LOPDGDD, which may be sanctioned with

      fine of up to € 20,000,000 or, in the case of a company, of
      an amount equivalent to a maximum of 4% of the total annual turnover
      overall of the previous financial year, opting for the highest amount, of

      in accordance with article 83.5.a) of the RGPD.

       of art. 6 of the RGPD (legality of treatment), typified in article 83.5.a) of

      said rule and considered very serious for the purposes of prescription in art. 72.1.a)
      of the LOPDGDD, which may be sanctioned with a fine of up to € 20,000,000
      at most or, in the case of a company, an amount equivalent to 4%

      maximum of the total annual global turnover of the financial year
      above, opting for the highest amount, in accordance with article 83.5.a)
      of the GDPR.


- Articles 12 and 13 of the GDPR (transparency of the information provided to the different groups affected), typified in Article 83.5.b) and considered very serious for limitation purposes under Article 72.1.h) of the LOPDGDD, which may be sanctioned with a fine of up to €20,000,000 or, in the case of an undertaking, an amount equivalent to a maximum of 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher, in accordance with Article 83.5.b) of the GDPR.

- Article 5.1.c) of the GDPR (principle of data minimisation), typified in Article 83.5.a) and considered very serious for limitation purposes under Article 72.1.a) of the LOPDGDD, which may be sanctioned with a fine of up to €20,000,000 or, in the case of an undertaking, an amount equivalent to a maximum of 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher, in accordance with Article 83.5.a) of the GDPR.

- Article 25.1 of the GDPR (data protection by design), typified in Article 83.4.a) and considered serious for limitation purposes under Article 73.d) of the LOPDGDD, which may be sanctioned with a fine of up to €10,000,000 or, in the case of an undertaking, an amount equivalent to a maximum of 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher, in accordance with Article 83.4.a) of the GDPR.


- Article 35 of the GDPR (data protection impact assessment), typified in Article 83.4.a) and considered serious for limitation purposes under Article 73.t) of the LOPDGDD, which may be sanctioned with a fine of up to €10,000,000 or, in the case of an undertaking, an amount equivalent to a maximum of 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher, in accordance with Article 83.4.a) of the GDPR.

Likewise, it is considered that the sanctions to be imposed should be graded in accordance with the following criteria, as indicated in Article 83 of the GDPR:

Article 83.1 of the GDPR (effective, proportionate and dissuasive; size of the company):

"1. Each supervisory authority shall ensure that the imposition of administrative fines pursuant to this Article in respect of infringements of this Regulation referred to in paragraphs 4, 5 and 6 shall in each individual case be effective, proportionate and dissuasive."

The respondent had a turnover in 2019 (the latest published audit report) of more than €25,000 million and 90,000 employees, with 1,636 stores open, and therefore constitutes a large company.
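As a rough illustration, using only the turnover figure stated above, the "whichever is higher" rule means that for an undertaking of this size the turnover-based ceiling, rather than the fixed amount, sets the applicable maximum for the very serious infringements:

4% × €25,000 million = €1,000 million, which is well above €20,000,000.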

Article 83.2 of the GDPR:

"(a) the nature, gravity and duration of the infringement taking into account the nature, scope or purpose of the processing concerned as well as the number of data subjects affected and the level of damage suffered by them"

The data processed are of a special category and the volume of data processed may exceed *** NUM. 7 facial examinations per year, including those of minors and vulnerable people. The processing is carried out remotely, massively and indiscriminately.


      "B) intentionality or negligence in the infringement"


      The development of the early detection system has been promoted by the
      responsable. There is no evidence that the respondent has chosen to make a consultation
      prior to the AEPD as indicated in art. 36 of the GDPR, even though the

      implanted treatment constitutes an extremely high risk
      unacceptable source for the rights and freedoms of customers and employees
      of the mercantile.

      "D) the degree of responsibility of the person in charge or the person in charge of the treatment,

      taking into account the technical or organizational measures that have been applied in
      under articles 25 and 32 "



The degree of responsibility is fully attributable to the respondent, since the deficiencies and incompatibilities of the processing stem from its own decisions and responsibility, specifically as regards the purposes and means.

      "G) the categories of personal data affected by the infringement"

      From the design of the implanted security system, it is established that a

      systematic and exhaustive evaluation of personal aspects of natural persons
      large-scale special category data.

"(h) the manner in which the infringement became known to the supervisory authority, in particular whether, and if so to what extent, the controller or processor notified the infringement"

It is clear that the AEPD became aware of the processing now analysed through two complaints unrelated to the respondent.

      "K) any other aggravating or mitigating factor applicable to the circumstances of the
      case, such as financial benefits obtained or losses avoided, direct

      or indirectly, through the infringement ”.

Article 76 of the LOPDGDD. Sanctions and corrective measures:

"1. The sanctions provided for in paragraphs 4, 5 and 6 of Article 83 of Regulation (EU) 2016/679 shall be applied taking into account the grading criteria established in paragraph 2 of that Article.

2. In accordance with the provisions of Article 83.2.k) of Regulation (EU) 2016/679, the following may also be taken into account:

As aggravating factors:

a) The continuing nature of the infringement.

It is established that the processing was carried out from 1 July 2020 until 05/06/2021.


b) The link between the offender's activity and the carrying out of processing of personal data.

The respondent is a large company in the general distribution sector, with CNAE code 4711, "Retail trade in non-specialised establishments", and habitually processes the personal data of customers and workers.

      (…)

      f) Affecting the rights of minors.

      It is established that the data processing implemented affects minors and
      vulnerable people accessing the establishments.


      (…)

3. The adoption, where appropriate, of the remaining corrective measures referred to in Article 58.2 of Regulation (EU) 2016/679 shall also be possible, on a complementary or alternative basis.

4. The information identifying the offender, the infringement committed and the amount of the penalty imposed shall be published when the competent authority is the Spanish Data Protection Agency, the penalty exceeds one million euros and the offender is a legal person. Where the authority competent to impose the sanction is a regional data protection authority, its own applicable rules shall apply."


As mitigating factors:

Article 83.2 of the GDPR:

e) There is no record of recidivism or repeated infringements. This mitigating factor has been of particular relevance in establishing the amount of the pecuniary fine now proposed.

In view of the foregoing, it is considered proportionate, effective and dissuasive to impose the following administrative fines, as provided for in Article 58.2.i) of the GDPR:

- For the alleged infringement of Articles 6 and 9 of the GDPR, typified in Article 83.5.a) of that regulation and considered very serious for limitation purposes under Article 72.1.a) and e), respectively, of the LOPDGDD, an administrative fine of €2,000,000.


- For the alleged infringement of Article 5.1.c) of the GDPR, typified in Article 83.5.a) of that regulation and considered very serious for limitation purposes under Article 72.1.a) of the LOPDGDD, an administrative fine of €500,000.

- For the alleged infringement of Articles 12 and 13 of the GDPR, typified in Article 83.5.b) of that regulation and considered very serious for limitation purposes under Article 72.1.h) of the LOPDGDD, an administrative fine of €100,000.

- For the alleged infringement of Article 25.1 of the GDPR, typified in Article 83.4.a) of that regulation and considered serious for limitation purposes under Article 73.d) of the LOPDGDD, an administrative fine of €500,000.





- For the alleged infringement of Article 35 of the GDPR, typified in Article 83.4.a) of that regulation and considered serious for limitation purposes under Article 73.t) of the LOPDGDD, an administrative fine of €50,000.
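Taken together, the fines proposed above total:

€2,000,000 + €500,000 + €100,000 + €500,000 + €50,000 = €3,150,000.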

                                              XVIII


Article 69 of the LOPDGDD states the following:

"Article 69. Provisional measures and guarantee of rights.

1. During the preliminary investigation or once a procedure for the exercise of the sanctioning power has been initiated, the Spanish Data Protection Agency may, by reasoned decision, adopt the provisional measures necessary and proportionate to safeguard the fundamental right to data protection and, in particular, those provided for in Article 66.1 of Regulation (EU) 2016/679, the precautionary blocking of data and the immediate obligation to comply with the right requested.



2. Where the Spanish Data Protection Agency considers that the continuation of the processing of personal data, their communication or their international transfer would entail a serious impairment of the right to the protection of personal data, it may order the controllers or processors to block the data and cease the processing and, should these orders not be complied with, proceed to immobilise the data.


3. Where a complaint has been submitted to the Spanish Data Protection Agency referring, among other issues, to a failure to attend, within the applicable time limit, to the rights established in Articles 15 to 22 of Regulation (EU) 2016/679, the Spanish Data Protection Agency may, at any time, even before the initiation of the procedure for the exercise of the sanctioning power, by reasoned decision and after hearing the controller, order that the requested right be complied with, the procedure continuing with regard to the remaining issues that are the subject of the complaint."


Preamble I of the LOPDGDD states: "The protection of natural persons in relation to the processing of personal data is a fundamental right protected by Article 18.4 of the Spanish Constitution. In this way, our Constitution was a pioneer in recognising the fundamental right to the protection of personal data when it provided that 'the law shall limit the use of information technology in order to guarantee the honour and the personal and family privacy of citizens and the full exercise of their rights'. It thus echoed the work carried out since the late 1960s in the Council of Europe and the few legal provisions adopted in countries around us. The Constitutional Court indicated in its Judgment 94/1998, of 4 May, that we are dealing with a fundamental right to data protection which guarantees the person control over their data, any personal data, and over their use and destination, in order to prevent unlawful trafficking in them or trafficking that is harmful to the dignity and rights of those affected; in this way, the right to data protection is configured as a power of the citizen to object to certain personal data being used for purposes other than the one that justified their collection. For its part, Judgment 292/2000, of 30 November, considers it an autonomous and independent right consisting of a power of disposal and control over personal data which entitles the person to decide which of those data to provide to a third party, be it the State or a private individual, or which data that third party may collect, and which also allows the individual to know who holds those personal data and for what purpose, with the ability to object to such possession or use. (…). Furthermore, it is also enshrined in Article 8 of the Charter of Fundamental Rights of the European Union and in Article 16.1 of the Treaty on the Functioning of the European Union. Previously, at European level, the aforementioned Directive 95/46/EC had been adopted, the purpose of which was to ensure that the guarantee of the right to the protection of personal data did not constitute an obstacle to the free movement of data within the Union, thereby establishing a common space of guarantee of that right which, at the same time, ensured that, in the event of international transfers of data, their processing in the country of destination was protected by safeguards equivalent to those provided for in the Directive itself."

Article 56 of Law 39/2015, of 1 October, on the Common Administrative Procedure of Public Administrations (hereinafter, LPACAP), insofar as applicable, states the following:

"1. Once the procedure has been initiated, the administrative body competent to resolve it may, ex officio or at the request of a party and in a reasoned manner, adopt the provisional measures it deems appropriate to ensure the effectiveness of the resolution that may be issued, where there are sufficient elements of judgement to do so, in accordance with the principles of proportionality, effectiveness and least onerousness.


2. Before the initiation of the administrative procedure, the body competent to initiate or instruct the procedure may, ex officio or at the request of a party, in cases of pressing urgency and for the provisional protection of the interests involved, adopt, in a reasoned manner, the provisional measures that are necessary and proportionate. The provisional measures must be confirmed, modified or lifted in the agreement initiating the procedure, which must be adopted within fifteen days of their adoption and which may be the subject of the appropriate appeal.

In any case, these measures shall cease to have effect if the procedure is not initiated within that period or if the initiation agreement does not contain an express pronouncement on them.


3. In accordance with the provisions of the two preceding paragraphs, the following provisional measures may be adopted, under the terms provided for in Law 1/2000, of 7 January, on Civil Procedure:



a) Temporary suspension of activities.

b) Provision of bonds.

c) Withdrawal or intervention of productive assets or temporary suspension of services for reasons of health, hygiene or safety, or the temporary closure of the establishment for these or other causes provided for in the applicable regulations.

d) Preventive seizure of goods, income and fungible things computable in cash by application of fixed prices.

e) The deposit, retention or immobilisation of movable property.

f) The intervention and deposit of income obtained through an activity considered unlawful whose prohibition or cessation is sought.

g) Consignment or constitution of a deposit of the amounts claimed.

h) The withholding of payments on account to be made by the Public Administrations.

i) Such other measures as the law expressly provides for the protection of the rights of the interested parties, or as are deemed necessary to ensure the effectiveness of the resolution.



4. Provisional measures may not be adopted which may cause harm that is difficult or impossible to repair to the interested parties, or which entail a violation of rights protected by law.




5. Provisional measures may be lifted or modified during the processing of the procedure, ex officio or at the request of a party, on account of circumstances arising after, or which could not be taken into account at, the time of their adoption.

In any case, they shall be extinguished when the administrative resolution putting an end to the corresponding procedure takes effect."


The facial recognition data processing now analysed, which it is established the respondent had been carrying out since 1 July 2020 (until 05/06/2021) in various open stores in Spain (at least forty), is a processing of personal data expressly prohibited by Article 9.1 of the GDPR.

It is established that on 05/06/2021 the respondent executed the precautionary measure imposed, providing reliable documentation proving it, switching off the facial recognition systems installed and removing the information posters.


The adoption of this provisional measure in the Initiation Agreement, and its confirmation and finalisation in this Proposal for Resolution, weighs all the rights and interests in conflict and does not invalidate the security measure adopted by the judicial bodies, but only the use of facial recognition as the means of carrying it out, without prejudice to the controller's ability to adopt other, less intrusive systems to achieve that purpose.

Consequently, the data processing based on facial recognition for identification purposes implemented by MERCADONA is prohibited by Article 9.1, since none of the grounds listed in Article 9.2 of the GDPR for lifting that prohibition applies, and it therefore cannot rely on the lawful bases of Article 6.1 of the same regulation. Such a prohibition cannot be circumvented by the application of proactive security measures, since the prohibition of the processing laid down in Article 9.1 of the GDPR renders them irrelevant, and they are therefore not analysed.


      In view of the above, the following is issued

PROPOSAL FOR RESOLUTION

That the Director of the Spanish Data Protection Agency sanction MERCADONA, S.A., with NIF A46103834, for the infringement of the following articles, with the following penalties:

- Articles 6 and 9 of the GDPR, typified in Article 83.5.a) of that regulation: an administrative fine of €2,000,000 (two million euros).

- Articles 12 and 13 of the GDPR, typified in Article 83.5.b) of that regulation: an administrative fine of €100,000 (one hundred thousand euros).

- Article 5.1.c) of the GDPR, typified in Article 83.5.a) of that regulation: an administrative fine of €500,000 (five hundred thousand euros).

- Article 25.1 of the GDPR, typified in Article 83.4.a) of that regulation: an administrative fine of €500,000 (five hundred thousand euros).

- Article 35 of the GDPR, typified in Article 83.4.a) of that regulation: an administrative fine of €50,000 (fifty thousand euros).


- To confirm the provisional measure imposed on MERCADONA in the Initiation Agreement concerning the temporary suspension of all processing of personal data related to facial recognition in its establishments, as processing prohibited under the GDPR and related regulations, and to render it definitive.


Likewise, in accordance with the provisions of Article 85.2 of the LPACAP, you are informed that you may, at any time prior to the resolution of this procedure, make voluntary payment of the proposed sanction, which will entail a 20% reduction in its amount. With the application of this reduction, the penalty would be set at €2,520,000 (two million five hundred and twenty thousand euros) and its payment will imply the termination of the procedure. The effectiveness of this reduction will be conditional on the withdrawal or waiver of any action or appeal through administrative channels against the sanction.
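The reduced amount follows directly from the proposed total and the 20% reduction:

€3,150,000 × (1 - 0.20) = €2,520,000.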

Should you choose to proceed with the voluntary payment of the amount specified above, in accordance with the provisions of the aforementioned Article 85.2, you must pay it into the restricted account number ES00 0000 0000 0000 0000 0000 opened in the name of the Spanish Data Protection Agency at the banking entity CAIXABANK, S.A., indicating in the payment reference the procedure number that appears in the heading of this document and the ground for the reduction of the amount of the sanction, namely voluntary payment. Likewise, you must send proof of payment to the Subdirectorate General of Inspection in order to close the proceedings.


By virtue of this, you are notified of the foregoing and granted access to the procedure so that, within TEN DAYS, you may submit whatever you consider appropriate in your defence and present the documents and information you deem pertinent, in accordance with Article 89.2 of the LPACAP. >>




SECOND: On 19 July 2021, the respondent paid the sanction in the amount of €2,520,000, making use of the reduction provided for in the proposed resolution transcribed above.


THIRD: The payment made entails the waiver of any administrative action or appeal against the sanction, in relation to the facts referred to in the proposed resolution.



LEGAL GROUNDS

                                            I
By virtue of the powers that Article 58.2 of the GDPR grants to each supervisory authority, and as established in Article 47 of Organic Law 3/2018, of 5 December, on the Protection of Personal Data and the guarantee of digital rights (hereinafter, LOPDGDD), the Director of the AEPD is competent to sanction the infringements committed against that Regulation.

                                            II
Article 85 of Law 39/2015, of 1 October, on the Common Administrative Procedure of Public Administrations (hereinafter, LPACAP), under the heading "Termination of sanctioning procedures", provides the following:

"1. Once a sanctioning procedure has been initiated, if the offender acknowledges their responsibility, the procedure may be resolved with the imposition of the appropriate sanction.

2. Where the sanction is solely of a pecuniary nature, or where a pecuniary sanction and another non-pecuniary sanction could be imposed but the inappropriateness of the latter has been justified, voluntary payment by the presumed offender at any time prior to the resolution shall imply the termination of the procedure, except as regards the restoration of the altered situation or the determination of compensation for the damage caused by the commission of the infringement.

3. In both cases, where the sanction is solely of a pecuniary nature, the body competent to resolve the procedure shall apply reductions of at least 20% of the amount of the proposed sanction, these being cumulative with each other. The aforementioned reductions must be specified in the notification of initiation of the procedure, and their effectiveness shall be conditional on the withdrawal or waiver of any administrative action or appeal against the sanction.

The percentage of reduction provided for in this paragraph may be increased by regulation."


In accordance with the above, the Director of the AEPD,


RESOLVES:

FIRST: TO DECLARE the termination of sanctioning procedure PS/00120/2021, in accordance with the provisions of Article 85 of the LPACAP, sanctioning MERCADONA, S.A., with NIF A46103834, for the infringement of the following articles:


- Articles 6 and 9 of the GDPR, typified in Article 83.5.a) of that regulation,

- Articles 12 and 13 of the GDPR, typified in Article 83.5.b) of that regulation,

- Article 5.1.c) of the GDPR, typified in Article 83.5.a) of that regulation,

- Article 25.1 of the GDPR, typified in Article 83.4.a) of that regulation,

- Article 35 of the GDPR, typified in Article 83.4.a) of that regulation,

- and to prohibit all processing of personal data related to facial recognition in its establishments, in accordance with Article 58.2.f).

SECOND: TO NOTIFY this resolution to MERCADONA, S.A., with NIF A46103834 and with address at Paseo de la Castellana 259 C, 28046 Madrid.


In accordance with the provisions of article 50 of the LOPDGDD, this
Resolution will be made public once it has been notified to the interested parties.

Against this resolution, which puts an end to the administrative procedure as prescribed by Article 114.1.c) of Law 39/2015, of 1 October, on the Common Administrative Procedure of Public Administrations, interested parties may file a contentious-administrative appeal before the Contentious-Administrative Chamber of the National High Court (Audiencia Nacional), in accordance with the provisions of Article 25 and paragraph 5 of the fourth additional provision of Law 29/1998, of 13 July, regulating the Contentious-Administrative Jurisdiction, within a period of two months from the day following the notification of this act, as provided for in Article 46.1 of the aforementioned Law.


                                                                                     968-160721

Mar España Martí
Director of the Spanish Data Protection Agency















