IMY (Sweden) - DI-2020-2719
| IMY - DI-2020-2719 | |
|---|---|
| Authority: | IMY (Sweden) |
| Jurisdiction: | Sweden |
| Relevant Law: | Criminal Data Act |
| Type: | Investigation |
| Outcome: | Violation Found |
| Started: | |
| Decided: | 10.02.2021 |
| Published: | 11.02.2021 |
| Fine: | 2,500,000 SEK |
| Parties: | Polismyndigheten |
| National Case Number/Name: | DI-2020-2719 |
| European Case Law Identifier: | n/a |
| Appeal: | n/a |
| Original Language(s): | Swedish |
| Original Source: | Integritetsskyddsmyndigheten (in SV) |
| Initial Contributor: | Rie Aleksandra Walle |
The Swedish DPA (IMY) fined the Swedish Police Authority SEK 2,500,000 (approximately €248,218) for the unlawful use of the facial recognition app Clearview AI, in violation of several provisions of the Swedish Criminal Data Act.
English Summary
Facts
Through information in the media, the Swedish DPA became aware that the Swedish Police Authority was using the controversial app Clearview AI for law enforcement and investigation purposes. Clearview AI is an app in which the user can upload an image and, through facial recognition, find out whether the person depicted appears in other images in Clearview AI's extensive database, which is made up of three billion images scraped from social media sites and webpages.
Following the news, the DPA initiated an investigation against the Police. The investigation concluded that Clearview AI had been used by the Police on a number of occasions. According to the Police, a few employees had used the application without any prior authorisation.
The DPA highlighted that the Police:
- Failed to implement sufficient organisational measures to ensure and be able to demonstrate that the processing of personal data was carried out in compliance with the Criminal Data Act
- Unlawfully processed biometric data for facial recognition
- Failed to conduct a data protection impact assessment (DPIA), which this type of processing requires
Dispute
Did the Swedish Police Authority violate the Criminal Data Act when using the Clearview AI app?
Holding
The DPA found that the Swedish Police Authority had not fulfilled its obligation to take the appropriate technical and organisational measures needed to ensure compliance with the Criminal Data Act (CDA). Even though the employees had used the app without an explicit recommendation from the employer, the Police Authority is still considered responsible for the use of the app and remains the data controller. Since the data processed in Clearview AI was biometric data, i.e. sensitive data, and also covered by the Public Access to Information and Secrecy Act, the breach is to be considered severe.
The DPA held that the Police must:
- Pay a fine of SEK 2,500,000
- Implement sufficient training and organisational measures as per Chapter 3 § 2 CDA
- Inform the affected data subjects as per Chapter 4 § 2 CDA
- Delete all data from the Clearview AI app
All measures must be completed by 15 September 2021.
Comment
The Swedish Criminal Data Act applies to personal data processing within law enforcement activities at such authorities as the Swedish Police Authority, the Swedish National Economic Crimes Bureau, the Swedish Customs, the Swedish Tax Agency, the Swedish Coast Guard and the Swedish Prosecution Authority. The term law enforcement activities refers to all work carried out with the purpose of preventing, investigating, detecting or prosecuting crimes.
Many law enforcement authorities also have other tasks, for example border control or customs control, that do not directly involve law enforcement. The Criminal Data Act does not apply to such activities; instead, the General Data Protection Regulation applies in full.
Further Resources
The DPA's press release (in English): https://www.imy.se/nyheter/police-unlawfully-used-facial-recognition-app/
About the Swedish Criminal Data Act and its relation to the GDPR: https://www.imy.se/other-lang/in-english/the-criminal-data-act/purpose-and-area-of-applicability/
English Machine Translation of the Decision
The decision below is a machine translation of the Swedish original. Please refer to the Swedish original for more details.
The Police Authority
Sent only by e-mail: registrator.kansli@polisen.se

Record number: DI-2020-2719
Your record number: A126,614/2020
Date: 2021-02-10

Decision after supervision according to the Criminal Data Act - the Police Authority's use of Clearview AI

The decision of the Privacy Protection Authority

The Privacy Protection Authority (IMY) finds that the Police Authority has processed personal data in violation of Chapter 2 Section 12 and Chapter 3 Section 2 and Section 7, first paragraph, of the Criminal Data Act by using the application Clearview AI during the period autumn 2019 through 3 March 2020.

The Privacy Protection Authority decides, on the basis of Chapter 6 Section 1 of the Criminal Data Act, that the Police Authority must pay a penalty fee of SEK 2,500,000 (two million five hundred thousand kronor).

The Privacy Protection Authority orders the Police Authority, in accordance with Chapter 5 Section 7, first paragraph 2, of the Criminal Data Act, to take the training measures and other organisational measures required under Chapter 3 Section 2 of the Criminal Data Act to ensure that the existing routines reach the entire organisation, so that the authority can demonstrate and ensure that all processing of personal data is in accordance with the law. The measures must have been taken by 15 September 2021 at the latest.

The Privacy Protection Authority orders the Police Authority, with the support of Chapter 5 Section 7, first paragraph 2, of the Criminal Data Act, to inform, in accordance with Chapter 4 Section 2 of the Criminal Data Act, the data subjects whose personal data the Police Authority entered into Clearview AI, to the extent that the obligation to provide information is not limited under Chapter 4 Section 5 of the Criminal Data Act. The information must have been provided no later than 15 September 2021.

The Privacy Protection Authority orders the Police Authority, with the support of Chapter 5 Section 7, first paragraph 2, of the Criminal Data Act, to take possible and relevant measures to ensure that the personal data that the Police Authority entered into Clearview AI are deleted from the application. Such measures must have been taken by 15 September 2021.

Report on the supervisory matter

The Privacy Protection Authority (IMY), formerly the Swedish Data Inspectorate, became aware in February 2020, through information in the media, that law enforcement agencies in Sweden might have used the application Clearview AI. Against the background of this information, IMY immediately initiated a review in which the Police Authority was asked to state whether the authority had used the application and, if so, on what legal basis the processing had taken place. Statements from the Police Authority in the spring of 2020 showed that the application had been used by some employees within the authority on a number of occasions, without the authority having provided the application to them. IMY therefore initiated an in-depth review of the Police Authority's use of Clearview AI.

Clearview AI is an application provided by a US company that offers a facial recognition service where the user, after downloading the application onto a digital device, uploads an image that is biometrically matched against a very large number of images scraped from the open internet. The user then receives a result in the form of a number of URLs where any matches are available.
The Police Authority's use of the Clearview AI application

The Police Authority has stated that the application Clearview AI was used on a number of occasions during the period autumn 2019 to 3 March 2020. The users were employees at the National Operations Department (NOA) and Police Region South. At NOA, the application was used by a total of six employees, five of whom used it in operational activities, for example in order to identify injured parties in suspected sexual offences against children and, in the intelligence activities, in order to try to identify unknown persons in investigations of serious organised crime. At Police Region South, the application was used in investigations of sexual offences against children and was tested by employees against images in the Police Authority's so-called OBS portal.

When the Police Authority's use of Clearview AI became known through information in the media, the authority's data protection officer issued a recommendation that the National Forensic Centre and NOA should clarify and communicate that such use was not permitted.

Grounds for the decision

The Police Authority's responsibility for employees' processing of personal data in law enforcement activities

The Police Authority is a large organisation with a special task of upholding law and order. In addition, the Police Authority is governed by clear legislation on how personal data must be processed, especially in law enforcement activities. The large amount of personal data, including sensitive data, that the authority processes, as well as the far-reaching powers the Police Authority has, mean that the authority has a special responsibility for its processing of personal data.

As controller, the Police Authority is responsible for all personal data processing that takes place under the authority's direction or on behalf of the authority, according to Section 1 of the Criminal Data Act. This means that all personal data processing carried out at the authority falls under the Police Authority's responsibility as controller, including processing performed by processors, employees, persons on an equal footing with employees (e.g. temporary staff) or contractors. It also follows from Section 1 of the Act (2018:1693) on the police's processing of personal data within the area of the Criminal Data Act (PBDL) that the Police Authority is responsible for the personal data processing carried out at the authority. This means that it is the Police Authority's obligation to ensure that all processing that takes place within the authority has, among other things, a legal basis and a legitimate purpose, and that adequate safeguards are in place through appropriate technical and organisational measures. The Police Authority shall ensure that there are clear guidelines and routines regarding which IT tools the employees may use, and that the employees are sufficiently trained and informed about how personal data may be processed.

[1] The web service states that it has collected three billion facial images from Facebook, YouTube and millions of other websites. IMY Report 2021:1; Privacy Protection Report 2020 - report on developments in the IT area with regard to privacy and new technology, p. 70.
[2] https://clearview.ai, 2020-11-25.
[3] An IT support tool within the Police Authority through which intelligence information is disseminated to the law enforcement activities.
According to the Police Authority, a few employees have used Clearview AI without the authority having provided the application to them. The processing has, however, taken place in the performance of the employees' duties at the authority. The processing has also been carried out with personal data taken from current investigations and, in the majority of cases, during ongoing criminal investigations, i.e. in the exercise of public authority. The fact that this happened without the authority providing Clearview AI or approving the use of the IT tool does not deprive the Police Authority of the responsibility it has as controller. Against this background, IMY finds that the Police Authority is responsible for the employees' processing of personal data when using Clearview AI.

The Police Authority's obligation to ensure and demonstrate, through technical and organisational measures, that the authority's personal data processing is in accordance with the law

As controller, the Police Authority has, according to Chapter 3 Section 2 of the Criminal Data Act, an obligation to ensure, through technical and organisational measures, and to be able to show, that the authority's processing of personal data is in accordance with the law and that the data subjects' rights are protected. Which measures are needed must be assessed in each individual case, taking into account, among other things, which personal data are processed. Organisational measures can include adopting internal data protection strategies, informing and training employees, and ensuring a clear division of responsibilities. Measures that can be taken to show that the processing is lawful can, for example, be documentation of IT systems, processing operations and measures taken.

In the case, the Police Authority has submitted an internal routine for the processing of personal data within the authority. No further governing documents for the employees' processing of personal data have been submitted. Nor has any information been provided on how employees are trained or on how the internal routine is intended to reach the entire organisation. There is also no information indicating that any training or equivalent activity has actually been carried out within the authority. The fact that employees at two different organisational units used Clearview AI in violation of the applicable rules shows that the routine has not had sufficient impact within the authority. Against this background, the Police Authority has not been able to show that sufficient organisational measures, in the form of, for example, internal strategies or training, are in place to ensure that the processing is in accordance with the law.

IMY finds that the Police Authority has not taken appropriate organisational measures to ensure and be able to demonstrate that the authority's processing of personal data has been in accordance with the law. The Police Authority has thus violated Chapter 3 Section 2 of the Criminal Data Act.

[4] Prop. 2017/18:232 pp. 171 f., 319 and 452.
[5] See further below under the heading "The Police Authority's obligation to ensure and demonstrate, through technical and organisational measures, that the authority's personal data processing is in accordance with the law".
[6] Prop. 2017/18:232 pp. 172 f.
[7] Sandén, H-O, 2019, SFS 2018:1177 Lagkommentar, Norstedts juridik.
[8] Prop. 2017/18:232, p. 453.
The Police Authority's processing of biometric data in connection with the use of Clearview AI

According to the Police Authority's information, images of people in ongoing operational matters, which were then converted into biometric data, were uploaded into Clearview AI on several occasions. The Police Authority has not reported how the biometric data read into Clearview AI are handled in the application, e.g. whether and, if so, for how long the data are saved, how the matching of biometric data is carried out, whether the data are transferred to a third country, or whether the information is disclosed to others in connection with its use. According to information from the Police Authority, this is because no legal assessments were made before Clearview AI was used.

Biometric personal data are sensitive personal data and may, according to Chapter 2 Section 12 of the Criminal Data Act, only be processed if it is specifically prescribed and absolutely necessary for the purpose of the processing. It follows from Chapter 2 Section 4 of the PBDL that the Police Authority may process biometric data if the use is absolutely necessary for the purpose of the processing.

The use of a service like Clearview AI means that individuals' biometric personal data are matched against large amounts of personal data that have been collected unfiltered from the open internet. According to IMY's assessment, a law enforcement agency's processing of personal data when using such a service is unlikely to meet the strict requirement of necessity that follows from the Criminal Data Act and the underlying Criminal Data Directive. The European Data Protection Board has expressed a similar view.

The processing of biometric data that took place during the Police Authority's use of Clearview AI was performed without any control by, or knowledge on the part of, the Police Authority of how the data are handled by Clearview AI. No assessment of whether the processing was absolutely necessary under Chapter 2 Section 12 of the Criminal Data Act was carried out. Through the information submitted in the case, the Police Authority has not shown that the authority's processing of biometric data in the Clearview AI service was absolutely necessary for the purpose of the processing.

IMY finds that the Police Authority's processing of biometric data has taken place in violation of Chapter 2 Section 12 of the Criminal Data Act.

Obligation to carry out an impact assessment

According to Chapter 3 Section 7, first paragraph, of the Criminal Data Act, a controller is obliged to carry out an impact assessment before starting a new type of processing that can be assumed to entail a particular risk of intrusion into the data subjects' privacy.

[9] Article 10 of Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.
[10] EDPB response to MEPs Sophie in 't Veld, Moritz Körner, Michal Šimečka, Fabiene Keller, Jan-Christoph Oetjen, Anna Donáth, Maite Pagazaurtundúa and Olivier Chastel, concerning the facial recognition app developed by Clearview AI.
The preparatory works for the Criminal Data Act and the underlying Criminal Data Directive make clear that, in assessing whether an impact assessment is needed, the scope, nature, context and purposes of the processing must be taken into account. Special consideration must be given to whether the processing involves any new technology.

The personal data that have been processed in the application are biometric data and, as stated above, these may only be processed if it is specifically prescribed and absolutely necessary in view of the purpose of the processing. Since the use of Clearview AI has involved the processing of biometric data, including facial recognition, with new technology provided by an external actor in a third country, the processing may be presumed to have entailed a particular risk of intrusion into the data subjects' privacy. An impact assessment should therefore have been carried out before the processing started. The use of the Clearview AI application was, however, initiated without any legal considerations or any consideration of the risks of intrusion into the data subjects' privacy.

The Police Authority has not used the application systematically in its operations, and the application has not been recommended by the authority. As IMY has stated above, however, the Police Authority is responsible for the processing the employees performed when using Clearview AI. The lack of organisational measures has led to employees using the application without an impact assessment first being carried out, despite this being required under Chapter 3 Section 7 of the Criminal Data Act.

IMY finds that the Police Authority, in violation of Chapter 3 Section 7, first paragraph, of the Criminal Data Act, has failed to carry out an impact assessment before the use of Clearview AI.

Choice of intervention

Chapter 5 Section 7 of the Criminal Data Act sets out IMY's corrective powers in the event of violations of that law. These include, among other things, injunctions, prohibition of processing and the imposition of a penalty fee. It follows from Chapter 6 Sections 1 and 2 of the Criminal Data Act that IMY can impose a penalty fee for violations of, among others, the provisions of Chapter 2 Section 12 and Chapter 3 Sections 2 and 7 of the Criminal Data Act. The penalty fee must meet the requirement set out in the Criminal Data Directive that sanctions should be proportionate, dissuasive and effective.

When assessing whether a penalty fee should be imposed, and its size, special consideration shall be given to the circumstances specified in Chapter 6 Section 4 of the Criminal Data Act, i.e. whether the infringement was intentional or due to negligence; the damage, danger or violation that the infringement entailed; the nature, severity and duration of the infringement; what the controller or processor has done to limit the effects of the infringement; and whether the controller or processor has previously been required to pay a penalty fee.

IMY finds that the Police Authority has failed in several respects in its responsibility as controller when using Clearview AI. The Police Authority has not taken sufficient organisational measures to ensure and be able to demonstrate that the processing of personal data in the case in question was lawful, has processed biometric data in violation of the Criminal Data Act, and has failed to carry out an impact assessment.

[11] Prop. 2017/18:232 pp. 181 f.
[12] Prop. 2017/18:232 pp. 309 f.
As IMY has stated above, this means that the Police Authority has processed personal data in violation of Chapter 2 Section 12 and Chapter 3 Section 2 and Section 7, first paragraph, of the Criminal Data Act.

It has not been possible to investigate whether the data subjects suffered any actual damage from the Police Authority's use of Clearview AI. However, this is not decisive for whether a penalty fee shall be imposed; according to the preparatory works, it is sufficient that a risk of damage existed. Through the use of Clearview AI, employees at the Police Authority have gained access to privacy-sensitive information about a large number of data subjects. The Court of Justice of the European Union has held that access by an authority to large amounts of personal data must be regarded as a particularly serious interference, regardless of whether the individual has suffered any damage from the processing or not. The information that the Police Authority entered into the application has also been of a privacy-sensitive nature, and it has not been possible to clarify what happened to this personal data after it was entered. In view of the above, IMY considers the risk of damage to the data subjects to be high in the case in question.

The Police Authority has processed personal data without legal considerations and without any control over, or assessment of, the intrusion the processing entails into individuals' personal integrity. As regards the use of Clearview AI, the Police Authority has not been able to account for what happened to the personal data the authority entered into the application or to the results obtained. It can be assumed that the data entered into the application are often covered by some form of confidentiality, e.g. under Chapter 35 Section 1 of the Public Access to Information and Secrecy Act (2009:400), and the Police Authority has not referred to any confidentiality-breaking provision that could justify disclosure of the data. These circumstances mean that there are reasons to view the infringements seriously.

The scope and duration of the processing must also be taken into account when determining the amount of the penalty fee. A mitigating circumstance is that only a few data subjects' personal data were shared with Clearview AI. However, the use of Clearview AI took place over a total of a few months and ceased only after the Privacy Protection Authority's investigation had begun. In addition, the Police Authority has, through the use of Clearview AI, gained access to a large amount of personal data, and it is unclear for how long the personal data that were entered, and the data obtained through the matching, have been processed. It may further be considered aggravating that the data subjects' data were matched against an application containing personal data from the entire open internet, and that the Police Authority has no knowledge of what happened to the information that was entered. The fact that it was biometric, i.e. sensitive, personal data that were processed, and the fact that the data were used for facial recognition, also mean that there are reasons to view the infringement seriously.

In summary, the circumstances described above mean that a penalty fee shall be imposed and that a relatively substantial penalty fee is justified.

[13] Prop. 2017/18:232 p. 483.
[14] Judgments of the Court of Justice of the European Union in Cases C-594/12, paragraph 35, and C-623/17, paragraph 70.

The size of the penalty fee
According to Chapter 6 Section 5 of the Criminal Data Act, a penalty fee may be reduced in whole or in part if the infringement was excusable or if it would be unreasonable to impose a penalty fee. The fact that the controller did not know the rules or had inadequate procedures is not a reason to reduce the penalty fee. IMY notes that there are no other reasons for reducing the penalty fee under Chapter 6 Section 5 of the Criminal Data Act.

It follows from Chapter 6 Section 3, first paragraph, of the Criminal Data Act that a penalty fee for a violation of Chapter 3 Section 7 of the same law may not exceed SEK 5,000,000. It follows from the second paragraph that, for a violation of, for example, Chapter 2 Section 12 and Chapter 3 Section 2 of the Criminal Data Act, a penalty fee may amount to a maximum of SEK 10,000,000. The highest amount that can be determined is thus ten million kronor. On the basis of an overall assessment, IMY decides that the Police Authority shall pay a penalty fee of SEK 2,500,000.

Injunctions

The Police Authority shall be ordered to take the training measures and other organisational measures required under Chapter 3 Section 2 of the Criminal Data Act to ensure that the existing routines reach the entire organisation, so that the authority can demonstrate and ensure that all processing of personal data is in accordance with the law. The measures must have been taken by 15 September 2021.

According to Chapter 4 Section 2 of the Criminal Data Act, the controller shall, in an individual case, provide certain information to the data subject if it is needed for him or her to be able to exercise his or her rights. The information must include, among other things, the legal basis for the processing, the categories of recipients of the personal data and for how long the personal data may be processed. The preparatory works show that the provision is applicable, for example, in cases where the data subject risks losing his or her rights if he or she does not receive the information, or where it is important for him or her for any other reason to know about the processing in order to be able to exercise his or her rights. Another example given in the preparatory works is that sensitive personal data have been processed in violation of Chapter 2 Section 11 on sensitive personal data.

The information obligation under Chapter 4 Section 2 of the Criminal Data Act does not apply to the extent that it is specifically prescribed by law or other statute, or otherwise follows from a decision issued on the basis of statute, that information may not be disclosed, for example out of regard for the interest of preventing, deterring or detecting criminal activity, investigating or prosecuting crimes, ensuring that other judicial investigations or inquiries are not hindered, or protecting someone else's rights and freedoms (Chapter 4 Section 5 of the Criminal Data Act).

As IMY has stated, it has not been possible to clarify what happened to the personal data entered by the Police Authority into Clearview AI. It is therefore important that the data subjects become aware of the processing in order to be able to exercise their rights, in particular as regards sensitive personal data processed in breach of Chapter 2 Section 11 of the Criminal Data Act. IMY therefore assesses that the Police Authority has an obligation to provide information in accordance with Chapter 4 Section 2 of the Criminal Data Act.

[15] Prop. 2017/18:232 pp. 485.
[16] Prop. 2017/18:232 p. 465.
[17] Prop. 2017/18:232 p. 465.
During the investigation, the Police Authority has not stated anything to indicate that the data subjects have been informed about the use. Against this background, the Police Authority shall be ordered to provide information to the data subjects whose personal data the Police Authority entered into Clearview AI, in accordance with Chapter 4 Section 2 of the Criminal Data Act, with the restrictions that follow from Chapter 4 Section 5 of the same law. The information must have been provided no later than 15 September 2021.

As IMY has stated, the Police Authority has entered sensitive personal data into Clearview AI. Furthermore, there is a lack of information about what has happened to the personal data shared with Clearview AI and whether these data are still stored in the application. The Police Authority is therefore, finally, ordered to take possible and relevant measures to ensure that the personal data entered into Clearview AI are deleted from the application. Such measures must have been taken by 15 September 2021.

This decision was made by Director General Lena Lindgren Schelin following a presentation by lawyer Elena Mazzotti Pallard. Lawyer Frida Orring and Katarina Tullstedt, process owner for the supervision process, also participated in the handling of the matter. Chief Legal Counsel David Törngren and Head of Unit Charlotte Waller Dahlberg participated in the final handling of the case.

Lena Lindgren Schelin, 2021-02-10 (This is an electronic signature)

Appendix: How to pay the penalty fee.

Copy for information to the Data Protection Officer: dataskyddsombud@polisen.se

How to appeal

If you want to appeal the decision, you must write to the Privacy Protection Authority. State in your letter which decision you are appealing and what change you are requesting. The appeal must have been received by the Privacy Protection Authority no later than three weeks from the date on which the decision was announced. If the appeal has been received in time, the Privacy Protection Authority forwards it to the Administrative Court in Stockholm for examination. You can e-mail the appeal to the Privacy Protection Authority if it does not contain any privacy-sensitive personal data or data that may be covered by secrecy. The authority's contact information can be found on the first page of the decision.