Tietosuojavaltuutetun toimisto (Finland) - 3394/171/21

From GDPRhub
Authority: Tietosuojavaltuutetun toimisto (Finland)
Jurisdiction: Finland
Relevant Law:
Finnish Data Protection Act in Criminal Matters
Type: Investigation
Outcome: Violation Found
Started:
Decided: 20.09.2021
Published:
Fine: None
Parties: n/a
National Case Number/Name: 3394/171/21
European Case Law Identifier: n/a
Appeal: n/a
Original Language(s): Finnish
Original Source: Finnish DPA (in FI)
Initial Contributor: Florence D'Ath

The Finnish DPA found that the Finnish police had unlawfully relied on the facial recognition software Clearview AI in 2020. It ordered the Finnish police to bring the processing into compliance and to notify identifiable data subjects of the breach.

English Summary

Facts

On March 31, 2021, the Finnish Central Criminal Police (CCP) received an inquiry from the US online publication Buzzfeed News as to whether the Finnish police had used an application called Clearview AI (a facial recognition software). The CCP first answered in the negative, following which Buzzfeed News revealed that it was aware of about 120 searches made by the CCP in 2020 using Clearview AI. Buzzfeed News also specified that many authorities around the world had unofficially conducted searches via Clearview AI during its free trial period, and that those searches may thus not have come to the attention of the CCP's management. On April 7, 2021, the Police Board and the CCP immediately began an investigation into this matter.

Following this investigation, it was found that some members of the CCP had indeed tested Clearview AI during its free trial period, in connection with material received from the National Center for Missing and Exploited Children (NCMEC). In particular, the purpose was to test the suitability of the software for protecting children against the dissemination of pedopornographic material. During that testing phase, those members of the CCP used Clearview AI to search the internet for the public social media profiles of potential Finnish victims appearing in such material. The images used for the searches had primarily been collected by NCMEC, mostly from social media. During the experiment, one match was found between some pedopornographic material and the profile picture of a Finnish minor. The matter was referred to the social welfare authorities. Consequently, personal data had been processed for criminal law purposes during that period by relying on Clearview AI. The trial was discontinued in early 2020.

On 9 April 2021, the National Police Board filed a data breach notification, which was later supplemented at the request of the Finnish DPA. In the course of these clarifications, the CCP further revealed that Clearview AI had been presented and recommended for the automatic screening of pedopornographic material at a meeting hosted by Europol in 2019.

Holding

The Finnish DPA first noted that the CCP had processed biometric personal data as defined in the Finnish Act on the Processing of Personal Data in Criminal Matters (n°1054/2018). Hence, to be lawful, such processing had to have a valid legal basis.

The Finnish DPA also noted, among other things, that:

  • Clearview AI had only been used for one month, and, upon notification of the data breach, the CCP had made sure that the service was no longer used within the organisation;
  • the risks relating to the processing had been minimized because the CCP had 'sanitized' the profile pictures (i.e. ensured that only the data subject was identifiable, to the exclusion of any other individual);
  • searches could also be made with hash values;
  • the NCMEC had collected the profile pictures of potential victims from the internet, and in particular from social media (i.e. freely available sources);
  • the CCP had issued a bulletin regarding the incident for the attention of data subjects;
  • after the notification, the Police Board and the CCP had both committed to developing and implementing internal policies and procedures to prevent such incidents in the future (including an obligation to refer the use of facial recognition technologies to a DPO; internal procedures requiring that processing complies with data protection law; enhanced data protection training for police forces; mandatory data protection impact assessments in similar situations; etc.).

Despite this, the Finnish DPA found that the CCP had unlawfully processed personal data by relying on facial recognition technologies without any prior controls or safeguards. The Finnish DPA therefore concluded that the CCP had violated the Data Protection Act in Criminal Matters. Consequently, the Finnish DPA ordered the CCP to bring the processing into compliance (insofar as this had not already been done), and to notify the data subjects who could be identified of the breach. No fine was imposed on the CCP.

Comment

Share your comments here!

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the Finnish original. Please refer to the Finnish original for more details.

Processing of personal data with a facial recognition program
    Notice and orders of the Deputy Data Protection Ombudsman following a data breach notification
    Notifier
    Police Board
    Description of the security breach
    Preliminary notification of a security breach
    On March 31, 2021, the Central Criminal Police received an inquiry from the US online publication Buzzfeed News as to whether the Finnish police authorities had used an application called Clearview AI. On April 1, 2021, the Central Criminal Police answered the inquiry that, as far as was known, the technology was not used by the Finnish police. Buzzfeed News returned to the matter on April 7, 2021, stating that it was aware of about 120 searches made in 2020. Based on its previous reporting, Buzzfeed News also said that many authorities around the world had made searches during the free trial period, when the system had not actually been taken into official use, and that such searches may therefore not have come to the attention of the organizations' management. On April 7, 2021, the Police Board and the Central Criminal Police immediately began an investigation into a possible trial period and also asked Buzzfeed News for more information about the use of the app.
    At the beginning of 2020, the CAM/CSE team of the Central Criminal Police took Clearview AI into free trial use for testing to assess its suitability. Among other things, the CAM/CSE team pre-screens international packages of potential child sexual abuse material arriving from NCMEC (National Center for Missing and Exploited Children). NCMEC is an international organization that collaborates with social media services to collect from them material about the possible exploitation of minors or the distribution of material relating to minors.
    The use of Clearview AI was limited to an experiment with NCMEC data, in which the public social media profile of a potential victim was searched for with Clearview AI in order to identify and protect the victim, and to determine whether the tool would be a valuable aid in identification work in the future. Consequently, personal data were processed for criminal law purposes. The images were sanitized during the experiment to ensure that they did not contain anything other than the facial profile of the potential victim being sought. NCMEC had originally collected the images mostly from social media. During the experiment, one hit was found in which sexually suggestive material involving a child was matched with a Finnish profile. The owner of the profile was the person appearing in the material, and the matter was referred to the social welfare authorities.
    Based on the preliminary information received, the users had been under the impression that the service does not store searches. According to the investigation so far, about 120 searches were made at the Central Criminal Police (KRP) in 2020.
    Additional clarification requested by the Office of the Data Protection Ombudsman
    The data breach notification made by the controller on 9.4.2021 was preliminary. The controller did not complete the notification on its own initiative, which is why, on 2 June 2021, the Office of the Data Protection Ombudsman requested further clarification from the controller. The further clarification was received on 18.8.2021. The additional clarification stated the following:
    On 8 April 2021, a report by the Central Criminal Police revealed that, at a meeting hosted by Europol in 2019, the Clearview AI application had been presented and recommended for the automatic screening of child sexual exploitation material (CAM/CSE). The presentation gave participants the understanding that the system would not store searches and would therefore be suitable for this use.
    The key role of the CAM/CSE group is to prevent and expose the sexual exploitation of children and the dissemination of related material. In late 2019, Clearview AI was taken into free trial use and testing by the Central Criminal Police CAM/CSE team to assess its suitability. The trial was discontinued on the team's own initiative in early 2020. Prior to use, the Central Criminal Police had received registration links to the service from Sweden and Latvia. Among other things, the CAM/CSE group pre-processes international packages of potential child sexual abuse material arriving from NCMEC (National Center for Missing and Exploited Children). NCMEC is an international organization that collaborates with social media services to collect from them material about the possible exploitation of minors or the distribution of material relating to minors.
    In the Central Criminal Police, trial use of Clearview AI was limited to an experiment with NCMEC data, in which the public social media profile of a potential victim was searched for with Clearview AI in order to identify and protect the victim for crime prevention purposes. At the same time, it was assessed whether the tool would serve as a valuable aid in identification work in the future.
    The Central Criminal Police had four personal trial licenses in use, which were valid for one month. In connection with the trial use, searches were performed mainly with test data. The current members of the CAM/CSE group tested the program with real images in two cases. In neither case were images used that contained any information about what was at stake or any additional information about who appears in the image. Nor did the pictures contain any material offending sexual morality. Facial images were used in the searches because the program is a facial recognition program.
    During the experiment, one hit was found in which sexually suggestive material involving a child was matched with a Finnish profile. The owner of the profile was the person appearing in the material, and the matter was referred to the social welfare authorities. No preliminary investigation was launched into the case. The number of searches reported by the Buzzfeed News online publication could not be confirmed in the investigation. Nor has the investigation yielded a fully certain picture of how much the program was tried with real images, because the persons involved have since left. Based on the testing performed, Clearview AI was found to be unsuitable for this purpose in official police work.
    Further investigations revealed that the Central Criminal Police CAM group had had access, from 10 February 2020 to 3 March 2021, to the Arachnid service run by a Canadian NGO, which searches the internet for CAM material. The service is used on the open internet via a web browser. It is a program that searches the internet for so-called child sexual exploitation material. The organization and the program can also help remove illegal material from the internet.
    Through the portal, it is possible to make queries with hash values or with images, or an image can be saved to the system so that the system will automatically keep searching for that image in the future as well. In connection with the introduction of the service, the group was instructed to make queries only with options under which no data is stored in the service. From an operational point of view, the activity constituted obtaining information, not transferring it.
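    The decision treats hash-based queries as the option under which no data is stored by the service. As a purely illustrative sketch (nothing below reflects Arachnid's actual interface, which the decision does not describe; the use of SHA-256, the function names and the sample digest are assumptions), the following Python snippet shows why a hash query can avoid transmitting image content: only a fixed-length digest of the file is compared against a reference list, and the picture itself never leaves the querying machine.

        # Hypothetical illustration of a hash-based lookup: only the digest of an
        # image is compared against known digests, so the image content itself is
        # never uploaded to, or stored by, an external service.
        import hashlib

        def sha256_of_file(path: str) -> str:
            """Compute the SHA-256 digest of a local image file."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def is_known(image_path: str, known_digests: set) -> bool:
            """Check whether the image's digest appears in a reference set of digests."""
            return sha256_of_file(image_path) in known_digests

        if __name__ == "__main__":
            # Illustrative reference digest only (the SHA-256 of an empty file).
            known = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}
            print(is_known("sample.jpg", known))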
    The CAM/CSE group works to combat the sexual exploitation of children, and from time to time there is a need to find out whether a particular image has been disseminated on the internet and, if so, where it can be found. In practice, this cannot be done without in one way or another directing an image search at the internet. There was no reason to doubt the reliability of the service provider in question.
    According to information from the Central Criminal Police, two people from the current CAM/CSE group had used the service, one of them once and the other three times. No hits were found. The exact dates of the queries are not known, but they were made between 10 February 2020 and 3 March 2021. The use of the service was finally terminated during the investigation, on 15 April 2021. One investigator's query related to an individual intelligence case and no CAM image was used in it, while the other investigator's queries were made because the local police had asked to find out whether the images had been distributed on the internet. No information was attached to the images; the service was used only to establish whether such an image of child sexual abuse had been disseminated elsewhere on the network.
    The number and identities of the people in the pictures are unknown. The exact content of the images is, unfortunately, not known either. It is therefore possible that there are several persons in the images, but it is also possible that the persons are not identifiable, for example in pictures taken from behind or for similar reasons. It is not known whether the images concerned the same persons. According to one estimate, the pictures included images of 4 to 16 people.
    Data concerned
    The personal data processed were biometric personal data within the meaning of section 3 (1) (3) of the Act on the Processing of Personal Data in Criminal Matters and in Connection with Maintaining National Security (1054/2018, hereinafter the Criminal Data Protection Act), which are subject to the processing conditions laid down in section 11 of the Act.
    Based on the explanations provided by the controller, the number and identities of the persons in the pictures have not been comprehensively determined. According to the investigation, when using test data, the investigators used, among other things, their own images as well as randomly selected, unrelated facial images from the internet. In its report, the controller has not been able to obtain a definite picture of how much the program was tried with real images, because the persons involved have since left. The current members of the CAM/CSE team of the Central Criminal Police tested the program with real images in two cases. In neither case were images used that contained any information about what was at stake or any additional information about who appears in the image. Nor did the pictures contain any material offending sexual morality. Facial images were used in the searches because the program is a facial recognition program.
    Measures taken
    In its reports, the controller reported the following implemented and future measures:
    Use of the Clearview AI service was stopped after the trial period. The use of the Arachnid service, detected as a result of the additional investigation, was discontinued on 15 April 2021.
    During 2020, the police provided guidance and training, in particular on the processing of biometric data and the tools that may be used for this purpose. In May 2020, the police introduced their own facial recognition system, KASTU, which meets data protection and data security requirements. The KASTU system allows the police to identify persons suspected of a crime from the police register of identifying data.
    Immediately after the present case arose, the Police Board requested clarification from the Central Criminal Police. The Police Board submitted a preliminary notification of a personal data breach to the Data Protection Ombudsman on 9 April 2021. At the same time, the Central Criminal Police also issued a bulletin on the matter. The police also acted without delay to ensure that the Clearview AI service in question was not in use elsewhere in the organization. It has not come to light that this or similar services have been used elsewhere in the police organization.
    According to the responses, the risks were minimized by sanitizing the images used. This ensured that they did not contain anything other than the facial profile of the potential victim being sought. Queries could also be made with hash values. The service provider had originally collected most of the images from the internet and social media. No images were used in the cases that contained any information about what was at stake or about who appears in the image. Nor did the pictures contain any material offending sexual morality.
    Clearview AI users were limited to four. The trial license was valid for one month. The service was mainly tested with test data. When using the test data, the investigators had used their own images as well as randomly selected, unrelated facial images from the internet. Users of the Arachnid service were limited to two. In connection with the trial introduction of the service, the group was instructed to make queries only with options under which data is not stored in the service.
    In the view of the Central Criminal Police, it should be emphasized that the images used in the test runs are likely to have originated from a third country from the outset, as they had been uploaded to the internet and the servers may have been located in a third country. The service provider has thus only become aware that, for some reason, the searcher was interested in the images in question and in finding them on the internet.
    Future actions
    The police will continue to pay special attention to the development of uniform policies and tools to avoid similar incidents.
    The police have a deployment process for information systems and new services. The Police Board strives to ensure that there is sufficient awareness of the procedures under this process throughout the police administration by raising the issue, for example, in the national police data protection network. This has also been done in the Central Criminal Police, where the Data Protection Officer has been appointed a member of the agency's development team, so that data protection issues are taken into account at an early stage of development activities. In addition, the Central Criminal Police has emphasized in its internal procedures that, when processing personal data in similar situations, compliance with data protection legislation must be ensured.
    Among other things, the above-mentioned deployment process also includes a data protection impact assessment process, which the police are currently developing further. The aim is to make the impact assessment process so comprehensive and well known throughout the police that the processing of personal data will not be initiated without at least an assessment of whether a data protection impact assessment is needed.
    The Police Board is also currently developing information management monitoring and control activities, one of the objectives of which is to help ensure that personal data is processed only in permitted ways.
    As a result of the case, the National Police Board is also updating its instructions and training material on the facial recognition system. The Police Board is further preparing principles for trustworthy artificial intelligence for the police, which are to emphasize, among other things, the importance of fundamental rights, such as the protection of privacy and equality, and of ensuring compliance with legal requirements.
    Informing data subjects
    The controller has assessed the obligation to notify data subjects of the data breach in the manner required by section 35 of the Criminal Data Protection Act. According to the information currently available, the breach has not posed such a significant risk to data subjects' rights as would require informing them personally. The Central Criminal Police has also issued a bulletin that is available to data subjects.
    Actions of the Deputy Data Protection Ombudsman
    Notice
    The Deputy Data Protection Ombudsman issues the controller a notice in accordance with section 51 (1) (4) of the Criminal Data Protection Act, because the police have processed personal data in violation of sections 4 and 14 of the Criminal Data Protection Act.
    According to section 14 (1) of the Criminal Data Protection Act, the controller is responsible for ensuring that personal data are processed lawfully. It must also be able to demonstrate that personal data have been processed in accordance with Chapter 2 of the Act. The controller shall take the necessary technical and organizational measures required to meet the accountability obligation laid down in subsection 1. The measures shall take into account the nature, extent, context and purposes of the processing and the risks to the rights of natural persons.
    In order to fulfil the obligations laid down in section 14 of the Criminal Data Protection Act, the controller must provide training related to the design and use of new ways of processing personal data. The controller must also ensure that up-to-date guidelines are in place for the processing of personal data and that the processing of personal data is adequately supervised. It is the controller's duty to ensure that police officers are aware of the regulations and procedures they must follow in their work.
    In accordance with section 4 (2) of the Criminal Data Protection Act, personal data must be processed appropriately and carefully. The requirement of diligence is emphasized when processing data belonging to special categories of personal data. Under section 11 of the Criminal Data Protection Act, the processing of biometric data for the purpose of uniquely identifying a person is permitted only if it is necessary, the safeguards required to protect the data subject's rights have been put in place, and the other conditions mentioned in that provision are met.
    The processing of personal data took place without the consent or supervision of the controller. Police officers used the Clearview AI and Arachnid services at their own discretion. Based on the report received, the controller was not aware of the use of the services. The report received does not indicate the position of the persons involved in the processing of personal data, or whether their superiors were aware of the activity.
    The processing of personal data was started without obtaining information on how the services in question process personal data. Before the processing, the technical and organizational measures that would have been required for processing personal data belonging to special categories of personal data were neither assessed nor implemented, nor was it clarified, among other things, how long the personal data would be stored or whether they might be disclosed to third parties. The necessity of the processing and the other conditions for processing biometric personal data were not fulfilled.
    The supervision, training, internal control and other measures taken by the controller were not able to prevent the unlawful processing of personal data. The controller's responsibility for the processing of personal data described above and the requirement that personal data be processed lawfully have not been fulfilled in the manner referred to in sections 4 and 14 of the Criminal Data Protection Act.
    Order
    Pursuant to section 51 (1) (10) of the Criminal Data Protection Act, the Deputy Data Protection Ombudsman orders the controller to bring the processing operations into compliance with the Criminal Data Protection Act. The order must be implemented in such a way that, by 29.10.2021, the controller asks the administrators of the Clearview AI and Arachnid services to delete the personal data transmitted to these service providers by the police.
    The processing of personal data has been unlawful. As the retention of the personal data by the above-mentioned service providers poses a high risk to data subjects, as described below, the controller must strive to have the service providers delete the data transmitted to them by the police from their storage media.
    Notification to data subjects
    In accordance with section 51 (1) (6) of the Criminal Data Protection Act, the Deputy Data Protection Ombudsman orders the controller to notify the data subjects of the personal data breach. The notification must comply with the provisions of section 37 of the Criminal Data Protection Act. The notification shall be made to those data subjects whose identities are known to the controller.
    However, according to section 35 (3) of the Criminal Data Protection Act, notification to the data subject may be postponed, restricted or omitted if the conditions set out in section 28 of the Act are met. If section 35 (3) of the Criminal Data Protection Act is applied, a notification thereof and the reasons therefor must be submitted to the Office of the Data Protection Ombudsman.
    The controller has not been able to identify all the persons whose personal data have been processed. However, this does not mean that the notification obligation could not apply to those persons whose identities are known to the controller.
    According to the report received, no images containing information about what is at stake or about who appears in the image were used in the activity. However, facial images are biometric personal data within the meaning of section 3 (1) (13) of the Criminal Data Protection Act, from which a person can be identified using facial recognition technology.
    When assessing the risk posed by a data breach, the severity of its possible consequences and the probability of their materializing should be considered. The more serious the possible consequence for the individual and the more likely it is to materialize, the greater the risk. Where the police have processed biometric personal data in breach of the Criminal Data Protection Act, it can as a rule be assessed that the breach has exposed data subjects to a high risk. In the case in question, the controller has no information on how the personal data have been processed by the service providers in question, for how long they may be stored, or whether they may be further processed. For this reason, the biometric personal data of the data subjects have passed outside the control and supervision of the controller. Based on the above, the processing of personal data has posed a significant risk to data subjects.
    It has been taken into account that, on the basis of the information currently available, the risk posed by the breach has not materialized. As previously stated, the Central Criminal Police has also issued a bulletin that is available to data subjects. For these reasons, the notification obligation is limited to those persons whose identities are known to the controller.
    Appeal
    A person dissatisfied with the decision of the Data Protection Ombudsman may appeal against it to the Administrative Court by a written appeal, as provided in the Administrative Judicial Procedure Act (586/1996).
    Service
    The decision is notified by post in accordance with section 60 of the Administrative Procedure Act (434/2003) against an acknowledgment of receipt.