Article 25 GDPR

From GDPRhub
Article 25 - Data protection by design and by default
Chapter 4: Controller and processor

Legal Text


Article 25 - Data protection by design and by default

1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.

3. An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.

Relevant Recitals

Recital 78: Appropriate Technical and Organisational Measures
The protection of the rights and freedoms of natural persons with regard to the processing of personal data require that appropriate technical and organisational measures be taken to ensure that the requirements of this Regulation are met. In order to be able to demonstrate compliance with this Regulation, the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of minimising the processing of personal data, pseudonymising personal data as soon as possible, transparency with regard to the functions and processing of personal data, enabling the data subject to monitor the data processing, enabling the controller to create and improve security features. When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations. The principles of data protection by design and by default should also be taken into consideration in the context of public tenders.

Commentary

The GDPR introduced a provision solely dedicated to the concepts of “data protection by design” and “data protection by default”. The Data Protection Directive did not contain a similar provision. Although Article 17 DPD and Recital 46 DPD had a similar thrust, the focus of those provisions was mostly on security.[1]

These concepts were not new, however: privacy by design and by default was originally conceptualised in the 1990s by the Information and Privacy Commissioner of Ontario, Canada. The Commissioner held that, in order to be effective, data protection must be implemented ex ante. Hence, the controller must define the privacy requirements that need to be taken into account during engineering and determine the default settings of the final product.[2] Because of the differences between privacy and data protection, the GDPR speaks of data protection by design and by default, rather than privacy by design and by default.[3]

The “overall thrust of the provision” is to impose an obligation on controllers to put in place technical and organisational measures that are designed to implement the data protection principles and the rights of data subjects.[4] Although the controller is responsible for adherence to these principles, Recital 78 stipulates that producers of applications, products and services are encouraged to consider the data protection obligations that controllers need to fulfil. Hence, the goal is to have developers and controllers embrace a culture of responsibility and systematically identify processes which could infringe the GDPR,[5] and to strengthen the data subject's trust in processing systems.[6] Although the provision imposes obligations on controllers, a violation can only occur in connection with other GDPR principles.[7]

Article 25 is structured as follows: the first paragraph describes the principle of data protection by design in more detail. The second paragraph expands on this by describing the principle of data protection by default. The third paragraph is similar to the third paragraph of Article 24, since it explains that an approved certification mechanism, pursuant to Article 42, “may be used as an element to demonstrate compliance”.

(1) Data Protection by Design

Data Protection by Design - the Meaning of the Principle

The principle of data protection by design follows from the realisation that data protection principles can best be ensured when they are already integrated into the architectural design of the specific technology. Again, like in Article 24(1), the controller must implement appropriate technical and organisational measures to ensure compliance with the data protection principles. However, Article 25(1) is different because "technology is no longer the object of regulation, but the content".[9] To determine the appropriateness of the measures, the controller must consider several elements. As in Article 24(1), it must follow a risk-based approach, and these elements must therefore also be considered in light of the principle of proportionality.[10]

Elements to Take into Account

State of the Art

In general, this means that the controller has to take into account the latest developments in its field and stay up to date with technology. However, "state of the art" also refers to organisational measures, meaning that internal policies, training and the like must be updated accordingly. Although existing standards can indicate what is "state of the art", this assessment must be carried out continuously.[11]

Cost of Implementation

With "cost", resources in general are meant, including time spent and human resources. Although alternative, less resource demanding (but effective) measures can be used, "the cost of implementation is a factor to be considered to implement data protection by design rather than a ground to not implement it".[12]

Nature, Scope, Context and Purpose of Processing

These criteria have the same meaning as in Article 24(1) and Article 32(1). Hence, the "nature" is "the inherent characteristics of the processing" (e.g., whether sensitive data is processed); the "scope" refers to the size and range of the processing; the "context" relates to all relevant circumstances; and the "purpose" is the aim of the processing.[13]

Risks of Varying Likelihood and Severity for Rights and Freedoms of Natural Persons

Again, just like under Article 24(1) and Article 32(1), the same (above-mentioned) conditions must be considered in order to protect the same rights against the same risks. Given this risk-based approach, a controller can perform a Data Protection Impact Assessment (DPIA) to assess these risks. Although "best practices and standards" may serve as a "useful toolbox", such a DPIA must, in principle, always be carried out on a case-by-case basis.[14]

Time Aspect

As with the criterion of "state of the art", controllers must assess their implemented measures continuously to ensure data protection by design "at the time of the processing". However, by stipulating that data protection by design shall also be implemented "at the time of the determination of the means for processing", it is clear that the legislator intended the controller to consider the principle already during the planning and development stage. Hence, the processing operations should be considered as early as possible, and the controller cannot use the "excuse" that it would lead to disproportionately high costs to implement data-protection-friendly measures at a later stage.[15] More problematic is what to do with an existing system (pre-dating the entry into force of the GDPR) that cannot easily be changed. Companies and institutions must reassess their means of processing if the systems they use are outdated and incompatible with the GDPR.[16] Because the state of the art continuously changes, updating systems will be a continuous and necessary practical component of adhering to the data protection by design principle during ongoing processing activities.[17]

Types of Measures and Necessary Safeguards

As is the case with Article 24(1), the measures to be implemented to ensure compliance with the principle of data protection by design must be understood in a broad sense. Any method that implements the data protection principles "effectively" and satisfies the above-mentioned criteria can be used. As the EDPB stipulates, the "appropriateness" requirement is closely related to the requirement of "effectiveness".[18] Although "pseudonymisation" is the only measure listed in the provision as an example, the training of personnel, limiting access to personal data, or any technical measure like anonymisation or advanced encryption could all be effective measures. However, what distinguishes these measures from measures under Article 24(1) is that they are built into the design of the processing from the outset. For example: automatic erasure of certain personal data by the software to comply with the principle of storage limitation.[19] However, not only active measures by the controller or developer are meant. The possibility for the data subject to exercise their rights and control the extent of processing through dashboards is another example of such a measure.[20]
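By way of illustration only – the GDPR does not prescribe any particular technique, and the key name and record fields below are invented for the example – a pseudonymisation measure built into the processing could look like the following Python sketch, in which direct identifiers are replaced by keyed tokens before a record reaches the storage layer:

import hmac
import hashlib

# Hypothetical key; in practice it would be managed and stored separately
# from the pseudonymised data, so re-identification requires both.
PSEUDONYMISATION_KEY = b"kept-separately-from-the-data"

def pseudonymise(identifier: str) -> str:
    # Derive a stable pseudonym from a direct identifier using a keyed hash (HMAC-SHA256).
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane.doe@example.com", "purchase": "book"}
stored = {"user_token": pseudonymise(record["email"]), "purchase": record["purchase"]}
# The e-mail address itself never enters the storage layer; only the token does.

Because the pseudonym is derived with a secret key held elsewhere, the stored data can no longer be attributed to a specific data subject without that additional, separately kept information, which is the defining feature of pseudonymisation under Article 4(5) GDPR.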

To deal with the broad range of measures that can be taken, a controller needs to have a data strategy in place. Such a data strategy may consist of data guidelines, documentation, monitoring and the evaluation of measures.[21] The GDPR does not contain concrete examples of data protection by design. However, the Spanish Data Protection Authority has published a useful guide with practical examples regarding a strategy for data collection[22] and processing.[23] Moreover, an important part of Article 25 GDPR is so-called "privacy engineering".[24] Tactics for privacy engineering are needed at each step of the software design process and in the final PETs (Privacy Enhancing Technologies).[25] The design and development of the system needs a privacy verification and validation process, which consists of integration of the system, testing, evaluations and continuous maintenance.[26]

(2) Data Protection by Default

The principle of data protection by default means that a product or service should have the most data-protection-friendly settings configured when it is first turned on or used.[27] The word "default" comes from computer science and refers to "the pre-existing or preselected value of a configurable setting". Hence, the "factory presets", in the case of electronic products, should conform to the highest data protection standard.[28]
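A minimal sketch of what such factory presets could look like in Python, assuming a hypothetical online service and invented setting names:

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # The pre-existing value of each configurable setting is the most
    # data-protection-friendly value the service supports.
    profile_visibility: str = "only_me"   # smallest possible audience by default
    analytics_tracking: bool = False      # no optional data collection by default
    personalised_ads: bool = False        # opt-in rather than opt-out
    data_retention_days: int = 30         # shortest retention period offered

settings = PrivacySettings()             # a new account starts from these presets
settings.profile_visibility = "friends"  # any wider setting requires the user's own intervention

Any broadening of the processing therefore requires an active choice by the data subject, rather than being the pre-selected state.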

Scope

Although many different kinds of controllers fall under the scope of Article 25(2), the provision seems to primarily focus on internet-based services, like social media networks, but also on operating systems and "smart devices" that collect data. The third sentence of Article 25(2) seems to particularly refer to social media networks and similar services that make personal data accessible to an indefinite number of people. It follows from the principle of data protection by default that users of such services can select how big the group of recipients of their personal data should be, but that the smallest group of people should be the standard.[29] Moreover, it follows that if third-party software is used, controllers are obliged to disable features that collect personal data without a legal basis under Article 6(1) GDPR. Lastly, the principle is also relevant where roles are allocated to staff who have access to data.[30]

Appropriate Technical and Organisational Measures

To ensure the highest "default" data protection standard, the controller must implement appropriate technical and organisational measures. Again, before analysing what these measures entail, it needs to be clarified what "appropriate" means. As with Article 24 GDPR and Article 32 GDPR, the EDPS Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data[31] can be used for guidance. Although the measures should be implemented to ensure compliance with every data protection principle, and are therefore to be understood in the same way as in Article 25(1), the measures in the context of Article 25(2) apply especially to the principle of data minimisation.[32]

Dimensions of the data minimisation obligation

It follows from the second sentence of Article 25(2) that the obligation of data minimisation has several dimensions: the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. By default, controllers should not collect more data than is necessary for the purpose. Moreover, not every processing operation is necessary to fulfil the purpose. The storage period needs to be objectively justified and, where possible, data shall be deleted by default. Lastly, the controller must limit, by default, the number of persons that have access to the personal data.[33]
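A hedged sketch of how these dimensions could be reflected in code, with the field names, retention period and authorised roles invented purely for illustration:

from datetime import datetime, timedelta, timezone

NECESSARY_FIELDS = {"order_id", "delivery_address"}  # amount: collect only what the purpose needs
RETENTION = timedelta(days=90)                       # storage period: objectively justified, then erased
AUTHORISED_ROLES = {"fulfilment"}                    # accessibility: smallest group of people by default

def minimise(raw: dict) -> dict:
    # Drop every field that is not necessary for the specific purpose (amount and extent of processing).
    return {key: value for key, value in raw.items() if key in NECESSARY_FIELDS}

def is_expired(stored_at: datetime) -> bool:
    # Flag records whose justified storage period has elapsed, so they can be deleted by default.
    return datetime.now(timezone.utc) - stored_at > RETENTION

def may_access(role: str) -> bool:
    # Grant access only to the roles that actually need the data.
    return role in AUTHORISED_ROLES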

Differences with Principle of Data Protection by Design

Although the principles of data protection by design and by default are similar, there are considerable differences between them. First, "by design" is broader than "by default", since the focus of the latter principle is on ensuring data minimisation and confidentiality. Moreover, whereas "by design" focuses on the stages of the development of the product, "by default" focuses more on the end result: are the settings configured in such a way that data minimisation and confidentiality are ensured? However, although only Article 25(1) states that the measures apply both to the development stage and to the processing stage, this also has to be assumed for Article 25(2), even though the paragraph does not state it explicitly.[34] After all, a factory preset can only be set to the most data-protection-friendly default setting when this end result has already been envisaged during the development process. Hence, as the EDPB stipulates, these concepts (should) reinforce each other. Consider the following example: a company that produces operating software for a computer has, inter alia, to consider that a customer might want to amend their data protection settings themselves, as follows from Article 25(1). However, when the computer is delivered to the customer, the default settings within the software must already be set in such a way that the data protection principles of data minimisation and confidentiality are ensured, since this follows from Article 25(2).

(3) Approved Certification Mechanism

The last paragraph of the provision is similar to Article 24(3). It states that an "approved certification mechanism pursuant to Article 42" may be used as an element to demonstrate compliance with the requirements set out in the first two paragraphs of the provision. Hence, just like in Article 24(3), it follows from the word "element" that adherence to such a mechanism only supports the assumption that the controller is compliant, and does not prove it.[35]

EDPB Guidelines: on this Article, the EDPB has issued Guidelines 4/2019 on Article 25 Data Protection by Design and by Default.

Decisions

→ You can find all related decisions in Category:Article 25 GDPR

References

  1. Bygrave, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 25 GDPR, p. 573 (Oxford University Press 2020).
  2. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 1 (C.H. Beck 2018, 2nd Edition).
  3. Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin number 1 (C.H. Beck 2020, 3rd Edition).
  4. Bygrave, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 25 GDPR, p. 576 (Oxford University Press 2020).
  5. AEPD, Guía de Privacidad desde el Diseño, October 2019, pp. 6-7 (available here).
  6. Martini, in Paal, Pauly, DS-GVO, Article 25, margin number 11 (C.H. Beck 2021, 3rd Edition), citing 'Cavoukian Privacy by Design - The 7 Foundational Principles', 2011, p. 1 (available here).
  7. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 3 (C.H. Beck 2018, 2nd Edition).
  8. Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin number 10 (C.H. Beck 2020, 3rd Edition).
  9. Martini, in Paal, Pauly, DS-GVO, Article 25, margin number 10 (C.H. Beck 2021, 3rd Edition).
  10. Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin number 19 (C.H. Beck 2020, 3rd Edition).
  11. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 8 (available here).
  12. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), pp. 8-9 (available here).
  13. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 9.
  14. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), pp. 9-10 (available here).
  15. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 10 (available here); Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin number 23 (C.H. Beck 2020, 3rd Edition).
  16. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 11 (available here).
  17. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 14 (C.H. Beck 2018, 2nd Edition).
  18. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 6 (available here).
  19. Bygrave, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 25 GDPR, p. 577 (Oxford University Press 2020).
  20. Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin number 16 (C.H. Beck 2020, 3rd Edition).
  21. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 20 (C.H. Beck 2018, 2nd Edition).
  22. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 24: These practical examples consist of (1) minimisation: limit the data to the minimum needed (selection, exclusion, cutting off and deletion by means of anonymisation, pseudonymisation; block possibilities to connect data with each other), (2) hiding: measures that prevent personal data from becoming public or known (restrict access possibilities, disassociate and aggregate credential-based attributes, mix data or encrypt them), (3) separating: separate data in different containers, isolate data or distribute them by means of anonymous blacklists, homomorphic encryption, physical and logical separation, (4) abstraction: leave out details to the highest extent possible (summarising, grouping and perturbing with aggregation in time, k-anonymity, obfuscation of measurements by noise aggregation, dynamic location granularity).
  23. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 25: These practical examples consist of (1) information: informing data subjects about the processing and its conditions via simple explanations and notifications (also: notification of data breaches, dynamic visualisation of privacy policies, privacy icons and processing alerts), (2) control: giving data subjects control over their personal data by consent, alert, choice, updating and reiteration (panels to choose preferences, active presence transmission, selection of credentials, informed consent), (3) compliance: respect and boost compliance with obligations imposed by current legislation and the controller's own privacy policies (definitions, maintenance and defence, evaluation of DPIAs, access control, management of obligations, compliance with policies), (4) demonstration: show that processing respects privacy by registration, audit and information.
  24. AEPD, Guía de Privacidad desde el Diseño, October 2019, pp. 17 et seqq: e.g. disconnecting information from each other – minimise, abstract, separate, hide; control – comply, show; transparency – inform.
  25. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 16, citing Commission, Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), 2 May 2007, p. 3: “...the use of appropriate technological measures is an essential complement to legal means and should be an integral part in any efforts to achieve a sufficient level of privacy protection...".
  26. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 15 (available here).
  27. Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin number 24 (C.H. Beck 2020, 3rd Edition).
  28. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 11 (available here).
  29. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 28 (C.H. Beck 2018, 2nd Edition); Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin numbers 25-26 (C.H. Beck 2020, 3rd Edition).
  30. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 11, paras 40-43.
  31. EDPS, ‘Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data’, 19 December 2019 (available here).
  32. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), p. 12 (available here).
  33. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default', 20 October 2020 (Version 2.0), pp. 12-14 (available here).
  34. Bygrave, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 25 GDPR, p. 577 (Oxford University Press 2020).
  35. Hartung, in Kühling, Buchner, DS-GVO BDSG, Article 25, margin numbers 25-26 (C.H. Beck 2020, 3rd Edition).