Article 25 GDPR

Legal Text

Article 25 - Data protection by design and by default

1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.

3. An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.

Relevant Recitals

Recital 78: Appropriate Technical and Organisational Measures
The protection of the rights and freedoms of natural persons with regard to the processing of personal data require that appropriate technical and organisational measures be taken to ensure that the requirements of this Regulation are met. In order to be able to demonstrate compliance with this Regulation, the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of minimising the processing of personal data, pseudonymising personal data as soon as possible, transparency with regard to the functions and processing of personal data, enabling the data subject to monitor the data processing, enabling the controller to create and improve security features. When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations. The principles of data protection by design and by default should also be taken into consideration in the context of public tenders.

Commentary on Article 25

The GDPR introduced a provision solely dedicated to the concepts of “data protection by design” and “data protection by default”. The Data Protection Directive did not contain a similar provision. Although Article 17 DPD and Recital 46 DPD had a similar thrust, the focus of those provisions was mostly on security.[1]

However, these concepts were not new: privacy by design and by default was originally conceptualised in the 1990s by the Information and Privacy Commissioner of Ontario, Canada, who held that, in order to be effective, data protection must be implemented ex ante. Hence, the controller must define the privacy requirements that need to be taken into account during engineering and determine the default settings of the final product.[2] Because of the differences between privacy and data protection, the GDPR speaks of data protection by design and by default, rather than privacy by design and by default.[3]

The overall thrust of the provision is to impose an obligation on controllers to put in place technical and organisational measures that are designed to implement the data protection principles and the rights of data subjects.[4] Although the controller is responsible for adherence to these principles, Recital 78 stipulates that producers of applications, products and services are encouraged to consider the data protection obligations that controllers need to fulfil. Hence, the goal is to have developers and controllers embrace a culture of responsibility and systematically identify processes which could infringe the GDPR,[5] and to strengthen the data subject's trust in the processing systems.[6]

Article 25 is structured as follows: the first paragraph describes the principle of data protection by design in more detail. The second paragraph expands on this by describing the principle of data protection by default. The third paragraph is similar to the third paragraph of Article 24, since it explains that an approved certification mechanism, pursuant to Article 42, “may be used as an element to demonstrate compliance”.[7]

(1) Data Protection by Design

Data Protection by Design - the Meaning of the Principle

The principle of data protection by design follows from the realisation that the data protection principles can best be assured when they are already integrated into the architectural design of the specific technology. As in Article 24(1), the controller must implement appropriate technical and organisational measures to ensure compliance with the data protection principles. However, Article 25(1) is different because "technology is no longer the object of regulation, but the content".[8] To determine the appropriateness of the measures, the controller must consider several elements. As in Article 24(1), the controller must follow a risk-based approach, and the measures must therefore also be considered in the light of the principle of proportionality.[9]

Elements to Take into Account

State of the Art

In general, this means that the controller has to take into account the latest developments in its field and has to stay up to date with technology. However, "state of the art" also refers to organisational measures, meaning that internal policies, training, etc. must be updated accordingly. Although existing standards can indicate what is "state of the art", this assessment must be carried out continuously.[10]

Cost of Implementation

With "cost", resources in general are meant, including time spent and human resources. Although alternative, less resource demanding (but effective) measures can be used, "the cost of implementation is a factor to be considered to implement data protection by design rather than a ground to not implement it".[11]

Nature, Scope, Context and Purpose of Processing

These criteria have the same meaning as in Article 24(1) and Article 32(1). The "nature" refers to “the inherent characteristics of the processing” (e.g., whether sensitive data is processed); the "scope" refers to the size and range of the processing; the "context" relates to all relevant circumstances; and the "purpose" is the aim of the processing.[12]

Risks of Varying Likelihood and Severity for Rights and Freedoms of Natural Persons

As in Article 24(1) and Article 32(1), the same (above-mentioned) conditions must be considered, in order to protect the same rights against the same risks. Given this risk-based approach, a controller can perform a Data Protection Impact Assessment (DPIA) to assess these risks. Although "best practices and standards" may be used as a "useful toolbox", such a DPIA must, in principle, always be carried out on a case-by-case basis.[13]

Time Aspect

As with the criterion of "state of the art", controllers must assess their implemented measures continuously to ensure data protection by design "at the time of the processing". However, by stipulating that data protection by design shall also be implemented "at the time of the determination of the means for processing", the legislator made clear that the controller must already consider the principle during the planning and development stage. Hence, the processing operations should be considered as early as possible, and the controller cannot use the "excuse" that implementing data protection friendly measures at a later stage would lead to disproportionately high costs.[14] More problematic is what to do with an existing system (one that predates the entry into force of the GDPR) that cannot easily be changed. Companies and institutions must re-assess their means of processing if the systems they use are outdated and incompatible with GDPR compliance. Because the state of the art changes continuously, updating systems will be a continuous and necessary practical component of adhering to the data protection by design principle during ongoing processing activities.[15]

Types of Measures and Necessary Safeguards

As with Article 24(1), the measures to be implemented to ensure compliance with the principle of data protection by design must be understood in a broad sense. Any method that implements the data protection principles "effectively" and satisfies the above-mentioned criteria can be used. As the EDPB stipulates, the "appropriateness" requirement is closely related to the requirement of "effectiveness".[16] Although "pseudonymisation" is the only measure listed in the provision as an example, the training of personnel, limiting access to personal data, or any technical measure such as anonymisation or advanced encryption could all be effective measures. What distinguishes these measures from those under Article 24(1) is that they are built into the design of the processing itself; for example, the automatic erasure of certain personal data by the software to comply with the principle of storage limitation.[17] However, not only active measures by the controller or developer are meant: the possibility for the data subject to exercise their rights and control the extent of processing through dashboards is another example of such a measure.[18]
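
To illustrate what such a "designed-in" measure could look like in practice, the following sketch is a hypothetical illustration (not taken from the EDPB guidelines or the cited commentaries): it pseudonymises a direct identifier with a keyed hash and automatically erases records once an assumed retention period has elapsed, echoing the storage-limitation example above. The key handling, the 30-day retention period and the record structure are assumptions made purely for the example.

```python
# Hypothetical sketch of pseudonymisation and automatic erasure "by design".
# Key handling, retention period and record structure are illustrative assumptions.
import hashlib
import hmac
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

PSEUDONYMISATION_KEY = b"keep-this-key-separate-from-the-data"  # stored apart from the data set
RETENTION_PERIOD = timedelta(days=30)  # must be justified by the specific purpose

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (cf. Article 4(5) GDPR)."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode(), hashlib.sha256).hexdigest()

@dataclass
class Record:
    user_pseudonym: str
    payload: str
    collected_at: datetime

def erase_expired(records: list[Record]) -> list[Record]:
    """Drop records older than the retention period (storage limitation by design)."""
    cutoff = datetime.now(timezone.utc) - RETENTION_PERIOD
    return [r for r in records if r.collected_at >= cutoff]

# Usage: store only the pseudonym, never the raw identifier, and purge on a schedule.
records = [Record(pseudonymise("alice@example.com"), "page_visit", datetime.now(timezone.utc))]
records = erase_expired(records)
```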

To deal with the broad range of measures that can be taken, a controller needs to have a data strategy in place. Such a data strategy may consist of data guidelines, documentation, monitoring and the evaluation of measures.[19] The GDPR does not contain concrete examples of data protection by design. However, the Spanish Data Protection Authority has published a useful guide with practical examples regarding a strategy for data collection[20] and processing.[21] Moreover, an important part of Article 25 GDPR is so-called "privacy engineering".[22] Privacy engineering tactics are needed at each step of the software design process and in the resulting PETs (Privacy Enhancing Technologies).[23] The design and development of the system requires a privacy verification and validation process, consisting of integration of the system, testing, evaluations and continuous maintenance.[24]
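
As a hypothetical illustration of two of the collection strategies described in the AEPD guide (minimising the data collected and abstracting away detail), the sketch below drops fields that the stated purpose does not require and replaces an exact birth date with a coarse age band. The field names, the whitelist and the ten-year age bands are assumptions made for the example, not taken from the guide.

```python
# Hypothetical sketch of "minimise" (keep only needed fields) and "abstract"
# (reduce detail) at the point of collection. Field names are assumptions.
from datetime import date

ALLOWED_FIELDS = {"email", "birth_date"}  # whitelist derived from the specific purpose

def minimise(raw_record: dict) -> dict:
    """Drop every field that the stated purpose does not require."""
    return {k: v for k, v in raw_record.items() if k in ALLOWED_FIELDS}

def abstract_age(birth_date: date, today: date) -> str:
    """Replace an exact birth date with a coarse ten-year age band."""
    age = today.year - birth_date.year - ((today.month, today.day) < (birth_date.month, birth_date.day))
    lower = (age // 10) * 10
    return f"{lower}-{lower + 9}"

# Usage: the phone number is never stored, and the birth date is generalised.
raw = {"email": "alice@example.com", "birth_date": date(1990, 5, 1), "phone": "+49123456789"}
record = minimise(raw)
record["age_band"] = abstract_age(record.pop("birth_date"), date.today())
```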

(2) Data Protection by Default

Data Protection by Default - The Meaning of the Principle and Differences with Data Protection by Design

The principle of data protection by default means that a product or service should have the most data protection-friendly settings configured when it is first turned on or used.[25] The word "default" comes from computer science and refers to "the pre-existing or preselected value of a configurable setting". Hence, in the case of electronic products, the "factory presets" should conform to the highest data protection standard.[26]

Although the principles of data protection by design and by default are similar, there are considerable differences between them. First, "by design" is broader than "by default", since the focus of the latter principle is on ensuring data minimisation and confidentiality. Moreover, whereas "by design" focuses on the stages of the development of the product, "by default" focuses more on the end result: are the settings configured in such a way that data minimisation and confidentiality are ensured?[27] Although only Article 25(1) mentions that the measures apply to both the development and the processing stage, the same has to be assumed for Article 25(2), even though that paragraph does not state it explicitly.[28] After all, a factory preset can only be set to the most data protection-friendly default setting when this end result has already been envisaged during the development process. Hence, as the EDPB stipulates, these concepts (should) reinforce each other.[29]

Consider the following example: a company that produces operating software for a computer has, inter alia, to ensure that a customer can amend their data protection settings themselves, as follows from Article 25(1). However, when the computer is delivered to the customer, the default settings within the software must already be configured in such a way that the data protection principles of data minimisation and confidentiality are ensured, as follows from Article 25(2).
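
A minimal sketch of how such protective defaults could be expressed in software is given below. The setting names, their values and the opt-in mechanism are assumptions chosen for illustration; they are not prescribed by the GDPR or the EDPB guidelines.

```python
# Hypothetical sketch: the most data protection-friendly values are the defaults,
# and any less protective choice requires an explicit act by the user (Article 25(2)).
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    telemetry_enabled: bool = False      # no usage data collected by default
    personalised_ads: bool = False       # no profiling by default
    profile_visibility: str = "private"  # not accessible to an indefinite audience
    retention_days: int = 30             # shortest storage period needed for the purpose

def apply_user_choice(settings: PrivacySettings, **choices) -> PrivacySettings:
    """Only deviate from the protective defaults when the user actively opts in."""
    for name, value in choices.items():
        if hasattr(settings, name):
            setattr(settings, name, value)
    return settings

# Factory preset: protective defaults apply until the data subject intervenes.
settings = PrivacySettings()
settings = apply_user_choice(settings, personalised_ads=True)  # explicit opt-in
```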

Requirements that follow from the principle

It follows that if third-party software is used, controllers are obliged to disable features that collect personal data without a basis in Article 6(1) GDPR.[30] Defaults are also relevant where roles are allocated to staff who have access to data.[31] Finally, the storage period needs to be objectively justified and, if possible, data shall be deleted by default.[32]
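
As a hypothetical illustration of the first two requirements, the sketch below keeps a data-collecting third-party feature disabled unless a legal basis under Article 6(1) GDPR has been documented, and limits staff access to personal data by role. The role names, data fields and legal-basis flag are assumptions made for the example, not taken from the EDPB guidelines.

```python
# Hypothetical sketch of default-off third-party data collection and
# role-based access allocation; roles and fields are illustrative assumptions.
from enum import Enum

class Role(Enum):
    SUPPORT = "support"
    MARKETING = "marketing"
    ADMIN = "admin"

# Which personal data fields each role may read by default (least privilege).
ROLE_ACCESS = {
    Role.SUPPORT: {"name", "ticket_history"},
    Role.MARKETING: set(),  # no personal data by default
    Role.ADMIN: {"name", "ticket_history", "email"},
}

def third_party_analytics_enabled(legal_basis_recorded: bool = False) -> bool:
    """A data-collecting feature stays off unless an Article 6(1) basis is documented."""
    return legal_basis_recorded

def can_access(role: Role, field: str) -> bool:
    """Staff only see the fields their role needs for its specific purpose."""
    return field in ROLE_ACCESS.get(role, set())

# Usage: collection stays disabled and access stays narrow unless deliberately widened.
assert not third_party_analytics_enabled()
assert can_access(Role.SUPPORT, "ticket_history")
assert not can_access(Role.MARKETING, "email")
```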


Article 25 GDPR can only be violated in connection with a violation of other GDPR principles.[33] Article 25(2) GDPR is lex specialis in relation to Article 25(1) GDPR: Article 25(2) GDPR states that only the personal data which are necessary for each specific purpose shall be processed, while Article 25(1) GDPR regulates general privacy by design obligations.[34]

Appropriate Technical and Organisational Measures

Before analysing technical and organisational measures, it needs to be clarified what “appropriate” means. The term also appears in Article 24 GDPR and Article 32 GDPR, and the EDPS Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data[35] can be used for guidance. Some commentaries also name examples of technical measures[36] and organisational measures that implement the data protection principles.[37]

Above all, controllers have to be able to demonstrate that the measures they have implemented are effective.

(3) Approved Certification Mechanism

Such an approved certification mechanism could be a certification as described in Article 42 GDPR, although this remains debated.[38]

Decisions

→ You can find all related decisions in Category:Article 25 GDPR

References

  1. Bygrave, in Kuner et al., The EU General Data Protection Regulation (GDPR), Article 25 GDPR, p. 573 (Oxford University Press 2020).
  2. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 1 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  3. Hartung, in Kühling & Buchner, DS-GVO BDSG, Art. 25, para 1 (C.H.Beck 2020).
  4. Bygrave, in Kuner et al., The EU General Data Protection Regulation (GDPR), Article 25 GDPR, p. 576 (Oxford University Press 2020).
  5. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 3.
  6. Martini, in Paal & Pauly, DS-GVO Art. 25, para 11 (C.H.Beck 2021), citing Cavoukian, 'Privacy by Design - The 7 Foundational Principles' (2011), p. 1.
  7. Hartung, in Kühling & Buchner, DS-GVO BDSG, Art. 25, para 1 (C.H.Beck 2020).
  8. Martini, in Paal & Pauly, DS-GVO Art. 25, paras 9-10 (C.H.Beck 2021).
  9. Hartung, in Kühling & Buchner, DS-GVO BDSG, Art. 25, para 19 (C.H.Beck 2020).
  10. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), p. 8, paras 18-22.
  11. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), pp. 8-9, paras 23-25.
  12. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), p. 9, paras 26-28.
  13. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), pp. 9-10, paras 29-32.
  14. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), p. 10, paras 33-36, and Hartung, in Kühling & Buchner, DS-GVO BDSG, Art. 25, para 23 (C.H.Beck 2020).
  15. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 14 (Beck 2018, 2nd ed.).
  16. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), p. 6, paras 7-10. See these guidelines for a more extensive elaboration on "effectiveness".
  17. Bygrave, in Kuner et al., The EU General Data Protection Regulation (GDPR), Article 25 GDPR, p. 577 (Oxford University Press 2020).
  18. Hartung, in Kühling & Buchner, DS-GVO BDSG, Art. 25, para 16 (C.H.Beck 2020).
  19. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 18 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  20. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 24: These practical examples consist of (1) minimisation: limit the processing of personal data as much as possible (selection, exclusion, cutting off and deletion by means of anonymisation, pseudonymisation, blocking the possibility to link data with each other), (2) hiding: measures that prevent personal data from becoming public or known (restrict access possibilities, disassociate and aggregate credential-based attributes, mix data or encrypt them), (3) separating: separate data in different containers, isolate data or distribute them by means of anonymous blacklists, homomorphic encryption, physical and logical separation, (4) abstraction: leave out details to the highest extent possible (summarising, grouping and perturbing with aggregation in time, k-anonymity, obfuscation of measurements by noise aggregation, dynamic location granularity).
  21. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 25: These practical examples consist of (1) information: inform data subjects on the processing and its conditions via simple explanations and notifications (also: notification of data breaches, dynamic visualisation of privacy policies, privacy icons and processing alerts), (2) control: give data subjects control over their personal data through consent, alerts, choice, updating and retraction (panels to choose preferences, active presence transmission, selection of credentials, informed consent), (3) compliance: respect and reinforce compliance with the obligations imposed by current legislation and the controller's own privacy policies (definitions, maintenance and defence, evaluation of DPIAs, access control, management of obligations, compliance with policies), (4) demonstration: show that the processing respects privacy by means of records, audits and reporting.
  22. AEPD, Guía de Privacidad desde el Diseño, October 2019, pp. 17 et seqq.: e.g. disassociating information from each other (minimise, abstract, separate, hide); control (comply, demonstrate); transparency (inform).
  23. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 16, citing Commission, Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), 2 May 2007, p. 3: “...the use of appropriate technological measures is an essential complement to legal means and should be an integral part in any efforts to achieve a sufficient level of privacy protection...”.
  24. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 15.
  25. Hartung, in Kühling & Buchner, DS-GVO BDSG, Art. 25, para 24 (C.H.Beck 2020).
  26. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), p. 11, paras 40-41.
  27. Bygrave, in Kuner et al., The EU General Data Protection Regulation (GDPR), Article 25 GDPR, p. 577 (Oxford University Press 2020).
  28. Bygrave, in Kuner et al., The EU General Data Protection Regulation (GDPR), Article 25 GDPR, p. 577 (Oxford University Press 2020).
  29. EDPB, 'Guidelines 4/2019 on Article 25 Data Protection by Design and by Default' Version 2.0 (2020), p. 6, para 5.
  30. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 11.
  31. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, pp. 11 et seq.
  32. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 13.
  33. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 3 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  34. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 8 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  35. EDPS, Guidelines on Assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data, 19 December 2019, pp. 1 et seqq.
  36. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 16 (Beck 2018, 2nd ed.) (accessed 19 August 2021): (1) pseudonymisation (Article 4(5) GDPR), (2) encryption, (3) access controls, (4) anonymisation, (5) aggregation, (6) transparency on functions and processing, (7) control of processing via dashboards, (8) purpose principle.
  37. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 17 (Beck 2018, 2nd ed.) (accessed 19 August 2021): (1) training, (2) internal checks/audits, (3) interdisciplinary project teams, (4) ethics committees for complex assessments (Article 5(1)(a) GDPR), (5) role and access concepts (Article 5(1)(c) GDPR), (6) deletion concepts (Article 5(1)(e) GDPR), (7) voluntary DPIAs (Articles 35 and 5(2) GDPR).
  38. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 32 (Beck 2018, 2nd ed.) (accessed 19 August 2021).