Article 25 GDPR

From GDPRhub


Article 25 - Data protection by design and by default
Chapter 4: Controller and processor

Legal Text

Article 25 - Data protection by design and by default

1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.

3. An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.

Relevant Recitals

Recital 78: Appropriate Technical and Organisational Measures
The protection of the rights and freedoms of natural persons with regard to the processing of personal data require that appropriate technical and organisational measures be taken to ensure that the requirements of this Regulation are met. In order to be able to demonstrate compliance with this Regulation, the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of minimising the processing of personal data, pseudonymising personal data as soon as possible, transparency with regard to the functions and processing of personal data, enabling the data subject to monitor the data processing, enabling the controller to create and improve security features. When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations. The principles of data protection by design and by default should also be taken into consideration in the context of public tenders.

Commentary on Article 25

Privacy by design and default was originally conceptualized in the 1990s by the Information and Privacy Commissioner of Ontario, Canada.[1] According to this approach, data protection must be considered ex ante in order to be effective. The controller must define the privacy requirements that need to be taken into account during engineering, and determine the default settings of the final product.

Article 25 GDPR aims to implement the data protection principles of Article 5 GDPR and to protect the rights of data subjects.[2] The approach should be proactive. Accordingly, privacy by design requires that developers and controllers embrace a culture of responsibility and systematically identify processes which could infringe the GDPR.[3]

(1) Controller Obligations

Article 25 GDPR addresses only the controller, and not producers of technical products, services or systems, as they do not decide on the concrete purposes and means of processing.[4] However, Recital 78 GDPR “encourages” producers to take into account the right to data protection so as to enable controllers and processors to fulfill their data protection obligations. Although they are not directly obliged, the invisible hand of demand and supply should favor producers who deliver products that adhere to the principles of data protection by design and default.

Data Protection by Design

For data processing to follow the principle of data protection by design, a controller needs to have a data strategy in place. A data strategy may consist of data guidelines, documentation, monitoring and the evaluation of measures.[5]

The GDPR does not contain concrete examples of data protection by design. However, the Spanish Data Protection Authority (AEPD) has published a useful guide with practical examples regarding strategies for data collection[6] and processing.[7]
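
As an illustration of the collection strategies cited in footnote 6 (minimisation and hiding), the following sketch shows a hypothetical sign-up flow that stores only the fields needed for its stated purpose and replaces the direct identifier with a keyed hash. The field names, the purpose and the key handling are assumptions made for this example, not requirements taken from the AEPD guide.

    # Minimal sketch of the "minimise" and "hide" strategies.
    # Field names, purpose and key handling are illustrative assumptions.
    import hashlib
    import hmac

    # Fields actually needed for the hypothetical purpose "newsletter delivery".
    FIELDS_NEEDED = {"email", "language"}

    PSEUDONYMISATION_KEY = b"keep-this-secret-and-separate"  # stored apart from the data


    def pseudonymise(value: str) -> str:
        """Replace a direct identifier with a keyed hash (hiding the identifier)."""
        return hmac.new(PSEUDONYMISATION_KEY, value.encode(), hashlib.sha256).hexdigest()


    def collect(raw_form: dict) -> dict:
        """Keep only the data needed for the purpose and hide the direct identifier."""
        minimised = {k: v for k, v in raw_form.items() if k in FIELDS_NEEDED}
        minimised["subscriber_id"] = pseudonymise(minimised.pop("email"))
        return minimised


    print(collect({"email": "ada@example.org", "language": "en",
                   "date_of_birth": "1815-12-10", "phone": "+44 20 7946 0000"}))
    # -> {'language': 'en', 'subscriber_id': '...'}  (date of birth and phone never stored)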

An important part of Article 25 GDPR is so-called “privacy engineering”.[8] Tactics for privacy engineering are needed at each step of the software design process and in the resulting privacy enhancing technologies (PETs).[9]

The design and development of the system needs a privacy verification and validation process, which consists of system integration, testing, evaluation and continuous maintenance.[10]
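
Part of such a verification and validation process can be made repeatable with an automated check. The sketch below is only an illustration with hypothetical setting names, not a method prescribed by Article 25 GDPR or the AEPD guide: a unit test that fails if a privacy-relevant default drifts away from its intended value.

    # Illustrative check that privacy-relevant defaults have not regressed.
    # The Settings class and its attribute names are hypothetical.
    import unittest
    from dataclasses import dataclass


    @dataclass
    class Settings:
        analytics_enabled: bool = False      # no tracking unless the user opts in
        profile_public: bool = False         # data not accessible to an indefinite audience
        retention_days: int = 30             # objectively justified storage period


    class PrivacyDefaultsTest(unittest.TestCase):
        def test_defaults_are_privacy_friendly(self):
            s = Settings()
            self.assertFalse(s.analytics_enabled)
            self.assertFalse(s.profile_public)
            self.assertLessEqual(s.retention_days, 30)


    if __name__ == "__main__":
        unittest.main()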

Privacy by design must also take the following criteria into account:

State of the Art Technology

Article 25 GDPR requires that the technical and organisational measures regarding processing reflect the state of the art. In general, this means that the controller has to take into account the latest developments in its field and stay up to date with technology.

Costs of Implementation

According to the EDPB Guidelines 4/2019 on Article 25 GDPR, the “incapacity to bear the costs is no excuse for non-compliance with the GDPR”.[11] The “business costs” to be taken into account cover not only the costs of implementation but also the costs of maintaining compliance.[12]

Nature, Scope, Context and Purpose of Processing

The nature of processing is “the inherent characteristics of the processing”; the scope concerns the “size and range of processing”; the context “relates to the circumstances of the processing, which may influence the expectations of the data subject”; and the purpose “pertains to the aims of the processing”.[13]

Risks of Varying Likelihood and Severity for Rights and Freedoms of Natural Persons

The GDPR foresees a risk-based approach. In order to assess these risks, the EDPB Guidelines 4/2019 refer to the EDPB Guidelines on Data Protection Impact Assessments (DPIA), which can be used to help determine risk.

Time of Determination of the Means

The determination of the means of data processing “ranges from the abstract to the concrete detailed design elements of the processing, such as the architecture, procedures, protocols, layout and appearance”.[14] The controller has to assess the appropriate measures and safeguards in order to effectively implement the obligations arising out of the GDPR.

More problematic is what to do with an existing system (one which predates the GDPR becoming applicable in 2018) that cannot easily be changed. Outdated systems incompatible with the GDPR require that companies and institutions reassess their means of processing. Because the state of the art continuously changes, updating systems will be a continuous and necessary practical component of adhering to the privacy by design principle during ongoing processing activities.[15]

Time of the Processing

During the processing operation, regular re-assessments have to take place in order to verify and maintain GDPR compliance.[16]

Necessary Safeguards

On a technical level, there need to be safeguards built into the processing to guarantee the rights of data subjects. For example, according to Article 20 GDPR, the controller needs to be able to extract the personal data concerning each data subject and provide them in a structured, commonly used and machine-readable format.
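
A minimal sketch of such a safeguard, assuming a hypothetical in-memory record store with made-up field names: the controller gathers everything held on one data subject and returns it as machine-readable JSON.

    # Sketch of a data portability export (Article 20 GDPR).
    # The in-memory "database" and its field names are hypothetical.
    import json

    RECORDS = [
        {"subject_id": "u-1001", "system": "newsletter", "email": "ada@example.org",
         "language": "en"},
        {"subject_id": "u-1001", "system": "shop", "orders": 3},
        {"subject_id": "u-2002", "system": "newsletter", "email": "bob@example.org"},
    ]


    def export_subject_data(subject_id: str) -> str:
        """Return all personal data held on one data subject as machine-readable JSON."""
        subject_records = [r for r in RECORDS if r["subject_id"] == subject_id]
        return json.dumps({"subject_id": subject_id, "records": subject_records}, indent=2)


    print(export_subject_data("u-1001"))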

(2) Data Protection by Default

A violation of Article 25 GDPR can only be established in connection with a violation of other GDPR principles.[17] Article 25(2) GDPR is lex specialis in relation to Article 25(1) GDPR: Article 25(2) GDPR requires that only the personal data which are necessary for each specific purpose be processed, while Article 25(1) GDPR regulates the general privacy-by-design obligations.[18]

Privacy by Default

“A ‘default’, as commonly defined in computer science, refers to the pre-existing or preselected value of a configurable setting that is assigned to a software application, computer program or device. Such settings are also called “presets” or “factory presets”, especially for electronic devices.”[19] It follows that if third-party software is used, controllers are obliged to disable features that collect personal data without a basis in Article 6(1) GDPR.[20] Defaults are also relevant where roles are allocated to staff who have access to data.[21] Finally, the storage period needs to be objectively justified and, if possible, data shall be deleted by default.[22]
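
The points above can be pictured with a short configuration sketch. The feature names and the 90-day retention window are purely illustrative assumptions; the idea is only that data-collecting options start switched off and that stored data is removed once the justified storage period has elapsed.

    # Sketch of data protection by default: data-collecting features start switched
    # off, and stored data is deleted once the justified retention period expires.
    # All names (features, retention period) are illustrative assumptions.
    from datetime import datetime, timedelta, timezone

    DEFAULT_FEATURES = {
        "ad_personalisation": False,   # third-party feature disabled until opt-in
        "crash_reports": False,
        "public_profile": False,       # not accessible to an indefinite number of persons
    }

    RETENTION = timedelta(days=90)     # storage period that must be objectively justified


    def purge_expired(records: list[dict]) -> list[dict]:
        """Delete by default: keep only records still within the retention period.

        Timestamps are assumed to be timezone-aware datetimes.
        """
        cutoff = datetime.now(timezone.utc) - RETENTION
        return [r for r in records if r["created_at"] >= cutoff]


    records = [{"id": 1, "created_at": datetime.now(timezone.utc) - timedelta(days=200)},
               {"id": 2, "created_at": datetime.now(timezone.utc) - timedelta(days=5)}]
    print([r["id"] for r in purge_expired(records)])   # [2]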

Appropriate Technical and Organisational Measures

Before analysing technical and organisational measures, it needs to be clarified what “appropriate” means. The EDPS Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data,[23] together with the measures described in Article 24 and Article 32 GDPR, can be used for insight. Technical measures[24] and organisational measures that implement data protection principles[25] are also named as examples in some commentaries.
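
One of the technical measures listed in footnote 24, access control, overlaps with the role and access concepts listed in footnote 25. The sketch below, with hypothetical roles, fields and an example record, filters a record down to what a given role is allowed to see; it is an illustration, not a prescribed implementation.

    # Sketch of a role and access concept (access control as a technical measure).
    # Roles, fields and the example record are hypothetical.
    ROLE_VISIBLE_FIELDS = {
        "support":   {"customer_id", "email"},
        "marketing": {"customer_id", "language"},          # no direct contact data
        "billing":   {"customer_id", "invoice_address"},
    }


    def view_for_role(record: dict, role: str) -> dict:
        """Return only the fields this role is allowed to see."""
        allowed = ROLE_VISIBLE_FIELDS.get(role, set())
        return {k: v for k, v in record.items() if k in allowed}


    record = {"customer_id": "c-42", "email": "ada@example.org",
              "language": "en", "invoice_address": "1 Main Street"}
    print(view_for_role(record, "marketing"))   # {'customer_id': 'c-42', 'language': 'en'}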

Above all, controllers have to be able to demonstrate that the measures they have implemented are effective.

Certification Mechanism

A certification mechanism could be the certification described in Article 42 GDPR, but this remains debated.[26]

Decisions

→ You can find all related decisions in Category:Article 25 GDPR

References

  1. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 1 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  2. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 8 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  3. AEPD, Guía de Privacidad desde el Diseño, October 2019, https://www.aepd.es/sites/default/files/2019-11/guia-privacidad-desde-diseno.pdf, p. 3.
  4. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 11 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  5. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 18 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  6. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 24: The practical examples consist of (1) minimisation: limiting the data processed to what is strictly needed (selection, exclusion, stripping out and deletion by means of anonymisation and pseudonymisation, blocking the possibility to link data with each other), (2) hiding: measures that prevent personal data from becoming public or known (restricting access possibilities, disassociating and aggregating credential-based attributes, mixing data or encrypting them), (3) separating: keeping data in different containers, isolating data or distributing them by means of anonymous blacklists, homomorphic encryption, physical and logical separation, (4) abstraction: leaving out details to the highest extent possible (summarising, grouping and perturbing with aggregation in time, k-anonymity, obfuscation of measurements by noise aggregation, dynamic location granularity).
  7. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 25: The practical examples consist of (1) information: informing data subjects about the processing and its conditions via simple explanations and notifications (also: notification of data breaches, dynamic visualisation of privacy policies, privacy icons and processing alerts), (2) control: giving data subjects control over their personal data by consent, alert, choice, updating and reiteration (panels to choose preferences, active presence transmission, selection of credentials, informed consent), (3) compliance: respecting and boosting compliance with obligations imposed by current legislation and the controller's own privacy policies (definitions, maintenance and defence, evaluation of DPIAs, access control, management of obligations, compliance with policies), (4) demonstration: showing that processing respects privacy by means of registration, audits and information.
  8. AEPD, Guía de Privacidad desde el Diseño, October 2019, pp. 17 et seqq.: e.g. disconnecting pieces of information from each other – minimise, abstract, separate, hide; control – comply, demonstrate; transparency – inform.
  9. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 16, citing Commission, Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), 2 May 2007, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52007DC0228&from=EN, p. 3: “...the use of appropriate technological measures is an essential complement to legal means and should be an integral part in any efforts to achieve a sufficient level of privacy protection...”.
  10. AEPD, Guía de Privacidad desde el Diseño, October 2019, p. 15.
  11. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-42019-article-25-data-protection-design-and_en, p. 9.
  12. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, pp. 8 et seq.
  13. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 9.
  14. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 10.
  15. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 14 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  16. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, pp. 10 et seq.
  17. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 3 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  18. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 8 (Beck 2018, 2nd ed.) (accessed 19 August 2021).
  19. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 11.
  20. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 11.
  21. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, pp. 11 et seq.
  22. EDPB, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, 20 October 2020, p. 13.
  23. EDPS, Guidelines on assessing the proportionality of measures that limit the fundamental rights to privacy and to the protection of personal data, 19 December 2019, https://edps.europa.eu/data-protection/our-work/publications/guidelines/assessing-proportionality-measures-limit_en, pp. 1 et seqq.
  24. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 16 (Beck 2018, 2nd ed.) (accessed 19 August 2021): (1) pseudonymization (Article 4 nr. 5), (2) encryption, (3) access controls, (4) anonymization, (5) aggregation, (6) transparency on functions and processing, (7) control of processing via dashboards, (8) purpose principle.
  25. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 17 (Beck 2018, 2nd ed.) (accessed 19 August 2021): (1) training, (2) internal checks/audits, (3) interdisciplinary project teams, (4) ethics committees for complex assessments (Article 5(1)(a) GDPR), (5) role and access concepts (Article 5(1)(c) GDPR), (6) deletion concepts (Article 5(1)(e) GDPR), (7) voluntary DPIAs (Articles 35 and 5(2) GDPR).
  26. Nolte, Werkmeister, in Gola, Datenschutz-Grundverordnung, Article 25 GDPR, margin number 32 (Beck 2018, 2nd ed.) (accessed 19 August 2021).