Article 32 GDPR
Legal Text
1. Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:
- (a) the pseudonymisation and encryption of personal data;
- (b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
- (c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
- (d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.
2. In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.
3. Adherence to an approved code of conduct as referred to in Article 40 or an approved certification mechanism as referred to in Article 42 may be used as an element by which to demonstrate compliance with the requirements set out in paragraph 1 of this Article.
4. The controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller, unless he or she is required to do so by Union or Member State law.
Relevant Recitals
Commentary
Article 32(1) GDPR reflects the principle of integrity and confidentiality enshrined in Article 5(1)(f) GDPR. The controller and the processor must implement appropriate technical and organisational measures in order to achieve an appropriate level of security, taking into account the state of the art, the implementation costs as well as the nature, scope, context and purposes of processing. Consideration must also be given to the processing operation's impact on the rights and freedoms of natural persons.
(1) Measures appropriate to the risk
Article 32 GDPR requires controllers and processors to implement measures that ensure an appropriate level of security. The word 'appropriate' appears no fewer than three times in Article 32(1): "the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate". This indicates that controllers and processors must identify the situation-specific risks, assess their potential impact having regard to the particular circumstances of the processing, and implement measures to mitigate (at least) those risks which are the most likely to materialise and those whose impact would be the most severe.
The controller and the processor
Article 32 addresses both the controller and the processor. Hence, in contrast to the controller's general responsibility under the GDPR, the provision sets out an additional and direct responsibility of the processor for ensuring security. The provision does not directly apply to software and device manufacturers. However, as under Article 25, Article 32 GDPR holds practical significance when it comes to the controller's or processor's selection of software and hardware, which must meet the required security standard. In situations of uncertainty, products that do not enable data protection-compliant usage should not be employed.
Case-law: The Finnish DPA held that four cities in Finland had unlawfully transferred personal data to the US by using Google Analytics and Google Tag Manager on their public library online services, in breach of, among other provisions, Article 32 GDPR. The DPA ordered them to delete the data collected through these tools.
Taking into account
Article 32(1) outlines the factors that the controller and processor must consider when conducting the assessment on which measures are "appropriate" to contain the risks. These factors include the state of the art, the implementation costs, the nature, scope, context, and purposes of the processing, as well as the varying likelihood and severity of risks to the rights and freedoms of individuals.
State of the art
Article 32(1) GDPR makes use of the same concept - "state of the art" - that is already used in Article 25(1). For this reason, we refer to the commentary on that provision. The practical question whether "state of the art" encompasses the latest technological advancements or whether adherence to industry standards and practices is sufficient is particularly relevant in the context of security systems. According to the majority of scholars, "state of the art" does not necessarily refer to the absolute best or most advanced technologies. Rather, it encompasses well-established, proven, and effective measures that are currently available on the market.
Case-law: Sending an e-mail containing sensitive data with ordinary TLS encryption instead of end-to-end encryption, widely available on the market, was deemed insufficiently secure under Article 32(1) GDPR. The controller received a reprimand instead of a fine, as it had since increased the security of its communication solutions.
Costs of implementation
The second criterion for selecting appropriate technical and organisational measures is the cost of implementation. When calculating costs, all necessary expenses related to the implementation of the measures should be considered, including acquisition and installation costs prior to the start of processing, as well as operational and maintenance costs during ongoing processing. An economic analysis should be conducted to assess the relationship between costs and risks. The higher the level of risk associated with the data processing, the more economically justifiable it is for the controller to invest in implementing the necessary technical measures. Cost-intensive measures that offer minimal additional risk reduction may not be required.
Nature, scope, context and purpose of processing
This includes the nature of the processing (manual or automated), the scope of the processing (amount of data subjects affected, amount of data collected, bulk or individual processing, sensitivity of the data), the context of the processing (how many parties are involved, which systems are used, etc.), and the purposes of the processing.
Risks of varying likelihood and severity for rights and freedoms of natural persons
Finally, controllers and processors should complete a risk assessment before carrying out a processing operation. Although the provision applies to all types of risks, Article 32(2) GDPR indicates that specific attention should be paid to certain categories of risk, such as the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed. Recital 83 GDPR also suggests that not all of these are actually relevant. Indeed, the risk should also "lead to physical, material or non-material damage". It can therefore be argued that a potential risk of, for example, "destruction" will not be relevant if no tangible harm is caused to data subjects.
For further elements, see commentary under Article 32(2) GDPR below.
Shall implement measures to ensure a level of security appropriate to the risk
Controllers and processors should identify security measures that, taking into account the above criteria, can mitigate identified risks.
The GDPR does not require the use of any particular technology or technical standard with regard to data security. Indeed, Recital 15 stipulates that "the protection of natural persons should be technologically neutral and should not depend on the techniques used". Technical measures typically encompass hardware, software, and network components that are directly involved in data processing. Additionally, organisational measures should also be implemented. For example, controllers and processors should: allocate responsibility between themselves; require each person authorised to process personal data to participate in training activities; draft and follow internal policies, disciplinary measures, and internal guidelines; and adhere to codes of conduct or certification mechanisms.
After having identified a list of theoretically applicable measures, controllers and processors must choose and implement measures which can ensure an "appropriate" level of security. This indicates that they must implement measures that can mitigate at least those risks which are the most likely to materialise and those whose impact would be the most severe. The controller and the processor are required to undertake suitable technical and organisational measures within their respective domains. They must document the rationale behind their selection of these measures and how they evaluated the above criteria.
Including inter alia as appropriate
Article 32(1) GDPR enumerates four examples of technical security measures that controllers and processors should implement "as appropriate". These include pseudonymisation and encryption of personal data (Article 32(1)(a) GDPR); the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems (Article 32(1)(b) GDPR); the ability to restore the availability and access to personal data in a timely manner in case of an incident (Article 32(1)(c) GDPR) and a process for regularly testing, assessing and evaluating the effectiveness of security measures (Article 32(1)(d) GDPR).
(a) Pseudonymisation and encryption
According to Article 4(5) GDPR, "pseudonymisation" means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.
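The mechanism defined in Article 4(5) GDPR can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not a prescribed method: a keyed HMAC replaces the direct identifier with a token, and the secret key plays the role of the "additional information" that must be kept separately and protected by its own technical and organisational measures.

```python
import hmac
import hashlib

# Hypothetical example: this key is the "additional information" of
# Article 4(5) GDPR and would have to be stored in a separate system,
# apart from the pseudonymised records themselves.
SECRET_KEY = b"keep-this-key-in-a-separate-system"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 token."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"name": "Jane Doe", "city": "Helsinki"}
pseudonymised_record = {"subject": pseudonymise(record["name"]),
                        "city": record["city"]}

# Without the key, the token cannot be attributed to the data subject;
# holding the key, the controller can re-derive the same token.
assert pseudonymise("Jane Doe") == pseudonymised_record["subject"]
```

Because the token is deterministic for a given key, the controller can still link records belonging to the same person, while a party without the key cannot reverse the attribution.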
Although the General Data Protection Regulation (GDPR) does not explicitly define the concept of encryption, it is described in Article 34(3)(a) according to which encryption is an effective method for rendering personal data inaccessible to unauthorized individuals. The cryptographic techniques employed for encryption, in accordance with the current state of the art, play a vital role in ensuring information security in line with the provisions of Article 32 GDPR.
(b) Ongoing confidentiality, integrity, availability and resilience of processing systems and services
Controllers and processors must implement measures which are able to ensure ongoing confidentiality, integrity, availability, and resilience of the processing systems and services used. The objective is to continuously maintain security requirements over the entire duration of processing ("ongoing..."). This ensures that security measures are consistently applied in a dynamic and ongoing manner.
The term "systems" should be interpreted broadly, encompassing not only technical systems for automated processing but also systems supporting paper-based processing. Similarly, the concept of "services" falls under the same understanding. The individuals involved in handling these systems and services play a role in the processing of personal data. Regular training to maintain an ongoing awareness of risks and the importance of adhering to established rules for data protection are also essential and must not be bypassed.
"Confidentiality" is not only a technical notion but a GDPR principle established in Article 5(1)(f) GDPR. It entails safeguarding information from unauthorized disclosure, allowing access to confidential data and information solely by authorized individuals through approved means. Measures implemented to ensure or facilitate confidentiality include data encryption, access control systems, permissions, and authentication based on one or multiple factors.
"Integrity" is also referred to as a principle in Article 5(1)(f) GDPR. Various references to integrity can be found in different contexts, such as the completeness or authenticity of data (Recital 49) and the right to rectification (Article 16 GDPR). Integrity encompasses both the integrity of data and the proper functioning of systems. Unauthorized modifications constitute a breach of information integrity, impacting not only the actual content but also meta-data such as the author, sender, and creation timestamp. To protect integrity, measures such as electronic signatures or check digits, input controls (logging) to track who accessed, modified, or deleted personal data and when, as well as authorization systems for managing access, are employed.
The term "availability" refers to the likelihood that a technical system will fulfill specific requests within an agreed-upon timeframe. Systems and services are considered available if they can be used at any given time. When systems and services are unavailable, controllers and processors are unable to maintain control over the personal data processed through them. Impairments to availability can range from minor delays in usability to complete system failures that last for an extended period. Various factors can contribute to such impairments, including defective hardware, faulty software, power supply issues, or disrupted network connections. To ensure availability, diverse security measures may need to be implemented, such as providing complete replacement systems or individual backup systems, emergency power supplies, and conducting regular functional tests.
Companies will also have to ensure the "resilience" of the systems and services related to processing. The GDPR does not describe which measures contribute positively to resilience. In IT, resilience is the ability of an information system to continue to: (i) operate under adverse conditions or stress, even if in a degraded or debilitated state, while maintaining essential operational capabilities; and (ii) recover to an effective operational posture in a time frame consistent with mission needs. In other words, it is about the tolerance and compensatory ability of a system against disturbances.
Example: Suppose there is a network consisting of multiple nodes. If one of the network nodes is targeted and disrupted by an attack, the system demonstrates resilience if the overall functionality of the network remains unaffected because another network node seamlessly takes over its function. In this scenario, the system's resilience is evaluated based on its ability to withstand a certain number of attacks without experiencing significant functional limitations. This measurement, known as "k-resilience," quantifies the system's capability to handle multiple attacks while maintaining its operational integrity.
Resilience measures involve minimizing attack possibilities, such as by implementing essential functionality and promptly installing security updates to improve systems security. Additionally, detecting attacks or disruptions and responding appropriately is crucial. Training employees to identify issues and learn from mistakes is also important. Processing should be designed to minimize damage during incidents, such as by transitioning to a fail-safe mode or automatically disconnecting from the Internet in given cases. Creating opportunities for intervention, including human intervention, can further enhance resilience.
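The k-resilience idea from the network example above can be sketched as a toy model: a service backed by redundant nodes stays available as long as at least one node survives, and its k-resilience is the number of node failures it tolerates. All class and node names here are hypothetical.

```python
class RedundantService:
    """Toy model of a service replicated across several nodes."""

    def __init__(self, nodes):
        self.nodes = set(nodes)
        self.failed = set()

    def fail(self, node):
        # An attack or fault takes this node out of service.
        self.failed.add(node)

    def is_available(self):
        # The service survives as long as one healthy node remains.
        return bool(self.nodes - self.failed)

    def k_resilience(self):
        # Number of node losses the service can tolerate.
        return len(self.nodes) - 1

service = RedundantService(["node-a", "node-b", "node-c"])
service.fail("node-a")          # one node is disrupted by an attack...
assert service.is_available()   # ...another seamlessly takes over
```

Here the three-node service is 2-resilient: it withstands two node failures without losing its function, and only the loss of all three nodes makes it unavailable.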
(c) Ability to restore availability and access to personal data in a timely manner
Article 32(1)(c) requires that systems and services be capable of restoring the availability and access to personal data in the event of a physical or technical incident. Unlike in the preceding provision (b), in this case the incident has occurred (and it must have a technical or physical nature). For example, one can consider scenarios such as physical destruction or loss of storage devices, unauthorized data encryption resulting from a ransomware attack (technical incident), or the destruction of system components due to fire, flooding, or devastation (physical incident). In such cases, controllers and processors must be able to restore availability and access to personal data.
In practice, such security measures must be implemented before the incident occurs in order to be effective. Examples of measures include regular backups or the need to store data redundantly in different locations.
The expression "in a timely manner" does not establish a specific deadline, as the timeline depends on the factual circumstances and the seriousness of the issues the controller is dealing with. However, this wording must be understood as indicating that the controller shall restore availability and access "as soon as possible".
(d) Regularly testing, assessing and evaluating...
As the wording of this provision suggests, these are not measures that directly protect data and system security. They are instead "meta-measures" aimed at guaranteeing that security systems remain effective from a dynamic perspective, namely over time. A typical example is a penetration test, in which IT experts simulate an external attack in order to evaluate the response of existing security mechanisms. The GDPR does not specify a timeframe for such testing operations. However, the frequency of checks depends mainly on the type of risks a given system must face and on ongoing technological developments in the field concerned.
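In its simplest form, such a regular testing process could look like a scheduled battery of checks whose results are recorded and reviewed. The check functions below are placeholders standing in for real verifications (backup recency, transport encryption, and so on); the names are invented for the sketch.

```python
# Hypothetical Article 32(1)(d)-style review: each safeguard has a
# check, and the review reports which safeguards are still effective.
def check_backups_recent() -> bool:
    # Placeholder: in practice, verify the last backup's timestamp.
    return True

def check_transport_encryption() -> bool:
    # Placeholder: in practice, probe the endpoint's TLS configuration.
    return True

SECURITY_CHECKS = {
    "backups": check_backups_recent,
    "transport_encryption": check_transport_encryption,
}

def run_security_review() -> dict:
    """Run every registered check and collect the results."""
    return {name: check() for name, check in SECURITY_CHECKS.items()}

report = run_security_review()
assert all(report.values())   # every safeguard passed this review
```

Keeping such reports over time also serves the accountability principle, since they document that effectiveness was in fact tested regularly.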
(2) Certain risks must always be taken into account
The second paragraph builds on Article 32(1) in the sense that it provides examples of the risks that are most likely to occur, and/or the most severe (accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to processed personal data). Burton notes that these risks are somewhat similar to the criteria that need to be considered when a DPIA is carried out.
The first three typical risks - destruction, loss and alteration - are strictly connected with the principles of integrity, availability and resilience, whereas the last two - unauthorised disclosure and access - mainly concern the principle of confidentiality. "Destruction" is the irreversible erasure of personal data from their material support or a permanent impairment of the system enabling the processing. On the other hand, "loss" must be understood as unavailability of still existing data or processing systems. Finally, "alteration" means any significant change to the data or the systems.
"Disclosure" and "access" mirror phenomena tackled elsewhere in the GDPR and thus do not require further specification. What is fundamental here is that disclosure to and excess by third parties was unauthorised. The problem arises whether the lack of authorisation should be interpreted strictly or broadly. In other words, it is not clear whether security measures should protect data and systems from intrusion which were not authorised by the data subject, or if the obligation covers also objective obligation of confidentiality established by the law. The restrictive approach must be preferred, as obligations of confidentiality in national law differ in each member state and are usually set forth with the aim to guarantee rights and interests other than data protection.
Finally, there is no reason to consider the adjectives "accidental" and "unlawful" as more than two typical ways through which risks included in paragraph (2) can occur.
(3) Codes of conduct and certification mechanisms
In line with the accountability principle, Article 32(3) GDPR stipulates that adherence to codes of conduct or certification mechanisms (Articles 40 and 42 GDPR respectively) can be used as an element to demonstrate compliance with the Regulation, although neither of these reduces the controller's or processor's responsibility (Article 42(4) GDPR). However, if asked to prove that they took all possible measures to avoid a violation, adherence to these would undoubtedly play an important role.
(4) Persons acting under the authority of the controller or processor
Most processing operations involve the use of human resources. Whenever a person is authorised to access personal data, a data security issue arises. For these reasons, Article 32(4) GDPR requires the controller and processor to ensure that such persons act solely and exclusively on the instructions of the controller, with the only exception of those individual operations that are imposed by EU or national law. The verb 'ensure' indicates that controllers and processors must provide some form of guarantee with respect to the result.
The provision also speaks of "any natural person [...] who has access to personal data". This includes employees, freelancers, interns, external consultants or employees of service companies, insofar as they have access to personal data. In contrast, other third parties, including visitors or customers who access the data unlawfully, are not included in the definition, as they do not act "under the authority of the controller or processor".
To comply with this obligation, controllers and processors should establish clear rules of conduct, internal instructions and sanctioning procedures. Technical measures that prevent unauthorised access are also essential in certain cases. Controllers and processors must also regularly check whether these measures are effective and actually followed.
Decisions
→ You can find all related decisions in Category:Article 32 GDPR
References
- The CJEU has consequently recognised data security as an integral part of the right to data protection in Article 8 of the Charter of Fundamental Rights of the European Union. See, CJEU, Joined Cases C‑293/12 and C‑594/12, Digital Rights Ireland Ltd, 8 April 2014, margin number 29 (available here).
- References to "appropriateness" "can be seen as a way of expressing the importance of the principle of proportionality, which is a general principle of EU law, in determining how to ensure data security. A proportionality analysis generally inquires whether the means used to achieve an aim corresponds to the importance of the aim and whether it is necessary for its achievement". Burton, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 32 GDPR, p. 635 (Oxford University Press 2020).
- Unlike the general responsibility for data processing under Articles 24 and 25, which lies on the controller, under Article 32 the processor bears direct responsibility for ensuring security. This creates an additional obligation beyond the one derived from the contract with the controller under Article 28(3)(c) of the GDPR. The underlying rationale for this choice is practical. While the controller theoretically has the ability to assess the processor's security standards (Article 28(3)(h) GDPR), in practice, it is the processor who possesses direct knowledge and control over the (portion of) processing operations it is entrusted with. The legislative decision is therefore sensible and logical. Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 4 (C.H. Beck 2020, 3rd Edition).
- Tietosuojavaltuutetun toimisto (Finland) - 4672/161/2022 (available here).
- When determining the available technology, it is essential to consider various factors, including market conditions. The assessment should take into account the prevailing market situation and the practices followed by competitors. Furthermore, it is important to consider information and recommendations provided by government agencies. For example, guidance from authorities like the Federal Office for Information Security (BSI) and their IT baseline protection catalogs can provide valuable insights and should be taken into consideration. Piltz, in Gola, Datenschutz-Grund-verordnung, Article 32 GDPR, margin number 19 (C. H. Beck 2018, 2nd edition).
- IMY (Sweden) - DI-2021-4355 (available here).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 11 (C.H. Beck 2020, 3rd Edition).
- However, it is neither feasible nor practical to strictly separate these two categories. For instance, establishing a role and authorization framework to ensure confidentiality and purpose limitation necessitates both defining authorizations as an organizational measure and implementing them technically through the allocation of access credentials. See, Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 5 (C.H. Beck 2020, 3rd Edition).
- Burton, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 32 GDPR, p. 636 (Oxford University Press 2020).
- Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 20 (C.H. Beck 2019).
- Hence, this list is non-exhaustive, see Hladjk, in Ehmann, Selmayr, Datenschutz-Grundverordnung, Article 32 GDPR, margin number 6 (C.H. Beck 2018, 2nd Edition).
- Piltz correctly points out that the German version of the GDPR contains a translation error according to which the measures listed could be considered mandatory. The other language versions, however, clearly suggest that the measures contained in the list are merely illustrative. The English version, for example, uses the expression 'as appropriate'. Piltz, in Gola, Datenschutz-Grundverordnung, Article 32 GDPR, margin number 24 (C.H. Beck 2018, 2nd Edition).
- This list lacks legal certainty for the controller or processor. In other words, even implementing all measures stated in the list does not guarantee that the requirements outlined in Article 32(1) are considered fulfilled. This provision functions as an incomplete catalogue of measures, serving only as guidance for the controller or processor. See, Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 14 (C.H. Beck 2020, 3rd Edition).
- By encrypting data, for instance, it becomes possible to avoid the need for notifying data subjects in case of a breach of personal data protection as stated in Article 34 GDPR.
- Symmetric encryption methods, such as AES, Blowfish, IDEA, RC6, and Twofish, employ the same parameter (key) for both encryption and decryption. They are well-suited for tasks like disk or file encryption. On the other hand, asymmetric cryptographic methods, like ECC, ElGamal, and RSA, rely on a key pair consisting of a public key and a private key. The private key is known only to the authorized person. When sending a message to that person, the public key is used to encrypt it, and only the authorized person can decrypt the message using the corresponding private key. Asymmetric cryptographic methods can also be used for electronic signing, ensuring the integrity and authenticity of data. See, Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 35 (C.H. Beck 2019).
- Confidentiality, integrity, and availability are the fundamental objectives of IT security, representing the core protection goals. These goals are widely recognized and form the foundation of any Information Security Management System (ISMS) in practical implementation. The primary purpose of an ISMS is to plan, implement, monitor, and continuously enhance information security concepts, ensuring the fulfillment of these objectives. See, Hladjk, in Ehmann, Selmayr, Datenschutz-Grundverordnung, Article 32 GDPR, margin number 8 (C.H. Beck 2018, 2nd Edition).
- Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 37 (C.H. Beck 2019).
- Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 39 (C.H. Beck 2019).
- Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 40 (C.H. Beck 2019).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 25 (C.H. Beck 2020, 3rd Edition).
- NIST, Special Publication 800-39, available at https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-39.pdf (accessed on 12.6.2023).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 26 (C.H. Beck 2020, 3rd Edition).
- Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 45 (C.H. Beck 2019).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 27 (C.H. Beck 2020, 3rd Edition).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 28 (C.H. Beck 2020, 3rd Edition).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 29 (C.H. Beck 2020, 3rd Edition).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 30 (C.H. Beck 2020, 3rd Edition).
- Burton, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 32 GDPR, p. 636 (Oxford University Press 2020).
- Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 35 (C.H. Beck 2020, 3rd Edition)
- Riccio, Scorza, Belisario, GDPR e normativa privacy, p. 299 (Wolters Kluwer 2018).
- Martini, in Paal, Pauly, DS-GVO Article 32, margin number 65 (C.H.Beck 2021, 3rd Edition).
- Martini, in Paal, Pauly, DS-GVO Article 32, margin number 66 (C.H.Beck 2021, 3rd Edition).