Article 32 GDPR
Legal Text
1. Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:
- (a) the pseudonymisation and encryption of personal data;
- (b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
- (c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
- (d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.
2. In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.
3. Adherence to an approved code of conduct as referred to in Article 40 or an approved certification mechanism as referred to in Article 42 may be used as an element by which to demonstrate compliance with the requirements set out in paragraph 1 of this Article.
4. The controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller, unless he or she is required to do so by Union or Member State law.
Relevant Recitals
Commentary
Article 32 GDPR provides for an obligation to implement technical and organisational measures ("TOMs") ensuring the security of processing, which is an integral part of the right to data protection under Article 8 of the Charter of Fundamental Rights of the European Union ("CFR").[1] Therefore, this provision reflects the principle of integrity and confidentiality enshrined in Article 5(1)(f) GDPR. Any breach of the security demanded by this provision constitutes a 'personal data breach' as defined in Article 4(12) GDPR and may trigger a notification to the supervisory authority ("SA") or a communication to the data subject (see Articles 33 and 34 GDPR respectively).[2]
Since this provision deals with the implementation of appropriate TOMs, it is closely related to Articles 24 (Responsibility of the controller) and 25 (Data protection by design and by default) GDPR, which similarly demand the implementation of appropriate TOMs. Unlike these provisions, however, Article 32 GDPR addresses processors as well as controllers.
Article 32(1) GDPR obliges the controller and the processor to implement appropriate TOMs in order to realise an appropriate level of security, taking into account the state of the art, the implementation costs as well as the nature, scope, context and purposes of processing. Consideration must also be given to the processing operation’s impact on the rights and freedoms of natural persons.
Article 32(2) GDPR lists some specific risks that should be considered in the assessment of the appropriate level of security, such as risks from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data.
Article 32(3) GDPR provides for the possibility to use adherence to approved codes of conduct (Article 40 GDPR) or an approved certification mechanism (Article 42 GDPR) as an element to demonstrate compliance with the obligation to implement appropriate security measures.
Article 32(4) GDPR obliges controllers and processors to ensure that all persons under their authority with access to personal data do not process them except on instructions from the controller or when required to do so by law.
EDPB and WP29 Guidelines:
- EDPB, 'Guidelines 01/2021 on Examples regarding Personal Data Breach Notification', 14 December 2021 (Version 2.0) (available here).
- EDPB, 'Guidelines 07/2020 on the concepts of controller and processor in the GDPR', 7 July 2021 (Version 2.1) (available here).
- WP29, 'Opinion 3/2010 on the principle of accountability', 00062/10/EN WP173, 13 July 2010 (available here).
- WP29, 'Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679', 17/EN WP248 rev.01, 4 April 2017 (available here).
(1) Obligation to implement appropriate TOMs
Article 32(1) GDPR requires controllers and processors to implement TOMs that ensure an appropriate level of security. The security of personal data is a requirement and core principle (Article 5(1)(f) GDPR - integrity and confidentiality) relating to the processing of personal data. It is noteworthy that this provision not only provides for the confidentiality or secrecy of personal data, but also requires availability and integrity. Therefore, this provision is characterised by its intersection with information security.[3] In practice, this means that the implementation of appropriate TOMs should be closely coordinated with the controller's or processor's information security department.
The obligation to implement appropriate TOMs is closely connected to other provisions stipulating the implementation of TOMs (i.e. Articles 24 and 25 GDPR). However, while these provisions relate more generally to compliance with the GDPR as a whole, Article 32 GDPR is directed specifically at ensuring an appropriate level of security of personal data.[4]
While this provision obliges the controller and processor to implement appropriate TOMs, it is subject to debate whether it is possible for a data subject to voluntarily waive specific TOMs and therefore accept a higher risk.[5]
The controller and the processor
Article 32 GDPR addresses both the controller (Article 4(7) GDPR) and processor (Article 4(8) GDPR). Hence, in contrast to the general principle that the controller is responsible for GDPR compliance (e.g. for the lawfulness under Article 6(1) GDPR), this provision sets out an additional and direct responsibility of the processor for ensuring security.[6]
It is noteworthy that according to Article 28(3)(c) GDPR, the processing agreement between a controller and a processor must also provide for an obligation of the processor to take all measures required pursuant to Article 32 GDPR. Additionally, Article 28(3)(f) GDPR requires the processor to assist the controller in ensuring compliance with the obligations pursuant to Article 32 GDPR. The processor is therefore obliged by various norms to implement security measures.[7]
The provision does not, however, directly apply to software and device manufacturers (see also commentary on Article 25 GDPR). Nonetheless, Article 32 GDPR holds immense practical significance when it comes to the controller or processor's selection of software and hardware, which must meet the required security standard. In situations of uncertainty, products that do not enable data protection-compliant usage should not be employed. Software and device manufacturers are therefore indirectly affected by this provision.[8]
Taking into account
Article 32(1) GDPR outlines the factors that the controller and processor must consider when conducting the assessment on which measures are "appropriate" to contain the risks. These factors are the state of the art, the implementation costs, the nature, scope, context, and purposes of the processing, as well as the varying likelihood and severity of risks to the rights and freedoms of individuals.
It should be noted that these factors include the state of the art and the cost of implementation which are not specifically mentioned in the more general obligation of the controller to implement appropriate TOMs under Article 24 GDPR.
State of the art
Article 32(1) GDPR makes use of the same concept - "state of the art" - that is already used in Article 25(1) GDPR (notably, not in Article 24(1) GDPR). Therefore, see the commentary on Article 25(1) GDPR for more information.
Costs of implementation
Another criterion for selecting appropriate TOMs is the cost of implementation. This factor is likewise part of the parallel assessment under Article 25(1) GDPR (but not under Article 24(1) GDPR). Therefore, see the commentary on Article 25(1) GDPR for more information.
Nature, scope, context and purpose of processing
The controller also has to take into account the nature, scope, context and purposes of the processing. For these terms, see the commentary on Article 24 GDPR, which also includes these factors as necessary considerations.
Risks of varying likelihood and severity for rights and freedoms of natural persons
Finally, controllers and processors have to consider the risk of varying likelihood and severity for the rights and freedoms of natural persons when deciding on appropriate TOMs. In other words, they should complete a risk assessment before carrying out a processing operation. Since such an assessment is also necessary under other provisions, it is recommended to perform one holistic risk assessment covering all the requirements. See the commentary on Article 24 GDPR for more information.
It should be noted, however, that Article 32(2) GDPR requires specific attention to certain categories of risk, such as the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed. The risk should always be considered together with the potential harm (i.e. damage); it could therefore be argued that a potential risk of, for example, "destruction" will not be (very) relevant if it causes no tangible harm to a data subject.
For further elements, see commentary under Article 32(2) GDPR below.
Shall implement appropriate TOMs to ensure a level of security appropriate to the risk
Appropriate
The word 'appropriate' appears no fewer than three times in Article 32(1): "the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate". This indicates that controllers and processors must identify the situation-specific risks, assess their potential impact having regard to the particular circumstances of the processing, and implement measures to mitigate (at least) those risks which are the most likely to materialise and those whose impact would be the most severe.[9] In other words, appropriateness refers to the aim of ensuring the security of the personal data by reducing the risks connected to the processing activity.[10]
While the level of security must not fall below what is appropriate, it is perfectly permissible to implement measures that go further than what would normally be considered appropriate. It should be noted, however, that complete security will in most cases be impossible to achieve, and it would in any case be disproportionate to demand such a level of security.[11]
The reference to appropriateness follows from the principle of proportionality under Article 52(1) CFR.[12] See the commentary on Article 24(1) GDPR for more information.
Technical and organisational measures
Article 32 GDPR is one of various provisions in the GDPR (e.g. Articles 24 and 25 GDPR) that obliges controllers (and in this case also processors) to implement technical and organisational measures. In the implementation of these measures, these provisions should be read and applied together.
For more information on TOMs, see the commentary on Article 24(1) GDPR.
Article 32 GDPR provides a list of examples of TOMs which are supposed to increase the level of security of the processing of personal data (see below for more information on these examples). However, an appropriate level of security will never be achieved by just one isolated security measure; rather, the combination of all TOMs should ensure an appropriate level of security.[13]
Since the security of processing has to be ensured at all times when personal data is processed, the respective TOMs should be chosen and implemented before the processing takes place.[14] The controller and the processor are required to implement suitable TOMs within their respective domains. They must document the rationale behind their selection of these measures and how they evaluated the mentioned criteria.[15]
Level of security appropriate to the risk
The aim of the implementation of TOMs is to achieve a level of security which is appropriate to the risk. In other words, the TOMs under this provision should improve and ensure the security of the personal data processed by the controller or processor.
As mentioned above, the appropriate level of security demanded by this provision stems from the general principle of 'integrity and confidentiality' under Article 5(1)(f) GDPR. It should be noted that this principle must be interpreted in the context of the other principles in order to strike a balance between these goals; for example, redundant storage as a security measure needs to be balanced against the data minimisation principle.[16]
Including inter alia as appropriate
Article 32(1) GDPR enumerates four examples of technical security measures[17] that controllers and processors should implement "as appropriate". These include pseudonymisation and encryption of personal data; the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems; the ability to restore the availability and access to personal data in a timely manner in case of an incident; and a process for regularly testing, assessing and evaluating the effectiveness of security measures.
As this list is purely exemplary, it does not provide for any certainty on the side of the processor or controller. Even implementing all measures stated in the list does not guarantee that the requirements outlined in Article 32(1) GDPR are fulfilled. The provision functions as an incomplete catalogue of measures, serving only as guidance for the controller or processor.[18]
(a) Pseudonymisation and encryption
According to Article 4(5) GDPR, "pseudonymisation" means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person. See commentary on Article 4(5) GDPR for more information on pseudonymisation.
For example: A controller's marketing department analyses the efficiency of different advertisements sent out to different customer segments. For this purpose, the department receives the new sales numbers following the ad campaign. In order to reduce the risk for data subjects, however, the department receives only pseudonymised numbers without any means to identify specific data subjects. Only with additional information would it be possible to attribute the numbers to individual data subjects.
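The following is a minimal Python sketch of how such a dataset could be pseudonymised before it is handed to the marketing department. The record fields, identifiers and helper function are purely hypothetical illustrations and are not prescribed by the GDPR:

```python
import secrets

# Hypothetical sales records keyed by customer ID (illustrative data only).
sales = [
    {"customer_id": "C-1001", "segment": "A", "purchases_after_campaign": 3},
    {"customer_id": "C-1002", "segment": "B", "purchases_after_campaign": 1},
]

def pseudonymise(records, id_field="customer_id"):
    """Replace direct identifiers with random pseudonyms.

    Returns the pseudonymised records plus the mapping table, which must be
    stored separately and protected by its own technical and organisational
    measures (cf. Article 4(5) GDPR).
    """
    mapping = {}
    out = []
    for record in records:
        original_id = record[id_field]
        if original_id not in mapping:
            # Random, non-derivable pseudonym for each customer.
            mapping[original_id] = secrets.token_hex(8)
        pseudonymised = dict(record)
        pseudonymised[id_field] = mapping[original_id]
        out.append(pseudonymised)
    return out, mapping

analysis_dataset, reidentification_table = pseudonymise(sales)
# The marketing department receives only `analysis_dataset`;
# `reidentification_table` stays with a separate, access-controlled function.
```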
Although the GDPR does not explicitly define the concept of encryption, it is referred to in Article 34(3)(a) GDPR, according to which encryption is an effective method for rendering personal data inaccessible to unauthorised individuals.[19] The cryptographic techniques employed for encryption, in accordance with the current state of the art, play a vital role in ensuring information security in line with the provisions of Article 32 GDPR.
Symmetric encryption methods, such as AES, Blowfish, IDEA, RC6, and Twofish, employ the same parameter (key) for both encryption and decryption. They are well-suited for tasks like disk or file encryption. On the other hand, asymmetric cryptographic methods, like ECC, ElGamal, and RSA, rely on a key pair consisting of a public key and a private key. The private key is known only to the authorised person. When sending a message to that person, the public key is used to encrypt it, and only the authorised person can decrypt the message using the corresponding private key. Asymmetric cryptographic methods can also be used for electronic signing, ensuring the integrity and authenticity of data.[20]
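As an illustration of the symmetric pattern described above, the following minimal sketch uses the third-party Python cryptography package, whose Fernet primitive combines AES-based encryption with an integrity check. It assumes the package is installed and leaves aside key management, which is the decisive issue in practice:

```python
# Minimal sketch of symmetric encryption with the `cryptography` package.
# Generating, storing and rotating the key separately from the ciphertext
# is what matters in practice; this only shows the basic mechanics.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # must be stored securely, e.g. in a key vault
cipher = Fernet(key)

plaintext = b"personal data: jane.doe@example.com"
ciphertext = cipher.encrypt(plaintext)   # safe to store or transmit
restored = cipher.decrypt(ciphertext)    # only possible with the key

assert restored == plaintext
```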
(b) Ongoing confidentiality, integrity, availability and resilience of processing systems and services
Controllers and processors must implement measures which are able to ensure ongoing confidentiality, integrity, availability,[21] and resilience of the processing systems and services used. The objective is to continuously maintain security requirements over the entire duration of processing ("ongoing..."). This ensures that security measures are consistently applied in a dynamic and ongoing manner.
The term "systems" should be interpreted broadly, encompassing not only technical systems for automated processing but also systems supporting paper-based processing. Similarly, the concept of "services" falls under the same understanding. The individuals involved in handling these systems and services play a role in the processing of personal data. Regular training to maintain an ongoing awareness of risks and the importance of adhering to established rules for data protection are also essential and must not be bypassed.[22]
"Confidentiality" is not only a technical notion but a GDPR principle established in Article 5(1)(f) GDPR. It entails safeguarding information from unauthorized disclosure, allowing access to confidential data and information solely by authorized individuals through approved means. Measures implemented to ensure or facilitate confidentiality include data encryption, access control systems, permissions, and authentication based on one or multiple factors.[23]
"Integrity" is also referred to as a principle in Article 5(1)(f) GDPR. Various references to integrity can be found in different contexts, such as the completeness or authenticity of data (Recital 49) and the right to rectification (Article 16 GDPR). Integrity encompasses both the integrity of data and the proper functioning of systems. Unauthorised modifications constitute a breach of information integrity, impacting not only the actual content but also meta-data such as the author, sender, and creation timestamp. To protect integrity, measures such as electronic signatures or check digits, input controls (logging) to track who accessed, modified, or deleted personal data and when, as well as authorization systems for managing access, are employed.[24] One big risk factor comes from computer viruses which could be reduced by the implementation of firewalls, antivirus software or email filters.[25]
The term "availability" refers to the likelihood that a technical system will fulfill specific requests within an agreed-upon time-frame. Systems and services are considered available if they can be used at any given time. When systems and services are unavailable, controllers and processors are unable to maintain control over the personal data processed through them. Impairments to availability can range from minor delays in usability to complete system failures that last for an extended period. Various factors can contribute to such impairments, including defective hardware, faulty software, power supply issues, or disrupted network connections. To ensure availability, diverse security measures may need to be implemented, such as providing complete replacement systems or individual backup systems, emergency power supplies, and conducting regular functional tests.[26]
Companies will also have to ensure the "resilience" of the systems and services related to processing. The GDPR does not describe which measures contribute positively to resilience. In IT, resilience is the ability of an information system to continue to: (i) operate under adverse conditions or stress, even if in a degraded or debilitated state, while maintaining essential operational capabilities; and (ii) recover to an effective operational posture in a time frame consistent with mission needs.[27] In other words, it is about the tolerance and compensatory ability of a system against disturbances.
For example: Suppose there is a network consisting of multiple nodes. If one of the network nodes is targeted and disrupted by an attack, the system demonstrates resilience if the overall functionality of the network remains unaffected because another network node seamlessly takes over its function. In this scenario, the system's resilience is evaluated based on its ability to withstand a certain number of attacks without experiencing significant functional limitations. This measurement, known as "k-resilience," quantifies the system's capability to handle multiple attacks while maintaining its operational integrity.[28]
Resilience measures involve minimising attack possibilities, for example by restricting systems to essential functionality and promptly installing security updates. Additionally, detecting attacks or disruptions and responding appropriately is crucial. Training employees to identify issues and learn from mistakes is also important. Processing should be designed to minimise damage during incidents, for instance by transitioning to a fail-safe mode or automatically disconnecting from the Internet in given cases. Creating opportunities for intervention, including human intervention, can further enhance resilience.[29]
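By way of illustration of the redundancy idea behind the "k-resilience" example above, the following Python sketch shows a client falling back to a replica node when the primary node is disrupted. The node names and the fetch function are hypothetical placeholders:

```python
# Minimal sketch of failover between redundant nodes: the service stays
# available as long as at least one node still answers.
from typing import Callable, Sequence

def resilient_fetch(nodes: Sequence[str], fetch: Callable[[str], bytes]) -> bytes:
    """Try each redundant node in turn and return the first successful result."""
    last_error: Exception | None = None
    for node in nodes:
        try:
            return fetch(node)
        except Exception as error:          # node disrupted -> fail over
            last_error = error
    raise RuntimeError("all redundant nodes unavailable") from last_error

# Usage sketch with a placeholder fetch function that simulates one failed node:
def fake_fetch(node: str) -> bytes:
    if node == "node-a.internal":
        raise ConnectionError("node under attack")
    return b"payload from " + node.encode()

print(resilient_fetch(["node-a.internal", "node-b.internal"], fake_fetch))
```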
(c) Ability to restore availability and access to personal data in a timely manner
Article 32(1)(c) GDPR requires that systems and services be capable of restoring the availability and access to personal data in the event of a physical or technical incident. Unlike in the preceding provision (Article 32(1)(b) GDPR), in this case the incident has occurred (and it must have a technical or physical nature). For example, one can consider scenarios such as physical destruction or loss of storage devices, unauthorised data encryption resulting from a ransomware attack (technical incident), or the destruction of system components due to fire, flooding, or devastation (physical incident). In such cases, controllers and processors must be able to restore availability and access to personal data.[30]
In practice, these security measures have to be implemented before the incident occurs in order to be effective. Examples of measures include regular backups or even the necessity to store data redundantly in different locations.[31]
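The following minimal Python sketch illustrates such a preparatory measure, a routine backup combined with a restore verification. The paths are hypothetical, and a real setup would add encryption, off-site copies and retention management:

```python
# Minimal sketch of a routine backup with a restore check, showing that the
# measures under Article 32(1)(c) must be in place before any incident.
import hashlib
import shutil
from pathlib import Path

def backup_and_verify(source: Path, backup_dir: Path) -> Path:
    backup_dir.mkdir(parents=True, exist_ok=True)
    copy = backup_dir / source.name
    shutil.copy2(source, copy)                      # create the backup copy
    original = hashlib.sha256(source.read_bytes()).hexdigest()
    restored = hashlib.sha256(copy.read_bytes()).hexdigest()
    if original != restored:                        # verify it is actually restorable
        raise RuntimeError("backup verification failed")
    return copy

# Usage sketch with hypothetical paths, e.g. run from a scheduled job:
# backup_and_verify(Path("/srv/data/customers.db"), Path("/mnt/backup/daily"))
```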
The expression "in a timely manner" does not establish a specific deadline, as the timeline depends on the factual circumstances and the seriousness of the issues the controller is dealing with. However, this wording must be understood as indicating that the controller shall restore availability and access as soon as possible.[32]
(d) Regularly testing, assessing and evaluating
Article 32(1)(d) GDPR mentions, as an example of a security measure, the implementation of a process for regularly testing, assessing and evaluating the effectiveness of TOMs for ensuring the security of the processing. This example is related to the accountability principle stipulated in Article 5(2) GDPR.[33]
As the wording of this provision suggests, these are not measures that directly protect data and system security. They are instead "meta-measures" aimed at guaranteeing that security measures remain effective from a dynamic perspective, namely over time. A typical example is a test in which IT experts simulate an external attack in order to evaluate the response of pre-existing security mechanisms.[34] The GDPR does not specify a time framework for such testing operations. However, the frequency of checks mainly depends on the type of risks that a given system must face and the ongoing technological developments in the field concerned.[35]
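By way of illustration, the following Python sketch shows one very small, automatable recurring check, verifying that a set of (hypothetical) endpoints still accept verified TLS connections. It is only one building block of a broader testing process, which would also include measures such as the simulated attack described above:

```python
# Minimal sketch of a recurring automated check: can a verified TLS connection
# still be established to each endpoint? Hostnames are hypothetical.
import socket
import ssl

def check_tls(host: str, port: int = 443) -> bool:
    """Return True if a certificate-verified TLS connection succeeds."""
    context = ssl.create_default_context()   # verifies certificate chain and hostname
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

# Usage sketch, e.g. run from a scheduled job and fed into a report:
for endpoint in ["app.example.internal", "api.example.internal"]:
    print(endpoint, "OK" if check_tls(endpoint) else "FAILED")
```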
(2) Certain risks must always be taken into account
Article 32(2) GDPR stipulates that the assessment of the appropriate level of security, as described in Article 32(1) GDPR, has to take into account the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to the processed personal data. The second paragraph therefore builds on Article 32(1) GDPR in the sense that it provides examples of the risks that are most likely to occur and/or the most severe. Notably, the risks mentioned here are those described in the definition of a personal data breach under Article 4(12) GDPR.[36] They are, furthermore, somewhat similar to the criteria that need to be considered when a Data Protection Impact Assessment ("DPIA") is carried out in accordance with Article 35 GDPR.[37]
Accidental or unlawful
The risks refer to failures that occur either accidentally or unlawfully. See the commentary on Article 4(12) GDPR for more information.
For example: The purposeful deletion of personal data when it is no longer necessary for the processing is not a risk that has to be considered under this provision.
Destruction, loss, alteration, unauthorised disclosure, or access
The specific risks mentioned in this provision are identical to those listed in the definition of a personal data breach under Article 4(12) GDPR. Therefore, see commentary on Article 4(12) GDPR for more information.
(3) Codes of conduct and certification mechanisms
Article 32(3) GDPR stipulates that adherence to codes of conduct or certification mechanisms (Articles 40 and 42 GDPR respectively) can be used as an element to demonstrate compliance with the controller's or processor's obligation to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk. This provision is supposed to support compliance with the accountability principle established in Article 5(2) GDPR and Article 24(1) GDPR.
It should be noted that neither of these options reduces the controller's or processor's responsibility. However, if asked to prove that they took all possible measures to avoid a violation, adherence to such codes or mechanisms would undoubtedly play an important role.[38]
This provision is essentially identical to Article 24(3) GDPR; see the commentary on that provision for more information. See also the commentary on Article 42(4) GDPR.
(4) Natural persons acting under the authority of the controller or the processor
Most processing operations involve the use of human resources (i.e. the involvement of a natural person in the processing activity). Whenever a person is authorised to access personal data, a data security issue arises. For these reasons, Article 32(4) GDPR requires the controller and processor to ensure that such persons act solely and exclusively on the instructions of the controller,[39] with the only exception of those individual operations that are imposed by EU or national law. The verb 'ensure' indicates that controllers and processors must provide some form of guarantee with respect to the result.
The provision also speaks of "any natural person [...] who has access to personal data". This includes employees, freelancers, interns, external consultants or employees of service companies, insofar as they have access to personal data (see commentary on Article 29 GDPR). In contrast, other third parties, including visitors or customers who access the data unlawfully, are not included in the definition, as they do not act "under the authority of the controller or processor".[40]
To comply with this obligation, controllers and processors should establish clear rules of conduct, internal instructions and sanctioning procedures. Technical measures that prevent unauthorised access are also essential in certain cases. Controllers and processors must also regularly check whether these measures are effective and actually followed.[41]
For example: Potential measures include the selection of trustworthy employees who will reliably follow the controller's or processor's instructions, as well as the implementation of clear internal procedures and instructions and regular training for employees.[42]
Decisions
→ You can find all related decisions in Category:Article 32 GDPR
References
- ↑ The CJEU has consequently recognised data security as an integral part of the right to data protection in Article 8 of the Charter of Fundamental Rights of the European Union. See CJEU, Joined Cases C‑293/12 and C‑594/12, Digital Rights Ireland Ltd, 8 April 2014, margin number 29 (available here).
- ↑ Compare Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 1 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 1 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 2 (C.H. Beck 2024, 4th Edition).
- ↑ Regarding this debate see Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin numbers 25 et seq. (NOMOS 2025, 2nd Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 4 (C.H. Beck 2024, 4th Edition).
- ↑ Compare also Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 4 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 4 (C.H. Beck 2024, 4th Edition).
- ↑ Burton, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 32 GDPR, p. 635 (Oxford University Press 2020).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 5 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 8 (C.H. Beck 2024, 4th Edition).
- ↑ Burton, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 32 GDPR, p. 635 (Oxford University Press 2020).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 5 (C.H. Beck 2024, 4th Edition); Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 20 (NOMOS 2025, 2nd Edition).
- ↑ Compare with Article 25 GDPR, where such an obligation is explicitly provided; see also Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 6 (C.H. Beck 2024, 4th Edition).
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 23 (NOMOS 2025, 2nd Edition).
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin numbers 11 et seqq. (NOMOS 2025, 2nd Edition).
- ↑ Hence, this list is non-exhaustive, see Hladjk, in Ehmann, Selmayr, Datenschutz-Grundverordnung, Article 32 GDPR, margin number 8 (C.H. Beck 2024, 3rd Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 14 (C.H. Beck 2024, 4th Edition).
- ↑ By encrypting data, for instance, it becomes possible to avoid the need for notifying data subjects in case of a breach of personal data protection as stated in Article 34 GDPR.
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 41 (NOMOS 2025, 2nd Edition).
- ↑ Confidentiality, integrity, and availability are the fundamental objectives of IT security, representing the core protection goals. These goals are widely recognised and form the foundation of any Information Security Management System (ISMS) in practical implementation. The primary purpose of an ISMS is to plan, implement, monitor, and continuously enhance information security concepts, ensuring the fulfilment of these objectives. See Hladjk, in Ehmann, Selmayr, Datenschutz-Grundverordnung, Article 32 GDPR, margin number 11 (C.H. Beck 2024, 3rd Edition).
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 43 (NOMOS 2025, 2nd Edition).
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 45 (NOMOS 2025, 2nd Edition).
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 46 (NOMOS 2025, 2nd Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 24 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 25 (C.H. Beck 2024, 4th Edition); Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin number 47 (NOMOS 2025, 2nd Edition).
- ↑ https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-39.pdf (accessed on 12.6.2023).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 26 (C.H. Beck 2020, 3rd Edition).
- ↑ Hansen, in Simitis, Hornung, Spiecker gen. Döhmann, Datenschutzrecht, Article 32 GDPR, margin numbers 48 et seqq. (NOMOS 2025, 2nd Edition).
- ↑ Compare Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 27 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 27 (C.H. Beck 2020, 3rd Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 28 (C.H. Beck 2024, 4th Edition).
- ↑ Compare Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 15 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 29 et seq. (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 30 (C.H. Beck 2024, 4th Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 31 (C.H. Beck 2024, 4th Edition).
- ↑ Burton, in Kuner, Bygrave, Docksey, The EU General Data Protection Regulation (GDPR): A Commentary, Article 32 GDPR, p. 636 (Oxford University Press 2020).
- ↑ Riccio, Scorza, Belisario, GDPR e normativa privacy, p. 299 (Wolters Kluwer 2018).
- ↑ For more information regarding the processing under the authority of the controller or processor see commentary on Article 29 GDPR.
- ↑ Martini, in Paal, Pauly, DS-GVO, Article 32 GDPR, margin number 65 (C.H. Beck 2021, 3rd Edition).
- ↑ Martini, in Paal, Pauly, DS-GVO, Article 32 GDPR, margin number 66 (C.H. Beck 2021, 3rd Edition).
- ↑ Jandt, in Kühling, Buchner, DS-GVO BDSG, Article 32 GDPR, margin number 38 (C.H. Beck 2024, 4th Edition).