<div>{| class="wikitable" style="width: 25%; margin-left: 10px; float:right;"<br />
![[Article 21 GDPR|←]] Article 22 - Automated individual decision-making, including profiling [[Article 23 GDPR|→]]<br />
|-<br />
| style="padding: 20px; background-color:#003399;" |[[File:Gdpricon.png|100px|center|link=Overview_of_GDPR]]<br />
|-<br />
|<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 1: General provisions</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 1 GDPR|Article 1: Subject-matter and objectives]]<br /><br />
[[Article 2 GDPR|Article 2: Material scope]]<br /><br />
[[Article 3 GDPR|Article 3: Territorial scope]]<br /><br />
[[Article 4 GDPR|Article 4: Definitions]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 2: Principles</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 5 GDPR|Article 5: Principles relating to processing of personal data]]<br /><br />
[[Article 6 GDPR|Article 6: Lawfulness of processing]]<br /><br />
[[Article 7 GDPR|Article 7: Conditions for consent]]<br /><br />
[[Article 8 GDPR|Article 8: Conditions applicable to child’s consent in relation to information society services]]<br /><br />
[[Article 9 GDPR|Article 9: Processing of special categories of personal data]]<br /><br />
[[Article 10 GDPR|Article 10: Processing of personal data relating to criminal convictions and offences]]<br /><br />
[[Article 11 GDPR|Article 11: Processing which does not require identification]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 3: Rights of the data subject</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 12 GDPR|Article 12: Transparent information, communication and modalities for the exercise of the rights of the data subject]]<br /><br />
[[Article 13 GDPR|Article 13: Information to be provided where personal data are collected from the data subject]]<br /><br />
[[Article 14 GDPR|Article 14: Information to be provided where personal data have not been obtained from the data subject]]<br /><br />
[[Article 15 GDPR|Article 15: Right of access by the data subject]]<br /><br />
[[Article 16 GDPR|Article 16: Right to rectification]]<br /><br />
[[Article 17 GDPR|Article 17: Right to erasure (‘right to be forgotten’)]]<br /><br />
[[Article 18 GDPR|Article 18: Right to restriction of processing]]<br /><br />
[[Article 19 GDPR|Article 19: Notification obligation regarding rectification or erasure of personal data or restriction of processing]]<br /><br />
[[Article 20 GDPR|Article 20: Right to data portability]]<br /><br />
[[Article 21 GDPR|Article 21: Right to object]]<br /><br />
[[Article 22 GDPR|Article 22: Automated individual decision-making, including profiling]]<br /><br />
[[Article 23 GDPR|Article 23: Restrictions]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 4: Controller and processor</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 24 GDPR|Article 24: Responsibility of the controller]]<br /><br />
[[Article 25 GDPR|Article 25: Data protection by design and by default]]<br /><br />
[[Article 26 GDPR|Article 26: Joint controllers]]<br /><br />
[[Article 27 GDPR|Article 27: Representatives of controllers or processors not established in the Union]]<br /><br />
[[Article 28 GDPR|Article 28: Processor]]<br /><br />
[[Article 29 GDPR|Article 29: Processing under the authority of the controller or processor]]<br /><br />
[[Article 30 GDPR|Article 30: Records of processing activities]]<br /><br />
[[Article 31 GDPR|Article 31: Cooperation with the supervisory authority]]<br /><br />
[[Article 32 GDPR|Article 32: Security of processing]]<br /><br />
[[Article 33 GDPR|Article 33: Notification of a personal data breach to the supervisory authority]]<br /><br />
[[Article 34 GDPR|Article 34: Communication of a personal data breach to the data subject]]<br /><br />
[[Article 35 GDPR|Article 35: Data protection impact assessment]]<br /><br />
[[Article 36 GDPR|Article 36: Prior consultation]]<br /><br />
[[Article 37 GDPR|Article 37: Designation of the data protection officer]]<br /><br />
[[Article 38 GDPR|Article 38: Position of the data protection officer]]<br /><br />
[[Article 39 GDPR|Article 39: Tasks of the data protection officer]]<br /><br />
[[Article 40 GDPR|Article 40: Codes of conduct]]<br /><br />
[[Article 41 GDPR|Article 41: Monitoring of approved codes of conduct]]<br /><br />
[[Article 42 GDPR|Article 42: Certification]]<br /><br />
[[Article 43 GDPR|Article 43: Certification bodies]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 5: Transfers of personal data</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 44 GDPR|Article 44: General principle for transfers]]<br /><br />
[[Article 45 GDPR|Article 45: Transfers on the basis of an adequacy decision]]<br /><br />
[[Article 46 GDPR|Article 46: Transfers subject to appropriate safeguards]]<br /><br />
[[Article 47 GDPR|Article 47: Binding corporate rules]]<br /><br />
[[Article 48 GDPR|Article 48: Transfers or disclosures not authorised by Union law]]<br /><br />
[[Article 49 GDPR|Article 49: Derogations for specific situations]]<br /><br />
[[Article 50 GDPR|Article 50: International cooperation for the protection of personal data]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 6: Supervisory authorities</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 51 GDPR|Article 51: Supervisory authority]]<br /><br />
[[Article 52 GDPR|Article 52: Independence]]<br /><br />
[[Article 53 GDPR|Article 53: General conditions for the members of the supervisory authority]]<br /><br />
[[Article 54 GDPR|Article 54: Rules on the establishment of the supervisory authority]]<br /><br />
[[Article 55 GDPR|Article 55: Competence]]<br /><br />
[[Article 56 GDPR|Article 56: Competence of the lead supervisory authority]]<br /><br />
[[Article 57 GDPR|Article 57: Tasks]]<br /><br />
[[Article 58 GDPR|Article 58: Powers]]<br /><br />
[[Article 59 GDPR|Article 59: Activity reports]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 7: Cooperation and consistency</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 60 GDPR|Article 60: Cooperation between the lead supervisory authority and the other supervisory authorities concerned]]<br /><br />
[[Article 61 GDPR|Article 61: Mutual assistance]]<br /><br />
[[Article 62 GDPR|Article 62: Joint operations of supervisory authorities]]<br /><br />
[[Article 63 GDPR|Article 63: Consistency mechanism]]<br /><br />
[[Article 64 GDPR|Article 64: Opinion of the Board]]<br /><br />
[[Article 65 GDPR|Article 65: Dispute resolution by the Board]]<br /><br />
[[Article 66 GDPR|Article 66: Urgency procedure]]<br /><br />
[[Article 67 GDPR|Article 67: Exchange of information]]<br /><br />
[[Article 68 GDPR|Article 68: European Data Protection Board]]<br /><br />
[[Article 69 GDPR|Article 69: Independence]]<br /><br />
[[Article 70 GDPR|Article 70: Tasks of the Board]]<br /><br />
[[Article 71 GDPR|Article 71: Reports]]<br /><br />
[[Article 72 GDPR|Article 72: Procedure]]<br /><br />
[[Article 73 GDPR|Article 73: Chair]]<br /><br />
[[Article 74 GDPR|Article 74: Tasks of the Chair]]<br /><br />
[[Article 75 GDPR|Article 75: Secretariat]]<br /><br />
[[Article 76 GDPR|Article 76: Confidentiality]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 8: Remedies, liability and penalties</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 77 GDPR|Article 77: Right to lodge a complaint with a supervisory authority]]<br /><br />
[[Article 78 GDPR|Article 78: Right to an effective judicial remedy against a supervisory authority]]<br /><br />
[[Article 79 GDPR|Article 79: Right to an effective judicial remedy against a controller or processor]]<br /><br />
[[Article 80 GDPR|Article 80: Representation of data subjects]]<br /><br />
[[Article 81 GDPR|Article 81: Suspension of proceedings]]<br /><br />
[[Article 82 GDPR|Article 82: Right to compensation and liability]]<br /><br />
[[Article 83 GDPR|Article 83: General conditions for imposing administrative fines]]<br /><br />
[[Article 84 GDPR|Article 84: Penalties]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 9: Specific processing situations</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 85 GDPR|Article 85: Processing and freedom of expression and information]]<br /><br />
[[Article 86 GDPR|Article 86: Processing and public access to official documents]]<br /><br />
[[Article 87 GDPR|Article 87: Processing of the national identification number]]<br /><br />
[[Article 88 GDPR|Article 88: Processing in the context of employment]]<br /><br />
[[Article 89 GDPR|Article 89: Safeguards and derogations relating to processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes]]<br /><br />
[[Article 90 GDPR|Article 90: Obligations of secrecy]]<br /><br />
[[Article 91 GDPR|Article 91: Existing data protection rules of churches and religious associations]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 10: Delegated and implementing acts</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 92 GDPR|Article 92: Exercise of the delegation]]<br /><br />
[[Article 93 GDPR|Article 93: Committee procedure]]<br /><br />
</small><br />
</div></div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><br />
<div style="font-weight:bold;line-height:1.6;">Chapter 11: Final provisions</div><br />
<div class="mw-collapsible-content"><br />
<small><br />
[[Article 94 GDPR|Article 94: Repeal of Directive 95/46/EC]]<br /><br />
[[Article 95 GDPR|Article 95: Relationship with Directive 2002/58/EC]]<br /><br />
[[Article 96 GDPR|Article 96: Relationship with previously concluded Agreements]]<br /><br />
[[Article 97 GDPR|Article 97: Commission reports]]<br /><br />
[[Article 98 GDPR|Article 98: Review of other Union legal acts on data protection]]<br /><br />
[[Article 99 GDPR|Article 99: Entry into force and application]]<br /><br />
</small><br />
</div><br />
</div><br />
|}<br />
<br />
==Legal Text==<br />
'''Article 22 - Automated individual decision-making, including profiling'''<br />
<br />
<span id="1">1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.</span><br />
<br />
<span id="2">2. Paragraph 1 shall not apply if the decision:</span><br />
<br />
<span id="2a">(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;</span><br />
<br />
<span id="2b">(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or</span><br />
<br />
<span id="2c">(c) is based on the data subject's explicit consent.</span><br />
<br />
<span id="3">3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.</span><br />
<br />
<span id="4">4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.</span><br />
<br />
==Relevant Recitals==<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 4:''' The processing of personal data should be designed to serve mankind </div><br />
<div class="mw-collapsible-content"><br />
The processing of personal data should be designed to serve mankind. The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality. This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter as enshrined in the Treaties, in particular the respect for private and family life, home and communications, the protection of personal data, freedom of thought, conscience and religion, freedom of expression and information, freedom to conduct a business, the right to an effective remedy and to a fair trial, and cultural, religious and linguistic diversity.<br />
</div></div><span id="r24"></span><div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 24:''' Monitoring of data subject's behaviour </div><br />
<div class="mw-collapsible-content"><br />
The processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union should also be subject to this Regulation when it is related to the monitoring of the behaviour of such data subjects in so far as their behaviour takes place within the Union. In order to determine whether a processing activity can be considered to monitor the behaviour of data subjects, it should be ascertained whether natural persons are tracked on the internet including potential subsequent use of personal data processing techniques which consist of profiling a natural person, particularly in order to take decisions concerning her or him or for analysing or predicting her or his personal preferences, behaviours and attitudes.<br />
</div></div><br />
<br />
<span id="r38"></span><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 38:''' Specific protection of children </div><br />
<div class="mw-collapsible-content"><br />
Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.<br />
</div></div><br />
<br />
<span id="r60"></span><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 60:''' Informing data subjects about profiling </div><br />
<div class="mw-collapsible-content"><br />
The principles of fair and transparent processing require that the data subject be informed of the existence of the processing operation and its purposes. The controller should provide the data subject with any further information necessary to ensure fair and transparent processing taking into account the specific circumstances and context in which the personal data are processed. Furthermore, the data subject should be informed of the existence of profiling and the consequences of such profiling. Where the personal data are collected from the data subject, the data subject should also be informed whether he or she is obliged to provide the personal data and of the consequences, where he or she does not provide such data. That information may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable.<br />
</div></div><br />
<br />
<span id="r63"></span><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 63:''' Access to meaningful information </div><br />
<div class="mw-collapsible-content"><br />
A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing. This includes the right for data subjects to have access to data concerning their health, for example the data in their medical records containing information such as diagnoses, examination results, assessments by treating physicians and any treatment or interventions provided. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing and, at least when based on profiling, the consequences of such processing. Where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data. That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software. However, the result of those considerations should not be a refusal to provide all information to the data subject. Where the controller processes a large quantity of information concerning the data subject, the controller should be able to request that, before the information is delivered, the data subject specify the information or processing activities to which the request relates.<br />
</div></div><br />
<br />
<span id="r71"></span><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 71:''' Automated processing/decision-making, including profiling </div><br />
<div class="mw-collapsible-content"><br />
The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child.<br />
In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or that result in measures having such an effect. Automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.<br />
</div></div><br />
<br />
<span id="r72"></span><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 72:''' EDPB Guidance on profiling </div><br />
<div class="mw-collapsible-content"><br />
Profiling is subject to the rules of this Regulation governing the processing of personal data, such as the legal grounds for processing or data protection principles. The European Data Protection Board established by this Regulation (the ‘Board’) should be able to issue guidance in that context.<br />
</div></div><br />
<br />
<span id="r91"></span><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px; overflow:auto;"><div>'''Recital 91:''' DPIA necessary in case of systematic and extensive evaluation of personal aspects </div><br />
<div class="mw-collapsible-content"><br />
This should in particular apply to large-scale processing operations which aim to process a considerable amount of personal data at regional, national or supranational level and which could affect a large number of data subjects and which are likely to result in a high risk, for example, on account of their sensitivity, where in accordance with the achieved state of technological knowledge a new technology is used on a large scale as well as to other processing operations which result in a high risk to the rights and freedoms of data subjects, in particular where those operations render it more difficult for data subjects to exercise their rights. A data protection impact assessment should also be made where personal data are processed for taking decisions regarding specific natural persons following any systematic and extensive evaluation of personal aspects relating to natural persons based on profiling those data or following the processing of special categories of personal data, biometric data, or data on criminal convictions and offences or related security measures. A data protection impact assessment is equally required for monitoring publicly accessible areas on a large scale, especially when using optic-electronic devices or for any other operations where the competent supervisory authority considers that the processing is likely to result in a high risk to the rights and freedoms of data subjects, in particular because they prevent data subjects from exercising a right or using a service or a contract, or because they are carried out systematically on a large scale. The processing of personal data should not be considered to be on a large scale if the processing concerns personal data from patients or clients by an individual physician, other health care professional or lawyer. In such cases, a data protection impact assessment should not be mandatory.<br />
</div></div><br />
<br />
==Commentary==<br />
<br />
===Overview===<br />
Article 22 has its roots in Articles 15 and 12(a) of the Data Protection Directive 95/46/EC (DPD). One of the main differences is that the GDPR provision has a broader scope of application, since it applies to “automated processing, including profiling”, whereas the DPD provision applied only where a form of profiling was involved.<ref>Article 20 of the GDPR proposal, COM(2012) 11 final <https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0011:FIN:EN:PDF></ref> A similar difference can be noticed between the initial proposal of the Commission, in which the provision was titled “Measures based on profiling”, and the final wording of the GDPR, which extends the scope so as to include automated decisions that are not based on profiling. Furthermore, in contrast to the DPD, Article 22(4) GDPR also explicitly addresses the use of sensitive data by laying down a qualified prohibition of decisions based on the special categories of data listed in Article 9(1).<br />
<br />
===Right or prohibition?===<br />
<br />
There have been conflicting arguments on whether Article 22(1) lays down a right or a general prohibition.<br />
<br />
On the one hand, if the provision is interpreted as a right, the data subject would have to actively exercise that right in order to be protected from the types of impactful decisions that Article 22 deals with. One of the arguments invoked by proponents of this approach is a strictly literal interpretation, which points to the use of the word ‘right’ in Article 22.<ref>Lee A. Bygrave, ‘Article 22. Automated individual decision-making, including profiling’ in Christopher Kuner, Lee A. Bygrave, Christopher Docksey and Laura Drechsler (eds.), The EU General Data Protection Regulation (GDPR) – A Commentary (Oxford University Press), 531.</ref><br />
<br />
Another argument in favour of interpreting the provision as a right concerns the level of ex ante protection for data subjects. This line of reasoning concedes that interpreting Article 22 as a qualified prohibition would offer data subjects a higher level of ex ante protection against automated decisions, but proposes that comparable protection could in theory also be achieved through the DPIA process. More specifically, some commentators point out that if controllers, based on a self-assessment, conclude that they cannot mitigate the high risk posed by an automated decision system, they must consult the relevant DPA, which could then ban the controller’s system.<ref>Ibid, 531-532.</ref> It is, however, questionable whether this approach could attain a reliable and meaningful level of ex ante protection in practice.<br />
<br />
On the other hand, Article 22(1) can be framed as a general prohibition of decisions falling under Article 22. This interpretation seems more in line with the purpose of the provision, which seeks to protect data subjects from the very possibility of being subject to such decisions. Confirmation may be found in the Article 29 Working Party Guidelines on Automated individual decision-making and Profiling,<ref>WP29, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, adopted on 3 October 2017; as last revised and adopted on 6 February 2018, 17/EN, WP 251 rev.01, 19.</ref> which state that:<br />
<br />
“''The term “right” in the provision does not mean that Article 22(1) applies only when actively invoked by the data subject. Article 22(1) establishes a general prohibition for decision-making based solely on automated processing. This prohibition applies whether or not the data subject takes an action regarding the processing of their personal data''”.<br />
<br />
The Article 29 Working Party supports this argument by relying on the principles of the GDPR and its aim of giving data subjects control over their personal data. Furthermore, the WP29 refers to Recital 71, which implies that decisions under Article 22(1) are generally not allowed, by contrasting them with decisions regulated by Article 22(2), which should, “however”, be allowed. The EDPB, which endorsed the WP29 guidelines, has not departed from this reading of Article 22 as a qualified prohibition.<br />
<br />
===Scope===<br />
The title of Article 22 mentions automated “individual” decision-making, which indicates that the scope of the provision does not extend to decisions or processing operations which produce effects for groups of data subjects. Following this reasoning, Article 22 would not apply where decisions affect multiple data subjects or groups of individuals connected by common characteristics, such as age, gender, or postal code. However, considering the realities of machine learning and Big Data, pertinent arguments have also been made in support of the view that Article 22 should apply to group decisions as well.<ref>See Michael Veale and Lilian Edwards, ‘Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling’ (2018) 34 Computer Law & Security Review, 402; In addition, another example could be that of individuals classified by categories such as ‘man/woman’, ‘married/single/divorced’, or ‘high/low income’. Solely automated decisions could then be taken for all ‘divorced low-income women’, which would render Article 22 inapplicable, despite their potentially significant effects. In such situations, the decisions could be seen as a bundle of individual decisions, which would make it more difficult to circumvent Article 22 and thereby better protect the fundamental rights of data subjects. However, in the absence of any CJEU decisions on this matter so far, it is still disputed whether Article 22(1) should apply to group decisions.</ref><br />
<br />
===Decision based solely on automated processing===<br />
<br />
====Decision====<br />
The first element required to trigger Article 22 is the presence of a ‘decision’, which can be interpreted in a broad sense.<ref>Bygrave (n 2) 532.</ref> Examples of decisions include official acts of public authorities, such as decisions on tax returns,<ref>Maja Brkan, ‘Do algorithms rule the world? Algorithmic decision-making and data protection in the framework of the GDPR and beyond’ (2019) 27 International Journal of Law and Information Technology 2, 102.</ref> as well as automatic refusals of online credit applications or similar decisions in the context of e-recruiting practices.<ref>Recital 71 GDPR.</ref> More generally, a decision can also be understood as a particular attitude or position taken with regard to a person, provided that this position is at least likely to be acted upon.<ref>Isak Mendoza and Lee A. Bygrave, ‘The Right not to be Subject to Automated Decisions based on Profiling’ (2017) University of Oslo Faculty of Law Legal Studies Research Paper Series No. 2017-20, 10-11.</ref> Although there does not seem to be a specific requirement for the decision to be formalised in a particular way, it should at least be distinguishable from other stages of the decision-making process.<br />
====Solely====<br />
<br />
Whether this second element of Article 22(1) is met depends, first of all, on whether human intervention is even possible from a technical perspective, or whether the decision-making process is constructed in a solely algorithmic way with no room for human involvement.<br />
<br />
If the process does at least technically allow for human intervention, then it must be assessed whether the action undertaken by a human is “meaningful” or merely a procedural “token gesture”.<ref>WP29 (n 4) 21.</ref> In order to meet this criterion, the intervention must be “carried out by someone who has the authority and competence to change the decision”. Furthermore, the human involved must not only have the power to change the decision, but actually exercise this competence by ”consider[ing] all the relevant data” and verifying the substance and correctness of the machine-generated decision.<ref>Ibid 8; Bygrave (n 2) 533.</ref><br />
====Automated processing====<br />
<br />
The ‘automated processing’ criterion in Article 22(1) relates to the final stage of the processing, which results in a solely automated decision based on already existing data. By contrast, the methods of collecting the initial data need not be automated, but can be semi-automated or even manual.<ref>Bygrave (n 2) 533.</ref> Furthermore, the automated decision need not be based entirely on personal data relating to the person affected by the decision. Instead, the data basis can also include non-personal data, or personal data relating to other individuals.<br />
===Legal or similarly significant effect===<br />
====Legal effects====<br />
<br />
A decision has legal effects on a data subject when it is binding and affects the person’s legal rights or interests. Examples include the cancellation of a contract, the decision of a tax authority on an individual’s tax return, or the denial of a social benefit granted by law.<ref>WP29 (n 4) 21; Brkan (n 7) 102.</ref><br />
====Similarly significant effects====<br />
<br />
In principle, satisfying this criterion means that the impacts of the decision must be sufficiently great despite not changing the legal position of the individual. While it can be difficult to establish this in practice, according to the WP29 some guiding criteria can be the potential of a decision to:<br />
<br />
*significantly affect the circumstances, behaviour or choices of the individuals concerned<br />
*have a prolonged or permanent impact on the data subject; or<br />
*at its most extreme, lead to the exclusion or discrimination of individuals.<br />
<br />
The WP29 further gives examples of decisions which may similarly significantly affect data subjects, namely decisions that:<br />
<br />
*affect someone’s financial circumstances, such as their eligibility for credit;<br />
*affect someone’s access to health services;<br />
*deny someone an employment opportunity or put them at a serious disadvantage;<br />
*affect someone’s access to education, for example university admissions.<br />
<br />
In any case, the decision should have more than a trivial effect, impacting someone’s position in relation to other persons or their access to a service or opportunity. For example, Recital 71 mentions the “automatic refusal of an online credit application or e-recruiting practices without any human intervention”.<br />
<br />
Decisions resulting in targeted advertising based on profiling could also similarly significantly affect individuals, for example when someone is targeted with high-interest loans because they are known to be in financial difficulties and are particularly likely to accept such offers. In this context, the WP29 lays down a non-exhaustive list of characteristics that can be decisive in the assessment of each case, such as:<br />
<br />
*the intrusiveness of the profiling process, including the tracking of individuals across different websites, devices and services;<br />
*the expectations and wishes of the individuals concerned;<br />
*the way the advert is delivered; or<br />
*using knowledge of the vulnerabilities of the data subjects targeted.<br />
<br />
===Exceptions===<br />
====Contract====<br />
<br />
The first exception from the prohibition laid down in Article 22(1) applies where the decision is necessary for entering into, or the performance of, a contract between the data subject and the controller.<br />
<br />
It is unclear how the ‘necessity’ criterion should be interpreted. A strictly textual interpretation would lead to most examples of solely automated decisions not being considered necessary. For example, while assessing an individual’s credit risk is necessary for a bank in order to protect its investments, algorithmic credit scoring is not in itself necessary, since an assessment can also be carried out by humans. As different arguments could be made for the threshold and meaning of the ‘necessity’ criterion, this aspect will need to be clarified by the courts.<br />
<br />
Furthermore, an analysis of Articles 6(1)(b) and 22(2)(a) seems to indicate a difference in the scope of the two provisions with regards to the necessity of the processing for entering into a contract. Whereas for Article 6(1)(b) the processing would have to be necessary in order to take steps at the request of the data subject, Article 22(2)(a) does not mention any request from the data subject. Therefore, the scope of application of Article 22(2)(a) seems to be wider in this regard.<br />
<br />
In any case, the application of Article 22(2)(a) is always subject to the presence of “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision”. [Article 22(3) GDPR.]<br />
<br />
====Authorised by law====<br />
<br />
The second exemption in Article 22(2) is also subject to the presence of “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests”. However, it seems that such measures do not necessarily need to be the same as those foreseen by Article 22(3). Instead, Member States have discretion in this respect.<br />
<br />
====Explicit consent====<br />
<br />
The wording of Article 22(2)(c) (“explicit consent”) results in the same standard for this requirement as with Article 9(2)(a).<br />
<br />
Particular attention must be given to whether consent is freely given in the context of entering into, or the performance of, a contract.<ref>See also Articles 4(11) and 7(4), as well as Recital 43.</ref> Indeed, discussions on whether consent is freely given, including in line with Articles 9(2) and 22(2)(c), will lead to assessing the necessity element. In this context, it must be assessed whether the decision is necessary for the “performance of a contract, including the provision of a service”, as mandated by Article 7(4). However, this provision does not mention entering into a contract, which would seem to exclude examples such as online credit applications, where the algorithmic decision occurs in order to enter into the contract and not to perform it.<br />
<br />
Furthermore, issues could arise with decisions based on profiling, where a data subject might have given their consent to the profiling, for example by accepting a cookie, but not to the decision resulting from it. In this context, data subjects might not even be aware of the solely automated decision occurring, and in any case the consent to the profiling would not be considered as satisfying the requirements of Article 22(2)(c).<ref>Brkan (n 7) 106.</ref><br />
<br />
Finally, decisions based on explicit consent are also subject to the safeguards laid down in Article 22(3).<br />
<br />
===Safeguards===<br />
<br />
A crucial point worth emphasizing is that Article 22(3) lays down a non-exhaustive list of safeguards, which should always be available to data subjects. This leaves the door open for additional safeguards, such as the heavily disputed potential ‘right to explanation’ mentioned in Recital 71.<br />
<br />
However, clarification is still needed as to how the safeguards already mentioned in Article 22(3) can be operationalized and what their outcome will be. On the operational side, it is questionable how some systems would even allow for human intervention in practice, for example when the website or platform does not technically provide for it. With regard to the legal consequences, it is not clear whether the data subject expressing their views or contesting a decision would lead to the decision being annulled.<ref>Brkan (n 7) 108.</ref><br />
<br />
===Qualified prohibition of using special categories of data===<br />
Explicit consent in the context of Article 22(4) should be interpreted in a similar manner as under Article 22(2)(c). With regard to the “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests”, these seem to have the same scope and interpretation as under Article 22(3).<br />
<br />
==Decisions==<br />
→ You can find all related decisions in [[:Category:Article 22 GDPR]]<br />
<br />
==References==<br />
<references /><br />
[[Category:GDPR Articles]]<br />
</div>
<div>
<br />
==Legal Text==<br />
'''Article 22 - Automated individual decision-making, including profiling'''<br />
<br />
<span id="1">1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.</span><br />
<br />
<span id="2">2. Paragraph 1 shall not apply if the decision:</span><br />
<br />
<span id="2a">(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;</span><br />
<br />
<span id="2b">(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or</span><br />
<br />
<span id="2c">(c) is based on the data subject's explicit consent.</span><br />
<br />
<span id="3">3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.</span><br />
<br />
<span id="4">4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.</span><br />
<br />
==Relevant Recitals==<br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 4:''' The processing of personal data should be designed to serve mankind </div><br />
<div class="mw-collapsible-content"><br />
The processing of personal data should be designed to serve mankind. The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality. This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter as enshrined in the Treaties, in particular the respect for private and family life, home and communications, the protection of personal data, freedom of thought, conscience and religion, freedom of expression and information, freedom to conduct a business, the right to an effective remedy and to a fair trial, and cultural, religious and linguistic diversity.<br />
</div></div><span id="r24"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 24:''' Monitoring of data subject's behaviour </div><br />
<div class="mw-collapsible-content"><br />
The processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union should also be subject to this Regulation when it is related to the monitoring of the behaviour of such data subjects in so far as their behaviour takes place within the Union. In order to determine whether a processing activity can be considered to monitor the behaviour of data subjects, it should be ascertained whether natural persons are tracked on the internet including potential subsequent use of personal data processing techniques which consist of profiling a natural person, particularly in order to take decisions concerning her or him or for analysing or predicting her or his personal preferences, behaviours and attitudes.<br />
</div></div><br />
<br />
<span id="r38"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 38:''' Specific protection of children </div><br />
<div class="mw-collapsible-content"><br />
Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.<br />
</div></div><br />
<br />
<span id="r60"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 60:''' Informing data subjects about profiling </div><br />
<div class="mw-collapsible-content"><br />
The principles of fair and transparent processing require that the data subject be informed of the existence of the processing operation and its purposes. The controller should provide the data subject with any further information necessary to ensure fair and transparent processing taking into account the specific circumstances and context in which the personal data are processed. Furthermore, the data subject should be informed of the existence of profiling and the consequences of such profiling. Where the personal data are collected from the data subject, the data subject should also be informed whether he or she is obliged to provide the personal data and of the consequences, where he or she does not provide such data. That information may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable.<br />
</div></div><br />
<br />
<span id="r63"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 63:''' Access to meaningful information </div><br />
<div class="mw-collapsible-content"><br />
A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing. This includes the right for data subjects to have access to data concerning their health, for example the data in their medical records containing information such as diagnoses, examination results, assessments by treating physicians and any treatment or interventions provided. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing and, at least when based on profiling, the consequences of such processing. Where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data. That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software. However, the result of those considerations should not be a refusal to provide all information to the data subject. Where the controller processes a large quantity of information concerning the data subject, the controller should be able to request that, before the information is delivered, the data subject specify the information or processing activities to which the request relates.<br />
</div></div><br />
<br />
<span id="r71"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 71:''' Automated processing/decision-making, including profiling </div><br />
<div class="mw-collapsible-content"><br />
The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child.<br />
In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or that result in measures having such an effect. Automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.<br />
</div></div><br />
<br />
<span id="r72"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 72:''' EDPB Guidance on profiling </div><br />
<div class="mw-collapsible-content"><br />
Profiling is subject to the rules of this Regulation governing the processing of personal data, such as the legal grounds for processing or data protection principles. The European Data Protection Board established by this Regulation (the ‘Board’) should be able to issue guidance in that context.<br />
</div></div><br />
<br />
<span id="r91"><br />
<div class="toccolours mw-collapsible mw-collapsed" style="border-width: 0px" overflow:auto;"><div>'''Recital 91:''' DPIA necessary in case of systematic and extensive evaluation of personal aspects </div><br />
<div class="mw-collapsible-content"><br />
This should in particular apply to large-scale processing operations which aim to process a considerable amount of personal data at regional, national or supranational level and which could affect a large number of data subjects and which are likely to result in a high risk, for example, on account of their sensitivity, where in accordance with the achieved state of technological knowledge a new technology is used on a large scale as well as to other processing operations which result in a high risk to the rights and freedoms of data subjects, in particular where those operations render it more difficult for data subjects to exercise their rights. A data protection impact assessment should also be made where personal data are processed for taking decisions regarding specific natural persons following any systematic and extensive evaluation of personal aspects relating to natural persons based on profiling those data or following the processing of special categories of personal data, biometric data, or data on criminal convictions and offences or related security measures. A data protection impact assessment is equally required for monitoring publicly accessible areas on a large scale, especially when using optic-electronic devices or for any other operations where the competent supervisory authority considers that the processing is likely to result in a high risk to the rights and freedoms of data subjects, in particular because they prevent data subjects from exercising a right or using a service or a contract, or because they are carried out systematically on a large scale. The processing of personal data should not be considered to be on a large scale if the processing concerns personal data from patients or clients by an individual physician, other health care professional or lawyer. In such cases, a data protection impact assessment should not be mandatory<br />
</div></div><br />
<br />
==Commentary==<br />
<br />
===Overview===<br />
Article 22 has its roots in Articles 15 and 12(a) of the 95/46/EC Data Protection Directive (DPD). One of the main differences is that the GDPR provision has a broader scope of application, since it applies to “automated processing, including profiling”, whereas the DPD provision was only applicable if a form of profiling was involved.<ref>Article 20 of GDPR proposal, COM(2012) 11 final <https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0011:FIN:EN:PDF></ref> A similar difference can be noticed between the initial proposal of the Commission, in which the provision was titled “Measures based on profiling”, and the final wording of the GDPR, which extends the scope to include automated decisions which are not based on profiling. Furthermore, in contrast to the DPD, Article 22(4) GDPR also explicitly addresses the use of sensitive data by laying down a qualified prohibition of decisions based on the categories of data listed under Article 9(1).<br />
<br />
===Right or prohibition?===<br />
<br />
There have been conflicting arguments on whether Article 22(1) lays down a right or a general prohibition.<br />
<br />
On the one hand, if the provision is interpreted as a right, then the data subject would have to actively exercise the right in order to be protected from the types of impactful decisions that Article 22 deals with. One of the arguments invoked by proponents of this approach is a strictly literal interpretation which looks at the use of the word ‘right’ in Article 22.<ref>Lee A. Bygrave, ‘Article 22. Automated individual decision-making, including profiling’ in Christopher Kuner, Lee A. Bygrave, Christopher Docksey and Laura Drechsler (eds), The EU General Data Protection Regulation (GDPR) – A Commentary (Oxford University Press), 531.</ref><br />
<br />
Another argument in favour of interpreting the provision as a right is related to the level of ex ante protection for data subjects. This line of reasoning recognises that interpreting Article 22 as a qualified prohibition would indeed offer data subjects a higher level of ex ante protections against automated decisions. However, this line of argumentation then proposes that such protections could theoretically also be offered in connection with the DPIA process. More specifically, some commentators point to the fact that if controllers, based on a self-assessment, are of the opinion that they cannot mitigate the high risk level of an automated decision system, they would have to consult the relevant DPA which could then ban the controller’s automated decision system.<ref>Ibid, 531-532.</ref> It is, however, questionable whether in practice this approach could attain a reliable and meaningful level of ex ante protection for data subjects.<br />
<br />
On the other hand, Article 22(1) can be framed as a general prohibition of decisions subject to Article 22. This interpretation seems more in line with the purpose of the provision, which seeks to protect data subjects from a general possibility of being subject to decisions covered by Article 22. A confirmation may be found in the Article 29 Working Party Guidelines on Automated individual decision-making and Profiling,<ref>WP29, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, adopted on 3 October 2017; as last revised and adopted on 6 February 2018, 17/EN, WP 251 rev.01, 19.</ref> which state that:<br />
<br />
“''The term “right” in the provision does not mean that Article 22(1) applies only when actively invoked by the data subject. Article 22(1) establishes a general prohibition for decision-making based solely on automated processing. This prohibition applies whether or not the data subject takes an action regarding the processing of their personal data''”.<br />
<br />
The Article 29 Working Party supports this argument by relying on the principles of the GDPR and its aim to give data subjects control over their personal data. Furthermore, the WP29 makes a reference to Recital 71, which implies that decisions under Article 22(1) are generally not allowed, by contrasting these with decisions regulated by Article 22(2), which should, “however”, be allowed. The EDPB has not departed from the WP29’s reading of Article 22 as a qualified prohibition.<br />
===Scope===<br />
The title of Article 22 mentions automated “individual” decision making, which indicates that the scope of the provision is not extended to decisions or processing operations which produce effects for groups of data subjects. Following this reasoning, Article 22 would not be applicable if decisions affect multiple data subjects or groups of individuals connected by common characteristics, such as age, gender, or postal code. However, considering the realities of machine learning and Big Data, pertinent arguments have also been made supporting the view that Article 22 should apply to group decisions as well.<ref>See Michael Veale and Lilian Edwards, ‘Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling’ (2018) 34 Computer Law & Security Review, 402; In addition, another example could be that of individuals classified by categories such as ‘man/woman’, ‘married/single/divorced’, or ‘high/low income’. Solely automated decisions could then be taken for all ‘divorced low-income women’ which would render Article 22 inapplicable, despite their potentially significant effects. In such situations, the decisions could be seen as a bundle of individual decisions, which would make it more difficult to circumvent Article 22, and thereby better protect the fundamental rights of data subjects. However, in the absence of any CJEU decisions on this matter so far, it is still disputed whether Article 22(1) should apply to group decisions or not.</ref><br />
<br />
===Decision based solely on automated processing===<br />
<br />
====Decision====<br />
The first element required to trigger Article 22 is the presence of a ‘decision’, which can be interpreted in a broad sense.<ref>Bygrave (n 2) 532.</ref> Examples of decisions include official acts of public authorities, such as decisions on tax returns,<ref>Maja Brkan, ‘Do algorithms rule the world? Algorithmic decision-making and data protection in the framework of the GDPR and beyond’ (2019) 27 International Journal of Law and Information Technology 2, 102.</ref> as well as automatic refusals of online credit applications or similar decisions in the context of e-recruiting practices.<ref>Recital 71 GDPR.</ref> In a more general sense, a decision could also be seen as a particular attitude or position taken with regard to a person, provided that this position is at least likely to be acted upon.<ref>Isak Mendoza and Lee A. Bygrave, ‘The Right not to be Subject to Automated Decisions based on Profiling’ (2017) University of Oslo Faculty of Law Legal Studies Research Paper Series No 2017-20, 10-11.</ref> Although there does not seem to be a specific requirement for the decision to be formalised in a particular way, it should at least be distinguishable from other stages of the (decision-making) process.<br />
====Solely====<br />
<br />
This second element of Article 22(1) first of all depends on whether human intervention is even possible from a technical perspective, or whether the decision-making process is constructed in a purely algorithmic way that leaves no room for human involvement.<br />
<br />
If the process does at least technically allow for human intervention, then it must be assessed whether the action undertaken by a human is “meaningful” or merely a procedural “token gesture”.<ref>WP29 (n 4) 21.</ref> In order to meet this criterion, the intervention must be “carried out by someone who has the authority and competence to change the decision”. Furthermore, the human involved must not only have the power to change the decision, but actually exercise this competence by “consider[ing] all the relevant data” and verifying the substance and correctness of the machine-generated decision.<ref>Ibid 8; Bygrave (n 2) 533.</ref><br />
====Automated processing====<br />
<br />
The ‘automated processing’ criterion in Article 22(1) relates to the final stage of the processing, which results in a solely automated decision based on already existing data. By contrast, the methods used to collect the initial data do not themselves have to be fully automated, but can be semi-automated or even manual.<ref>Bygrave (n 2) 533.</ref> Furthermore, the automated decision need not be based entirely on personal data relating to the person affected by it. Instead, the data basis can also include non-personal data, or personal data relating to other individuals.<br />
===Legal or similarly significant effect===<br />
====Legal effects====<br />
<br />
A decision has legal effects on a data subject when it is binding and affects the person’s legal rights or interests. Examples include the cancellation of a contract, the decision of a tax authority on an individual’s tax return, or the denial of a social benefit granted by law.<ref>WP29 (n 4) 21, Brkan (n 7) 102.</ref><br />
====Similarly significant effects====<br />
<br />
In principle, satisfying this criterion means that the impacts of the decision must be sufficiently great despite not changing the legal position of the individual. While it can be difficult to establish this in practice, according to the WP29 some guiding criteria can be the potential of a decision to:<br />
<br />
*significantly affect the circumstances, behaviour or choices of the individuals concerned<br />
*have a prolonged or permanent impact on the data subject; or<br />
*at its most extreme, lead to the exclusion or discrimination of individuals.<br />
<br />
The WP29 further gives examples of decisions which can similarly significantly affect data subjects, when these:<br />
<br />
*affect someone’s financial circumstances, such as their eligibility to credit;<br />
*affect someone’s access to health services;<br />
*deny someone an employment opportunity or put them at a serious disadvantage;<br />
*affect someone’s access to education, for example university admissions.<br />
<br />
In any case, the decision should have more than a trivial effect which impacts someone’s position in relation to other persons or to their access to a service or opportunity. For example, Recital 71 mentions the “automatic refusal of an online credit application or e-recruiting practices without human intervention”.<br />
<br />
Decisions resulting in targeted advertisement based on profiling could also similarly significantly affect individuals, for example when someone is targeted with high interest loans because they are known to be in financial difficulties and are particularly susceptible to accept such offers. In this context, the WP29 lays down a non-exhaustive list of characteristics that can be decisive in the assessment of each case, such as:<br />
<br />
*the intrusiveness of the profiling process, including the tracking of individuals across different websites, devices and services;<br />
*the expectations and wishes of the individuals concerned;<br />
*the way the advert is delivered; or<br />
*using knowledge of the vulnerabilities of the data subjects targeted.<br />
<br />
===Exceptions===<br />
====Contract====<br />
<br />
The first exception from the prohibition laid down in Article 22(1) is if the decision is necessary for entering into, or performance of, a contract between the data subject and the controller.<br />
<br />
It is unclear how the ‘necessity’ criterion should be interpreted. A strictly textual interpretation would lead to most examples of solely automated decisions not being considered necessary. For example, while assessing an individual’s credit risk is necessary for a bank in order to protect its investments, algorithmic credit scoring is not in itself necessary, since such an assessment can also be carried out by humans. As different arguments could be made about the threshold and meaning of the ‘necessity’ criterion, this aspect will need to be clarified by the courts.<br />
<br />
Furthermore, an analysis of Articles 6(1)(b) and 22(2)(a) seems to indicate a difference in the scope of the two provisions with regards to the necessity of the processing for entering into a contract. Whereas for Article 6(1)(b) the processing would have to be necessary in order to take steps at the request of the data subject, Article 22(2)(a) does not mention any request from the data subject. Therefore, the scope of application of Article 22(2)(a) seems to be wider in this regard.<br />
<br />
In any case, the application of Article 22(2)(a) is always subject to the presence of “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision”. [Article 22(3) GDPR.]<br />
<br />
====Authorised by law====<br />
<br />
The second exception in Article 22(2) is also subject to the presence of “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests”. However, it seems that such measures do not necessarily need to be the same as those foreseen by Article 22(3). Instead, Member States have discretion in this respect.<br />
<br />
====Explicit consent====<br />
<br />
The wording of Article 22(2)(c) (“explicit consent”) results in the same standard for this requirement as with Article 9(2)(a).<br />
<br />
Particular attention must be given to whether such consent is freely given in the context of entering into or performing a contract.<ref>See also Articles 4(11) and 7(4), as well as Recital 43.</ref> Indeed, discussions on whether consent is freely given, including in line with Articles 9(2)(a) and 22(2)(c), will lead to assessing the necessity element. In this context, it must be assessed whether the decision is necessary for the “performance of a contract, including the provision of a service” as mandated by Article 7(4). However, this provision does not mention entering into a contract, which would seem to exclude examples such as online credit applications, where the algorithmic decision occurs in order to enter into the contract and not to perform it.<br />
<br />
Furthermore, issues could arise with decisions based on profiling, where a data subject might have given their consent to the profiling, for example by accepting a cookie, but not to the decision resulting from it. In this context, data subjects might not even be aware of the solely automated decision occurring, and in any case the consent to the profiling would not be considered as satisfying the requirements of Article 22(2)(c).<ref>Brkan (n 7) 106.</ref><br />
<br />
Finally, decisions based on explicit consent are also subject to the safeguards laid down in Article 22(3).<br />
<br />
===Safeguards===<br />
<br />
A crucial point worth emphasizing is that Article 22(3) lays down a non-exhaustive list of safeguards which should always be available to data subjects. This leaves the door open for additional safeguards, such as the heavily disputed potential ‘right to explanation’ mentioned in Recital 71.<br />
<br />
However, clarifications are still needed as to how the safeguards already mentioned in Article 22(3) can be operationalized and what their outcome will be. On the operational side, it is questionable how some systems would even allow for human intervention in practice, for example when the website or platform does not technically permit it. With regard to the legal consequences, it is not clear whether the data subject expressing their views or contesting a decision would lead to the decision being annulled.<ref>Brkan (n 7) 108.</ref><br />
<br />
===Qualified prohibition of using special categories of data===<br />
Explicit consent in the context of Article 22(4) should be interpreted in the same manner as under Article 22(2)(c). With regard to the “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests”, these seem to have the same scope and interpretation as under Article 22(3).<br />
<br />
==Decisions==<br />
→ You can find all related decisions in [[:Category:Article 22 GDPR]]<br />
<br />
==References==<br />
<references /><br />
<br />
[[Category:GDPR Articles]]</div>