Article 22 GDPR

Legal Text

Article 22 - Automated individual decision-making, including profiling

1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

2. Paragraph 1 shall not apply if the decision:

(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or

(c) is based on the data subject's explicit consent.

3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place.

Relevant Recitals

Recital 4: The processing of personal data should be designed to serve mankind

The processing of personal data should be designed to serve mankind. The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality. This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter as enshrined in the Treaties, in particular the respect for private and family life, home and communications, the protection of personal data, freedom of thought, conscience and religion, freedom of expression and information, freedom to conduct a business, the right to an effective remedy and to a fair trial, and cultural, religious and linguistic diversity.

Recital 24: Monitoring of data subject's behaviour

The processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union should also be subject to this Regulation when it is related to the monitoring of the behaviour of such data subjects in so far as their behaviour takes place within the Union. In order to determine whether a processing activity can be considered to monitor the behaviour of data subjects, it should be ascertained whether natural persons are tracked on the internet including potential subsequent use of personal data processing techniques which consist of profiling a natural person, particularly in order to take decisions concerning her or him or for analysing or predicting her or his personal preferences, behaviours and attitudes.

Recital 38: Specific protection of children

Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.

Recital 60: Informing data subjects about profiling

The principles of fair and transparent processing require that the data subject be informed of the existence of the processing operation and its purposes. The controller should provide the data subject with any further information necessary to ensure fair and transparent processing taking into account the specific circumstances and context in which the personal data are processed. Furthermore, the data subject should be informed of the existence of profiling and the consequences of such profiling. Where the personal data are collected from the data subject, the data subject should also be informed whether he or she is obliged to provide the personal data and of the consequences, where he or she does not provide such data. That information may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable.

Recital 63: Access to meaningful information

A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing. This includes the right for data subjects to have access to data concerning their health, for example the data in their medical records containing information such as diagnoses, examination results, assessments by treating physicians and any treatment or interventions provided. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing and, at least when based on profiling, the consequences of such processing. Where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data. That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software. However, the result of those considerations should not be a refusal to provide all information to the data subject. Where the controller processes a large quantity of information concerning the data subject, the controller should be able to request that, before the information is delivered, the data subject specify the information or processing activities to which the request relates.

Recital 71: Automated processing/decision-making, including profiling

The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject's performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her. However, decision-making based on such processing, including profiling, should be allowed where expressly authorised by Union or Member State law to which the controller is subject, including for fraud and tax-evasion monitoring and prevention purposes conducted in accordance with the regulations, standards and recommendations of Union institutions or national oversight bodies and to ensure the security and reliability of a service provided by the controller, or necessary for the entering or performance of a contract between the data subject and a controller, or when the data subject has given his or her explicit consent. In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. Such measure should not concern a child. In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or that result in measures having such an effect. Automated decision-making and profiling based on special categories of personal data should be allowed only under specific conditions.

Recital 72: EDPB Guidance on profiling

Profiling is subject to the rules of this Regulation governing the processing of personal data, such as the legal grounds for processing or data protection principles. The European Data Protection Board established by this Regulation (the ‘Board’) should be able to issue guidance in that context.

Recital 91: DPIA necessary in case of systematic and extensive evaluation of personal aspects

This should in particular apply to large-scale processing operations which aim to process a considerable amount of personal data at regional, national or supranational level and which could affect a large number of data subjects and which are likely to result in a high risk, for example, on account of their sensitivity, where in accordance with the achieved state of technological knowledge a new technology is used on a large scale as well as to other processing operations which result in a high risk to the rights and freedoms of data subjects, in particular where those operations render it more difficult for data subjects to exercise their rights. A data protection impact assessment should also be made where personal data are processed for taking decisions regarding specific natural persons following any systematic and extensive evaluation of personal aspects relating to natural persons based on profiling those data or following the processing of special categories of personal data, biometric data, or data on criminal convictions and offences or related security measures. A data protection impact assessment is equally required for monitoring publicly accessible areas on a large scale, especially when using optic-electronic devices or for any other operations where the competent supervisory authority considers that the processing is likely to result in a high risk to the rights and freedoms of data subjects, in particular because they prevent data subjects from exercising a right or using a service or a contract, or because they are carried out systematically on a large scale. The processing of personal data should not be considered to be on a large scale if the processing concerns personal data from patients or clients by an individual physician, other health care professional or lawyer. In such cases, a data protection impact assessment should not be mandatory.

Commentary

Overview

Article 22 has its roots in Articles 15 and 12(a) of the Data Protection Directive 95/46/EC (DPD). One of the main differences is that the GDPR provision seems to have a broader scope of application, since it applies to “automated processing, including profiling”, whereas the DPD provision applied only where a form of profiling was involved.[1] A similar difference can be noticed between the initial proposal of the Commission, in which the provision was titled “Measures based on profiling”, and the final wording of the GDPR, which extends the scope to include automated decisions that are not based on profiling. Furthermore, in contrast to the DPD, Article 22(4) GDPR also explicitly addresses the use of sensitive data by laying down a qualified prohibition of decisions based on the categories of data listed under Article 9(1).

Right or prohibition?

There have been conflicting arguments on whether Article 22(1) lays down a right or a general prohibition.

On the one hand, if the provision is interpreted as a right, then the data subject would have to actively exercise the right in order to be protected from the types of impactful decisions that Article 22 deals with. One of the arguments invoked by proponents of this approach is a strictly literal interpretation which looks at the use of the word ‘right’ in Article 22.[2]

Another argument in favour of interpreting the provision as a right relates to the level of ex ante protection for data subjects. This line of reasoning recognises that interpreting Article 22 as a qualified prohibition would indeed offer data subjects a higher level of ex ante protection against automated decisions, but proposes that such protection could in theory also be achieved through the DPIA process. More specifically, some commentators point out that if controllers, based on a self-assessment, take the view that they cannot mitigate the high risk posed by an automated decision system, they must consult the relevant DPA, which could then ban the controller’s automated decision system.[3] It is, however, questionable whether this approach could in practice attain a reliable and meaningful level of ex ante protection for data subjects.

On the other hand, Article 22(1) can be framed as a general prohibition of decisions covered by Article 22. This interpretation seems more in line with the purpose of the provision, which seeks to protect data subjects from the general possibility of being subject to such decisions. Confirmation may be found in the Article 29 Working Party Guidelines on Automated individual decision-making and Profiling,[4] which state that:

“The term ‘right’ in the provision does not mean that Article 22(1) applies only when actively invoked by the data subject. Article 22(1) establishes a general prohibition for decision-making based solely on automated processing. This prohibition applies whether or not the data subject takes an action regarding the processing of their personal data.”

The Article 29 Working Party supports this argument by relying on the principles of the GDPR and its aim of giving data subjects control over their personal data. Furthermore, the WP29 refers to Recital 71, which implies that decisions under Article 22(1) are generally not allowed, by contrasting them with decisions regulated by Article 22(2), which should, “however”, be allowed. The EDPB, which endorsed the WP29 Guidelines, has not departed from this reading of Article 22 as a qualified prohibition.

Scope

The title of Article 22 mentions automated “individual” decision-making, which indicates that the scope of the provision does not extend to decisions or processing operations which produce effects for groups of data subjects. Following this reasoning, Article 22 would not be applicable where decisions affect multiple data subjects or groups of individuals connected by common characteristics, such as age, gender, or postal code. However, considering the realities of machine learning and Big Data, pertinent arguments have also been made supporting the view that Article 22 should apply to group decisions as well.[5]

Decision based solely on automated processing

Decision

The first element required to trigger Article 22 is the presence of a ‘decision’, which can be interpreted in a broad sense.[6] Examples of decisions include official acts of public authorities, such as decisions on tax returns,[7] as well as automatic refusals of online credit applications or similar decisions in the context of e-recruiting practices.[8] In a more general sense, a decision could also be seen as a particular attitude or position taken with regard to a person, provided that this position is at least likely to be acted upon.[9] Although there does not seem to be a specific requirement for the decision to be formalised in a particular way, it should at least be distinguishable from other stages of the (decision-making) process.

Solely

This second element of Article 22(1) depends first of all on whether human intervention is even possible from a technical perspective, or whether the decision-making process is constructed in a purely algorithmic way with no room for human involvement.

If the process does at least technically allow for human intervention, it must be assessed whether the action undertaken by a human is “meaningful” or merely a procedural “token gesture”.[10] To qualify as meaningful, the intervention must be “carried out by someone who has the authority and competence to change the decision”. Furthermore, the human involved must not only have the power to change the decision, but actually exercise this competence by “consider[ing] all the relevant data” and verifying the substance and correctness of the machine-generated decision.[11]

Automated processing

The ‘automated processing’ criterion in Article 22(1) relates to the final stage of the processing, which results in a solely automated decision based on already existing data. By contrast, the methods of collecting the initial data need not be automated but can be semi-automated or even manual.[12] Furthermore, the automated decision need not be based entirely on personal data relating to the person affected by the decision. Instead, the data basis can also include non-personal data, or personal data relating to other individuals.

Legal or similarly significant effect

Legal effects

A decision has legal effects on a data subject when it is binding and affects the person’s legal rights or interests. Examples include the cancellation of a contract, the decision of a tax authority on an individual’s tax return, or the denial of a social benefit granted by law.[13]

Similarly significant effects

In principle, satisfying this criterion means that the impact of the decision must be sufficiently significant even though it does not change the legal position of the individual. While this can be difficult to establish in practice, according to the WP29 some guiding criteria can be the potential of a decision to:

  • significantly affect the circumstances, behaviour or choices of the individuals concerned;
  • have a prolonged or permanent impact on the data subject; or
  • at its most extreme, lead to the exclusion or discrimination of individuals.

The WP29 further gives examples of decisions which can similarly significantly affect data subjects, where they:

  • affect someone’s financial circumstances, such as their eligibility to credit;
  • affect someone’s access to health services;
  • deny someone an employment opportunity or put them at a serious disadvantage;
  • affect someone’s access to education, for example university admissions.

In any case, the decision must have more than a trivial effect, impacting someone’s position in relation to other persons or their access to a service or opportunity. For example, Recital 71 mentions the “automatic refusal of an online credit application or e-recruiting practices without any human intervention”.

Decisions resulting in targeted advertising based on profiling could also similarly significantly affect individuals, for example when someone is targeted with high-interest loans because they are known to be in financial difficulties and particularly likely to accept such offers. In this context, the WP29 lays down a non-exhaustive list of factors that can be decisive in the assessment of each case, such as:

  • the intrusiveness of the profiling process, including the tracking of individuals across different websites, devices and services;
  • the expectations and wishes of the individuals concerned;
  • the way the advert is delivered; or
  • using knowledge of the vulnerabilities of the data subjects targeted.
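
Taken together, the elements discussed above (a distinguishable decision, the absence of meaningful human intervention, and legal or similarly significant effects) operate as a cumulative test. For illustration only, the following minimal Python sketch models that cumulative structure; all names and fields are hypothetical assumptions introduced for this example, not statutory terms, and any real assessment turns on the case-by-case analysis described above.

```python
from dataclasses import dataclass

# Illustrative model only: field names are hypothetical, not statutory terms.
@dataclass
class Decision:
    distinct_decision: bool             # a position taken towards a person, likely to be acted upon
    meaningful_human_review: bool       # reviewer has authority AND actually verifies the outcome
    legal_effect: bool                  # e.g. contract cancelled, social benefit denied
    similarly_significant_effect: bool  # e.g. credit refusal, exclusion from an opportunity

def article_22_1_applies(d: Decision) -> bool:
    """Cumulative test sketched from Article 22(1): a decision, based solely
    on automated processing, producing legal or similarly significant effects."""
    solely_automated = not d.meaningful_human_review  # a mere 'token gesture' does not count
    significant = d.legal_effect or d.similarly_significant_effect
    return d.distinct_decision and solely_automated and significant
```

Note that the sketch treats merely token human involvement as no intervention at all, mirroring the WP29’s “meaningful intervention” standard discussed under ‘Solely’ above.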

Exceptions

Contract

The first exception to the prohibition laid down in Article 22(1) applies where the decision is necessary for entering into, or the performance of, a contract between the data subject and the controller.

It is unclear how the ‘necessity’ criterion should be interpreted. A strictly textual interpretation would lead to most examples of solely automated decisions not being considered necessary. For example, while assessing an individual’s credit risk is necessary for a bank in order to protect its investments, algorithmic credit scoring is not in itself necessary, since such an assessment can also be carried out by humans. As different arguments could be made as to the threshold and meaning of the ‘necessity’ criterion, this aspect will need to be clarified by the courts.

Furthermore, an analysis of Articles 6(1)(b) and 22(2)(a) seems to indicate a difference in the scope of the two provisions with regard to the necessity of the processing for entering into a contract. Whereas under Article 6(1)(b) the processing must be necessary in order to take steps at the request of the data subject, Article 22(2)(a) does not mention any request from the data subject. The scope of application of Article 22(2)(a) therefore seems to be wider in this regard.

In any case, the application of Article 22(2)(a) is always subject to the presence of “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision” (Article 22(3) GDPR).

Authorised by law

The second exception, laid down in Article 22(2)(b), is also subject to the presence of “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests”. However, it seems that such measures need not be the same as those foreseen by Article 22(3); Member States have discretion in this respect.

Explicit consent

The wording of Article 22(2)(c) (“explicit consent”) imposes the same standard for this requirement as under Article 9(2)(a).

Particular attention must be given to whether consent is freely given in the context of entering into, or the performance of, a contract.[14] Indeed, discussions on whether consent is freely given, including under Articles 9(2)(a) and 22(2)(c), will lead to an assessment of the necessity element. In this context, it must be assessed whether the decision is necessary for the “performance of a contract, including the provision of a service”, as stipulated by Article 7(4). However, that provision does not mention entering into a contract, which would seem to exclude examples such as online credit applications, where the algorithmic decision occurs in order to enter into the contract and not to perform it.

Furthermore, issues could arise with decisions based on profiling where a data subject might have consented to the profiling, for example by accepting a cookie, but not to the decision resulting from it. In this context, data subjects might not even be aware that a solely automated decision is taking place, and in any case the consent to the profiling would not be considered to satisfy the requirements of Article 22(2)(c).[15]

Finally, decisions based on explicit consent are also subject to the safeguards laid down in Article 22(3).

Safeguards

A crucial point worth emphasising is that Article 22(3) lays down a non-exhaustive list of safeguards which should always be available to data subjects. This leaves the door open for additional safeguards, such as the heavily disputed potential ‘right to explanation’ mentioned in Recital 71.

However, clarification is still needed as to how the safeguards already mentioned in Article 22(3) can be operationalised and what their outcome will be. On the operational side, it is questionable how some systems would allow for human intervention in practice, for example where the website or platform does not technically provide for it. With regard to the legal consequences, it is not clear whether the data subject expressing their views or contesting a decision would lead to the decision being annulled.[16]
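
Read together with the exceptions discussed above, Article 22(2) and (3) give the provision a simple overall shape: the prohibition yields only to one of the three grounds; the contract and explicit-consent grounds always trigger the Article 22(3) minimum safeguards; and the legal-authorisation ground relies on safeguards laid down in the authorising law. Continuing the purely illustrative sketch from earlier (all names and parameters are hypothetical, not statutory terms):

```python
from enum import Enum, auto
from typing import Optional

class Ground(Enum):
    CONTRACT_NECESSITY = auto()   # Article 22(2)(a)
    AUTHORISED_BY_LAW = auto()    # Article 22(2)(b)
    EXPLICIT_CONSENT = auto()     # Article 22(2)(c)

def exception_with_safeguards(ground: Optional[Ground],
                              human_intervention: bool,
                              can_express_view: bool,
                              can_contest: bool,
                              law_lays_down_safeguards: bool = False) -> bool:
    """Sketch of Article 22(2)-(3): the prohibition is lifted only on one of
    the three grounds; for grounds (a) and (c) at least the three Article 22(3)
    safeguards must be in place, while for ground (b) the authorising law must
    itself lay down suitable safeguards."""
    if ground is None:
        return False  # the Article 22(1) prohibition stands
    if ground is Ground.AUTHORISED_BY_LAW:
        return law_lays_down_safeguards
    # Contract necessity or explicit consent: Article 22(3) minimum safeguards
    return human_intervention and can_express_view and can_contest
```

As noted above, Article 22(3) is a non-exhaustive minimum, so a real assessment may require more than these three flags capture.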

Qualified prohibition of using special categories of data

Explicit consent in the context of Article 22(4) should be interpreted in the same manner as under Article 22(2)(c). The “suitable measures to safeguard the data subject's rights and freedoms and legitimate interests” seem to have the same scope and interpretation as under Article 22(3).

Decisions

→ You can find all related decisions in Category:Article 22 GDPR

References

  1. Article 20 of GDPR proposal, COM(2012) 11 final <https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0011:FIN:EN:PDF>.
  2. Lee A. Bygrave, ‘Article 22. Automated individual decision-making, including profiling’ in Christopher Kuner, Lee A. Bygrave, Christopher Docksey and Laura Drechsler (eds.), The EU General Data Protection Regulation (GDPR) – A Commentary (Oxford University Press 2020), 531.
  3. Ibid, 531-532.
  4. WP29, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, adopted on 3 October 2017; as last revised and adopted on 6 February 2018, 17/EN, WP 251 rev.01, 19.
  5. See Michael Veale and Lilian Edwards, ‘Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling’ (2018) 34 Computer Law & Security Review, 402. In addition, another example could be that of individuals classified by categories such as ‘man/woman’, ‘married/single/divorced’, or ‘high/low income’. Solely automated decisions could then be taken for all ‘divorced low-income women’, which would render Article 22 inapplicable despite their potentially significant effects. In such situations, the decisions could be seen as a bundle of individual decisions, which would make it more difficult to circumvent Article 22 and thereby better protect the fundamental rights of data subjects. However, in the absence of any CJEU decisions on this matter so far, it remains disputed whether Article 22(1) should apply to group decisions or not.
  6. Bygrave (n 2) 532.
  7. Maja Brkan, ‘Do algorithms rule the world? Algorithmic decision-making and data protection in the framework of the GDPR and beyond’ (2019) 27 International Journal of Law and Information Technology 2, 102.
  8. Recital 71 GDPR.
  9. Isak Mendoza and Lee A. Bygrave, ‘The Right not to be Subject to Automated Decisions based on Profiling’ (2017) University of Oslo Faculty of Law Legal Studies Research Paper Series No.2017-20, 10-11.
  10. WP29 (n 4) 21.
  11. Ibid 8; Bygrave (n 2) 533.
  12. Bygrave (n 2) 533.
  13. WP29 (n 4) 21; Brkan (n 7) 102.
  14. See also Articles 4(11) and 7(4), as well as Recital 43.
  15. Brkan (n 7) 106.
  16. Brkan (n 7) 108.