Right not to be Subject to Automated Decision-Making

Pursuant to Article 22 GDPR, the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. As explained by Bygrave (2020), the rationale behind this provision lies in the potentially serious repercussions that profiling and other automated processing operations might have on the decision-making process of the data subject.[1] Researchers, for instance, could develop software that processes large amounts of personal data, classifies data subjects accordingly, makes predictions, and determines outcomes that could lead to discrimination when later applied in the context of public administration (e.g., the provision of welfare and health services) or the private sector (e.g., targeted advertising and e-recruitment).

A much-debated question is the nature of Article 22 GDPR. The Article 29 Working Party, on the one hand, interprets this provision as a general prohibition, justifying its reading mainly on the basis of Recital 71, which makes it clear that processing under Article 22 GDPR is generally not allowed.[2] On the other hand, Bygrave and other authors argue that this interpretation runs counter to the actual wording of Article 22 GDPR, as well as to its placement in the structure of the Regulation (namely, Chapter III on data subjects’ rights) and its special consideration in Articles 13.2(f), 14.2(g), 15.1(h), and 35.3(a).[3] Whereas the interpretation of Article 22 GDPR as a prohibition requires the data controller to apply it regardless of any action by the data subject, its interpretation as a right involves its exercise in accordance with the requirements enshrined in Article 12 GDPR, which will also be mentioned below.

Automated decision-making is the ability to make decisions by technological means without human involvement. Automated decisions can be based on any type of data: for example, data provided directly by the individuals concerned (such as responses to a questionnaire); data observed about the individuals (such as location data collected via an application); or derived or inferred data, such as a profile of the individual that has already been created (e.g., a credit score).[4]

Profiling is any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements (see Article 4(4) GDPR).

Although the GDPR does not define the ‘legal’ and ‘similar’ effects arising from automated decision-making, the Article 29 Working Party clarifies that a legal effect requires that the decision, which is based on solely automated processing, affects someone’s legal rights, such as the freedom to associate with others, to vote in an election, or to take legal action. A legal effect may also be something that affects a person’s legal status or their rights under a contract.[5] Examples of legal effects include the termination of a contract, the denial of a social benefit granted by law, and the denial of citizenship or a residence permit. As regards similar effects, the Article 29 Working Party considers them the consequence of decisions that have the potential to significantly affect the circumstances, behaviors or choices of the individual concerned; have a prolonged or permanent impact on the data subject; or lead to the exclusion or discrimination of the individual.[6] This is evident in e-recruitment practices that favor white men over women or people belonging to minority or vulnerable groups.

Pursuant to Article 22.4 GDPR, where special categories of personal data are involved, automated decision-making can take place only on the condition that the data subject has explicitly consented to it or that it is necessary for reasons of substantial public interest provided for by EU or Member State law. In this context, the controller must take all appropriate measures to safeguard the data subject’s rights and freedoms.

As already mentioned, Article 12 GDPR establishes the controller’s obligation to inform the data subject about the existence of automated decision-making. This information should not be limited to the fact that such decision-making occurs: it should also explain the logic involved and the potential consequences for the data subject.[7]

Article 22.2 GDPR provides three exceptions to the prohibition of automated decision-making, namely:

  • The decision is necessary for entering into, or performance of, a contract between the data subject and a controller;
  • The decision is authorized by EU or EU Member State law to which the controller is subject;
  • The decision is based on the data subject’s explicit consent.

Where one of these exceptions applies, the data controller shall implement specific safeguards in addition to the ones generally provided for in Article 12 GDPR. Based on Article 22.3 GDPR, in the cases of the derogations for contract and consent, the data subject retains the right to demand human review of the fully automated decision, in addition to the general safeguards that the data controller must implement to protect the data subject’s fundamental rights, freedoms and legitimate interests. Additionally, in order to guarantee fair and transparent data processing, Recital 71 requires the data controller to:

  • use appropriate mathematical or statistical procedures for the profiling;
  • implement technical and organizational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimized;
  • secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject; and
  • prevent, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation, or processing that results in measures having such an effect.

For these purposes, the implementation of the principle of data protection by design and by default is of the utmost importance.

Furthermore, Recital 91 clarifies that a data protection impact assessment should be carried out in the context of automated decision-making processes whenever the data processing results in decisions regarding specific natural persons following any systematic and extensive evaluation of personal aspects relating to natural persons based on profiling those data, or following the processing of special categories of personal data, biometric data, or data on criminal convictions and offences or related security measures. The provision continues by saying that ‘[a] data protection impact assessment is equally required for monitoring publicly accessible areas on a large scale, especially when using optic-electronic devices or for any other operations where the competent supervisory authority considers that the processing is likely to result in a high risk to the rights and freedoms of data subjects, in particular because they prevent data subjects from exercising a right or using a service or a contract, or because they are carried out systematically on a large scale’.

Checklist for complying with a request not to be subject to automated decision-making

How to comply with all the GDPR obligations:

☐ Does the automated decision-making fall within one of the exceptions laid down in Article 22.2 and, where special categories of personal data are involved, meet the conditions of Article 22.4? If yes, you may proceed with the data processing, subject to appropriate safeguards;

☐ Inform the data subject about the existence of the automated decision-making, including also an explanation of the logic involved and the potential consequences for the data subject.
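The decision flow behind this checklist can be sketched in code. The following is a simplified, illustrative encoding of the Article 22 logic described above; the class and field names are invented for this example, and the sketch deliberately omits the many contextual judgments (e.g., what counts as a ‘similarly significant’ effect) that real compliance assessments require.

```python
from dataclasses import dataclass

# Illustrative sketch only: a simplified encoding of the Article 22 GDPR
# decision flow. All names below are invented for this example.

@dataclass
class Decision:
    solely_automated: bool             # no meaningful human involvement
    legal_or_similar_effect: bool      # legal or similarly significant effects
    necessary_for_contract: bool       # Article 22.2(a)
    authorised_by_law: bool            # Article 22.2(b)
    explicit_consent: bool             # Article 22.2(c)
    special_category_data: bool        # Article 9 data involved
    substantial_public_interest: bool  # Article 22.4 condition

def may_proceed(d: Decision) -> bool:
    """Roughly: may this automated decision proceed under Article 22?"""
    # Article 22 only covers solely automated decisions that produce
    # legal or similarly significant effects.
    if not (d.solely_automated and d.legal_or_similar_effect):
        return True
    # Otherwise one of the Article 22.2 exceptions must apply.
    if not (d.necessary_for_contract or d.authorised_by_law or d.explicit_consent):
        return False
    # Article 22.4: special categories of data additionally require
    # explicit consent or substantial public interest (plus safeguards).
    if d.special_category_data:
        return d.explicit_consent or d.substantial_public_interest
    return True
```

Even where `may_proceed` returns `True`, the controller must still provide the information and safeguards discussed above, including, for the contract and consent derogations, the right to obtain human intervention.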


[1] L. A. Bygrave, ‘Article 22. Automated individual decision-making, including profiling’, in C. Kuner, L. A. Bygrave & C. Docksey (eds), The EU General Data Protection Regulation (GDPR): A Commentary, Oxford: Oxford University Press, 2020, p. 526.

[2] Article 29 Data Protection Working Party, ‘Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679’, 2018, WP251rev.01, pp. 19-20.

[3] L. A. Bygrave, op. cit., pp. 531-532.

[4] Article 29 Data Protection Working Party, ‘Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679’, op. cit., p. 8.

[5] Article 29 Data Protection Working Party, ‘Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679’, op. cit., p. 21.


[7] Fundamental Rights Agency (ed.), op. cit., p. 234.
