Fairness with respect to data subjects’ rights

Fairness is an essential concept in data protection (see the “Fairness” subsection in the “Lawfulness, fairness and transparency” section of the “Principles”, as well as “Data subjects’ rights”, both within Part II of these Guidelines), one that can hardly be achieved without awareness that the development of AI tools can damage data subjects’ interests, rights and freedoms. It therefore makes sense to ensure that adequate safeguards are implemented not only to avoid unfair consequences, but also to provide data subjects with enforceable rights that ensure adequate protection against unfair processing.

In this section, we explore how the key rights recognized by the GDPR apply to the AI development framework. To this end, we concentrate on the rights that are particularly relevant in this area: (a) the right to information; (b) the right to access; (c) the right to data portability; (d) the right to rectification; (e) the right to erasure; and (f) the right to object.

Before considering these rights, however, researchers should check whether their research qualifies as scientific research under Article 89 of the GDPR. This is extremely important: if it does, EU or Member State law may provide for derogations from the rights referred to in Articles 15, 16, 18 and 21 (access, rectification, restriction and objection – and, indirectly, portability). Such derogations are subject to the conditions and safeguards referred to in Article 89(1), insofar as the rights in question are likely to render impossible, or seriously impair, the achievement of the specific purposes, and the derogations are necessary for the fulfilment of those purposes (see the “Data protection and scientific research” section in the “Main Concepts”, Part II of these Guidelines).

a) Right to information

According to Article 13 of the GDPR, before processing personal data, the controller should provide the data subjects with complete information about the processing and their rights in an understandable format. If “the controller intends to further process the personal data for a purpose other than that for which the personal data were collected, the controller shall provide the data subject prior to that further processing with information on that other purpose and with any relevant further information” (Article 13(3)).

However, controllers are exempt from providing information to the data subjects if: the provision of such information proves impossible or would involve a disproportionate effort, in particular for processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes; or the obligation to provide information is likely to render impossible or seriously impair the achievement of the objectives of that processing. Under such circumstances, controllers should take appropriate measures to protect the data subject’s rights, freedoms and legitimate interests, including making the information publicly available. This derogation, however, is conditional on the adoption of the safeguards imposed by Article 89 (see the “Data protection and scientific research” section in the “Main Concepts”, Part II of these Guidelines).

b) Right to access

Data subjects’ right to access their data must be guaranteed at every step of an AI tool’s life cycle. Controllers are encouraged to implement adequate technical measures to ensure that such access is easily available to the data subject. Indeed, Article 15 of the GDPR gives the data subject the right to obtain details of any personal data used for profiling, including the categories of data used to construct a profile. Furthermore, pursuant to Article 15(3), the controller has a duty to make available the data used as input to create the profile, as well as access to information on the profile and details of which segments the data subject has been placed into. Similarly, Recital 63 of the GDPR states that “[w]here possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to their personal data”. This includes observed, derived and inferred data.[1]

Box 7. The inferred data issue

One of the most urgent issues in the realm of AI is the status of inferred data. These are data that are not provided by the data subjects, but ‘attributed’ to them on the basis of available data, either from the same persons or from others. Sometimes these inferred data provide information about an identifiable person. Regardless of whether that information is accurate, such data must be considered personal data, and the GDPR therefore applies to them. As a result, data subjects’ rights should be strictly respected, including the right of access to such data.[2] However, as discussed elsewhere in these Guidelines, inferred data are not covered by the right to portability (see “Right to portability” within Part II section “Data subject’s rights” of these Guidelines).

One of the main problems embedded in AI and big data processing is that the right to access may, at times, collide with the interest of a company in keeping its commercial secrets. Indeed, Recital 63 of the GDPR provides some protection for controllers who are unwilling to unveil trade secrets or intellectual property, which may be particularly relevant in relation to profiling.[3] However, AI developers cannot rely on the protection of their trade secrets as an excuse to deny access or refuse to provide information to the data subjects. Instead, organizations need to find pragmatic solutions.[4]

The applicability of the right to access may vary with the stage of the AI life cycle. For instance, giving an individual data subject access to training data might be hard, since training data usually include only information relevant to predictions (e.g. past transactions, demographics, location), but not contact details or unique customer identifiers. Moreover, they are often pre-processed to make them more amenable to machine learning algorithms. This does not mean, however, that such data can be considered anonymized: they remain personal data. For instance, in the case of a purchase prediction model, the training data might include a pattern of purchases unique to one customer. In this example, if a customer were to provide a list of their recent purchases as part of their request, the organization may be able to identify the portion of the training data that relates to that individual.

Under such circumstances, controllers must respond to data subjects’ requests for access to their personal data, provided they have taken reasonable measures to verify the identity of the data subject and no other exceptions apply. As the ICO states, “requests for access, rectification or erasure of training data should not be regarded as manifestly unfounded or excessive just because they may be harder to fulfil or the motivation for requesting them may be unclear in comparison to other access requests an organization typically receives”.[5] However, organizations do not have to collect or maintain additional personal data to enable identification of data subjects in training data for the sole purpose of complying with the regulation. If the controllers cannot identify a data subject in the training data, and the data subject cannot provide additional information that would enable their identification, they are not obliged to fulfil a request that is impossible to satisfy.[6]
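The purchase-pattern example above can be sketched in code. The following minimal Python sketch is purely illustrative: the field names, rows and the exact-match rule are assumptions, not taken from the Guidelines. It shows how a controller might try to locate the portion of pre-processed training data matching a purchase pattern supplied with an access request:

```python
# Hypothetical sketch: locating a data subject's records in pre-processed
# training data via a purchase pattern they supply with an access request.
# Field names and rows are illustrative assumptions.

training_rows = [
    {"purchases": ("bread", "milk", "razor"), "region": "north"},
    {"purchases": ("guitar", "amp", "strings"), "region": "south"},
    {"purchases": ("bread", "milk", "razor"), "region": "north"},
]

def rows_matching_pattern(rows, supplied_purchases):
    """Return rows whose purchase pattern matches the supplied one
    (exact match on the sorted item list -- a simplifying assumption)."""
    target = tuple(sorted(supplied_purchases))
    return [r for r in rows if tuple(sorted(r["purchases"])) == target]

matches = rows_matching_pattern(training_rows, ["milk", "bread", "razor"])
# Here two distinct rows share the pattern, so the match alone does not
# uniquely identify the requester.
print(len(matches))  # → 2
```

Note that a shared pattern, as in this toy dataset, is not unique identification: the controller would still need to review the matches before disclosing anything in response to the request.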

Checklist: right to access[7]

Preparing for subject access requests

☐ The controllers know how to recognize a subject access request and they understand when the right of access applies.

☐ The controllers understand that the right of access is to be applied at each stage of the life cycle of the AI solution, if it uses personal data.

☐ The controllers have a policy for how to record requests they receive verbally.

☐ The controllers understand when they can refuse a request and are aware of the information they need to provide to individuals when doing so.

☐ The controllers understand the nature of the supplementary information they need to provide in response to a subject access request.

 

Complying with subject access requests

☐ The controllers have processes in place to ensure that they respond to a subject access request without undue delay and within one month of receipt.

☐ The controllers are aware of the circumstances in which they can extend the time limit to respond to a request.

☐ The controllers understand that there is a particular emphasis on using clear and plain language if they are disclosing information to a child.

☐ The controllers understand what they need to consider if a request includes information about others.

☐ The controllers understand how to apply the right to access in training stages.

Additional information

Article 29 Working Party (2014) Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC. European Commission, Brussels. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf

ICO (2013) Big data, artificial intelligence, machine learning and data protection. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/media/for-organisations/documents/2013559/big-data-ai-ml-and-data-protection.pdf

ICO (no date) Right of access. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-of-access/

Norwegian Data Protection Authority. (2018) Artificial intelligence and privacy. Norwegian Data Protection Authority, Oslo. Available at: https://iapp.org/media/pdf/resource_center/ai-and-privacy.pdf

c) Right to data portability

Article 20 of the GDPR created a new right: the right to data portability.[8] It gives data subjects control over the use of their data by allowing them to redirect it to where it is most useful (see the “Right to data portability” section in the “Data Subject Rights” within Part II of these Guidelines). However, the right to data portability might be hard to implement in the AI arena, for several reasons. One must keep in mind the cost and feasibility of providing extremely large, complex datasets accumulated over many years, which could make it hard for a company to fulfil data portability requests.

A machine learning system can process different types of personal data. According to the Article 29 Data Protection Working Party, the right to data portability covers personal data concerning the data subject which they have provided to a controller. In general, the term ‘provided by the data subject’ must be interpreted broadly: it includes data gathered by observing data subjects’ behaviour (e.g. raw data processed by smart meters, activity logs, or website history). However, it excludes ‘inferred data’ and ‘derived data’, that is, personal data created by a service provider (e.g. algorithmic results). Unlike observed data, inferred data are created by the service itself on the basis of the observed data, not provided by the data subject.[9] Therefore, the right to data portability does not extend to data inferred by a machine learning process.
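The distinction can be illustrated with a short Python sketch. The record, the field names and the way inferred fields are tagged are all assumptions for illustration: a portability export would include provided and observed data in a structured, machine-readable format, while leaving out fields inferred by the service.

```python
import json

# Illustrative record mixing the three categories discussed above.
record = {
    "email": "user@example.com",          # provided directly by the subject
    "activity_log": ["login", "search"],  # observed behaviour
    "churn_risk": 0.87,                   # inferred by the service's model
}

# Assumption: the controller tags which fields are inferred/derived.
INFERRED_FIELDS = {"churn_risk"}

def portability_export(rec):
    """Machine-readable (JSON) export limited to provided/observed data,
    since Article 20 does not cover inferred or derived data."""
    portable = {k: v for k, v in rec.items() if k not in INFERRED_FIELDS}
    return json.dumps(portable, indent=2)

print(portability_export(record))  # JSON without the 'churn_risk' field
```

JSON is used here only as one example of a “structured, commonly used and machine-readable format”; CSV or XML would serve equally well.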

Checklist: data portability[10]

Preparing for requests for data portability

☐ The controllers know how to recognize a request for data portability and understand when the right applies.

☐ The controllers take into account the requirement for data portability from the earliest stages of conception and design of the AI processing.

☐ The controllers have a policy for how to record requests they receive verbally.

☐ The controllers understand when they can refuse a request and are aware of the information they need to provide to individuals if they proceed with such refusal.

 

Complying with requests for data portability

☐ The controllers can transmit personal data in structured, commonly used and machine-readable formats.

☐ The controllers inform users in advance when it is not technically possible to exercise the right to portability by means of a direct transmission protocol.

☐ The controllers use secure methods to transmit personal data.

☐ The controllers have processes to ensure that they respond to a request for data portability without undue delay and within one month of receipt.

☐ The controllers are aware of the circumstances under which they can extend the time limit to respond to a request.

Additional information

Article 29 Working Party (2016) Guidelines on the right to data portability. European Commission, Brussels. Available at: https://ec.europa.eu/information_society/newsroom/image/document/2016-51/wp242_en_40852.pdf

EBF (2017) European Banking Federation’s comments to the Working Party 29 guidelines on the right to data portability. European Banking Federation, Brussels, p.4. Available at: www.ebf.eu/wp-content/uploads/2017/04/EBF_025448E-EBF-Comments-to-the-WP-29-Guidelines_Right-of-data-portabi.._.pdf

ICO (no date) Right to data portability. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-to-data-portability/

Wallace, N. and Castro, D. (2018) The impact of the EU’s new data protection regulation on AI. Center for Data Innovation, Washington, DC / Brussels / London. Available at: www2.datainnovation.org/2018-impact-gdpr-ai.pdf

d) Right to rectification

The right to correct inaccurate data is particularly important in the case of AI, since machine learning algorithms often infer data. Such data might affect the data subject, especially when they are produced in advanced stages of the AI life cycle. Inaccurate data inferred during the training phase are less worrying than inaccuracies at later stages: since the purpose of training data is to capture general patterns in large datasets, individual inaccuracies are less likely to have any direct effect on a data subject.[11] For example, if personal data used to provide information to customers are incorrect, such as an erroneous phone number in a dataset, the data subject might suffer more serious harm than if an inferred phone number is used to train a model. However, this most certainly does not mean that the right to rectification does not apply at this stage.

Some types of algorithm, such as support vector machines (SVMs), retain some key examples from the training data in order to help distinguish between new examples during deployment. If a data subject requests rectification or erasure of any of these data, the request cannot be satisfied without retraining the model with the rectified data or deleting the model altogether.[12] This does not, however, render the right to rectification inapplicable.
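To illustrate why such models are hard to amend in place, here is a deliberately simple, pure-Python stand-in (not an actual SVM; the toy classifier and its data are assumptions) for a model that stores verbatim training rows, much as an SVM keeps its support vectors:

```python
# Toy memory-based classifier that, like an SVM's support vectors, keeps
# verbatim training rows inside the model. Data and labels are made up.

class NearestExampleModel:
    def __init__(self, rows):
        # rows: list of (feature, label) pairs, retained as-is
        self.rows = list(rows)

    def predict(self, x):
        # label of the closest retained example
        return min(self.rows, key=lambda r: abs(r[0] - x))[1]

training = [(0.0, "no"), (1.0, "no"), (2.0, "yes"), (3.0, "yes")]

model = NearestExampleModel(training)
print(model.predict(1.4))  # → "no": the retained row (1.0, "no") decides

# A rectification/erasure request for (1.0, "no") cannot be honoured by
# editing the model in place: rebuild it from the amended training set.
amended = [r for r in training if r != (1.0, "no")]
model = NearestExampleModel(amended)
print(model.predict(1.4))  # → "yes": the prediction itself has changed
```

Retraining after removal is exactly the cost described above; in a real SVM the retained rows are the support vectors, and removing one of them likewise forces a retrain.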

It is particularly important to keep in mind that if the controller finds that, contrary to the views of the data subject, the data are not inaccurate with regard to the purposes of the processing, the controller does not have to rectify them.[13] However, the burden of proof rests on the controllers’ shoulders: they must provide a good reason to deny rectification, and the damage that rectification could cause to the AI system is unlikely to serve as a convincing one. The EDPS has criticized systems that do not allow a set of individual personal data to be rectified without considerable harm to the whole system.[14] In any case, if controllers opt to deny the data subject’s request, they must reply with a justified reason for not rectifying the data, and the data subject can then refer the matter to the supervisory authority.[15]

Checklist: right to rectification[16]

Preparing for requests for rectification

☐ The controllers know how to recognize a request for rectification and understand when this right applies.

☐ The controllers have a policy for how to record requests they receive verbally.

☐ The controllers understand when they can refuse a request, and are aware of the information they need to provide to individuals when doing so.

 

Complying with requests for rectification

☐ The controllers are prepared to address the right of rectification of data subjects’ data, especially those generated by the inferences and profiles made by the AI solution.

☐ The controllers have processes in place to ensure that they respond to a request for rectification without undue delay and within one month of receipt.

☐ The controllers are aware of the circumstances when they can extend the time limit to respond to a request.

☐ The controllers have appropriate systems to rectify or complete information, or provide a supplementary statement.

☐ The controllers have procedures in place to inform any recipients if they rectify any data they have shared with them.

Additional information

Binns, R. (2019) Enabling access, erasure, and rectification rights in AI systems. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/

ICO (no date) Right to rectification. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-to-rectification/

e) Right to erasure

Data subjects have a permanent right to ask the controller to delete their personal data. In some cases, however, this might be extremely complicated.[17] Indeed, one must keep in mind that it may sometimes be impossible to fulfil the legal aims of the right to erasure – also known as the right to be forgotten – in AI environments, since the opacity of the processing might hide some personal data even from the processor (see “Understanding transparency and opacity” within this Part III on AI).

However, the main problem with the right to erasure is that it might ruin a whole AI system trained on the data that a subject is asking to erase. Put simply, trained models may effectively retain the data used for their training; if these data are erased, the algorithms could become less accurate or even break down entirely. Controllers should therefore keep in mind that amending a database that is seriously affected by data erasure might be impossible.

Controllers may consider this unacceptable, but the fact is that the GDPR does not include any exception to the right to erasure on the basis of the damage caused to a database containing personal data. Some authors, such as Humerick, have suggested that “rather than requiring a complete erasure of personal data, controllers and processors should be able to retain information up to the point of erasure. In this way, the AI’s machine learning would remain at the point where it progressed, rather than creating forced amnesia.” In his view, this could protect the interests of the data subjects without causing the AI to data-regress. However, it is not easy to be sure that this solution complies with the requirements of the GDPR.

Any recommendation should focus on the first steps of the product’s life cycle. Technically, it is hard to find secure solutions to the dilemmas posed by the right to erasure once a database has been created. Controllers should therefore draw a simple conclusion: the best way to avoid catastrophic damage is to prepare for a possible loss of data from the very beginning.
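One way to prepare from the very beginning, sketched below under the assumption that the controller can key every training record to a subject identifier (all identifiers, fields and values are hypothetical), is to store training data so that an erasure request maps cleanly onto the rows that must go before the model is retrained:

```python
# Hedged sketch: training data keyed by subject ID from the start, so an
# erasure request translates directly into deletable rows. Identifiers,
# fields and values are illustrative.

training_store = {
    "subject-001": [{"feature": 0.2, "label": 1}],
    "subject-002": [{"feature": 0.9, "label": 0}],
    "subject-003": [{"feature": 0.5, "label": 1}],
}

def erase_subject(store, subject_id):
    """Drop every record of one subject; return the rows that remain,
    which form the dataset for retraining the model."""
    store.pop(subject_id, None)  # no error if the subject is unknown
    return [row for rows in store.values() for row in rows]

remaining = erase_subject(training_store, "subject-001")
print(len(remaining))  # → 2 rows left for retraining
```

The design choice is that deletion is decided at the level of the data store, not the trained model: the model is then rebuilt from what remains, rather than patched in place.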

Finally, the controller must always keep in mind the restrictions to the right to erasure introduced by Article 17(3) of the GDPR. Moreover, national authorities might pose additional restrictions that must be considered.

Checklist: right to erasure[18]

Preparing for requests for erasure

☐ The controllers know how to recognize a request for erasure and they understand when the right applies.

☐ The controllers have a policy for how to record requests they receive verbally.

☐ The controllers understand when they can refuse a request and are aware of the information they need to provide to individuals when doing so.

 

Complying with requests for erasure

☐ The controllers have processes in place to ensure that they respond to a request for erasure without undue delay and within one month of receipt.

☐ The controllers are aware of the circumstances under which they can extend the time limit to respond to a request.

☐ The controllers understand that there is a particular emphasis on the right to erasure if the request relates to data collected from children.

☐ The controllers have procedures to inform any recipients if they erase any data they shared with them.

☐ The controllers have appropriate methods to erase information.

Additional information

An interview with Tiffany Li on the right to erasure and AI can be found here: www.youtube.com/watch?v=Sozg6yJJkHk

Binns, R. (2019) Enabling access, erasure, and rectification rights in AI systems. ICO blog, 15 October. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/

Fosch-Villaronga, E., Kieseberg, P. and Li, T. (2018) ‘Humans forget, machines remember: artificial intelligence and the right to be forgotten’, Computer Law & Security Review 34(2): 304-313.

Humerick, M. (2018) Taking AI personally: how the E.U. must learn to balance the interests of personal data privacy & artificial intelligence, 34 Santa Clara High Tech. L.J.393. Available at: https://digitalcommons.law.scu.edu/chtlj/vol34/iss4/3

ICO (no date) Right to erasure. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-to-erasure/

Wallace, N. and Castro, D. (2018) The impact of the EU’s new data protection regulation on AI. Center for Data Innovation, Washington, DC / Brussels / London. Available at: www2.datainnovation.org/2018-impact-gdpr-ai.pdf

f) Right to object

Data subjects have the right to object to the processing of their personal data when the controller processes them on the basis of a legitimate interest or for a task carried out in the public interest. This does not apply where the legal ground for processing is informed consent, since in those cases data subjects can simply withdraw their consent, after which the controller can no longer process their data. Once data subjects make their request, controllers must cease processing the data unless they can prove they have compelling and justifiable grounds for continuing to do so, and that these grounds outweigh the data subjects’ interests, rights and freedoms.[19]

Once the controllers receive an objection to the processing of personal data, and provided that no grounds to refuse apply, they must stop processing the data immediately. This may mean that they have to erase stored personal data, as the broad definition of processing under the GDPR includes storing data.

Checklist: right to object

Preparing for objections to processing

☐ The controllers know how to recognize an objection and they understand when the right applies.

☐ The controllers have a policy for how to record objections they receive verbally.

☐ The controllers understand when they can refuse an objection and are aware of the information they need to provide to individuals when doing so.

☐ The controllers have clear information in their privacy notice about individuals’ right to object, which is presented separately from other information on their rights.

☐ The controllers understand when they need to inform individuals of their right to object, in addition to including it in their privacy notice.

 

Complying with requests which object to processing

☐ The controllers have processes in place to ensure that they respond to an objection without undue delay and within one month of receipt.

☐ The controllers are aware of the circumstances when they can extend the time limit to respond to an objection.

☐ The controllers have appropriate methods in place to erase, suppress or otherwise cease processing personal data.

Additional information

EDPS (2020) A preliminary opinion on data protection and scientific research. European Data Protection Supervisor, Brussels. Available at: https://edps.europa.eu/sites/edp/files/publication/20-01-06_opinion_research_en.pdf

ICO (no date) The right to object to use your data. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/your-data-matters/the-right-to-object-to-the-use-of-your-data/

Norwegian Data Protection Authority (2018) Artificial intelligence and privacy. Norwegian Data Protection Authority, Oslo. Available at: https://iapp.org/media/pdf/resource_center/ai-and-privacy.pdf

 

References


1. ICO (2014) Big data and data protection. Information Commissioner’s Office, Wilmslow, pp.99-10. Available at: https://rm.coe.int/big-data-and-data-protection-ico-information-commissioner-s-office/1680591220 (accessed 28 May 2020).

2. See: Custers, B. (2018) ‘Profiling as inferred data. Amplifier effects and positive feedback loops’, pp.112-115 in Bayamlıoğlu, E. et al. (eds) Being profiled: cogitas ergo sum. 10 years of profiling the European Citizen. Amsterdam University Press, Amsterdam. DOI 10.5117/9789463722124/CH19. Available at: https://ssrn.com/abstract=3466857 or http://dx.doi.org/10.2139/ssrn.3466857 (accessed 28 May 2020).

3. A29WP (2016) Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679. European Commission, Brussels, p.17. Available at: https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053 (accessed 28 May 2020).

4. Norwegian Data Protection Authority (2018) Artificial intelligence and privacy. Norwegian Data Protection Authority, Oslo, p.19. Available at: https://iapp.org/media/pdf/resource_center/ai-and-privacy.pdf (accessed 28 May 2020).

5. ICO (2019) Enabling access, erasure, and rectification rights in AI systems. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/ (accessed 28 May 2020).

6. Ibid.

7. ICO (no date) Right of access. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-of-access/ (accessed 28 May 2020).

8. Article 29 Working Party (2015) Guidelines on the right to data portability. European Commission, Brussels. Available at: http://ec.europa.eu/newsroom/document.cfm?doc_id=45685 (accessed 28 May 2020).

9. Article 29 Working Party (2015) Guidelines on the right to data portability. European Commission, Brussels, p.8. Available at: http://ec.europa.eu/newsroom/document.cfm?doc_id=45685 (accessed 28 May 2020).

10. ICO (no date) Right to data portability. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-to-data-portability/ (accessed 28 May 2020).

11. Binns, R. (2019) Enabling access, erasure, and rectification rights in AI systems. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/ (accessed 15 May 2020).

12. Ibid.

13. AEPD (2020) Adecuación al RGPD de tratamientos que incorporan Inteligencia Artificial. Una introducción. Agencia Española de Protección de Datos, Madrid, p.27. Available at: www.aepd.es/sites/default/files/2020-02/adecuacion-rgpd-ia.pdf (accessed 28 May 2020).

14. EDPS (2014) Guidelines on the rights of individuals with regard to the processing of personal data. European Data Protection Supervisor, Brussels, p.18. Available at: https://edps.europa.eu/sites/edp/files/publication/14-02-25_gl_ds_rights_en.pdf (accessed 10 May 2020).

15. Office of the Data Protection Ombudsman (no date) Right to rectification. Office of the Data Protection Ombudsman, Helsinki. Available at: https://tietosuoja.fi/en/right-to-rectification (accessed 28 May 2020).

16. ICO (no date) Right to rectification. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-to-rectification/ (accessed 28 May 2020).

17. Fosch-Villaronga, E., Kieseberg, P. and Li, T. (2018) ‘Humans forget, machines remember: artificial intelligence and the right to be forgotten’, Computer Law & Security Review 34(2): 304-313.

18. ICO (no date) Right to erasure. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-to-erasure/ (accessed 28 May 2020).

19. Norwegian Data Protection Authority (2018) Artificial intelligence and privacy. Norwegian Data Protection Authority, Oslo, p.29. Available at: https://iapp.org/media/pdf/resource_center/ai-and-privacy.pdf (accessed 28 May 2020).
