Exercising data subjects' rights

Quite obviously, controllers must facilitate all data subjects' rights throughout the whole life cycle. However, at this specific stage, the rights of access, rectification and erasure are particularly sensitive and have certain characteristics of which controllers need to be aware.

a) Right of access (See the “Right of access” section in the Main Rights part of these Guidelines)

In general, training data can scarcely be linked to an individual data subject, since they usually only include information relevant to predictions, such as past transactions, demographics, or location, but not contact details or unique customer identifiers. Moreover, they are often pre-processed to make them more amenable to the algorithms. However, this does not mean that these data can be considered anonymized (see the “Anonymization” section in the Concepts part of these Guidelines): at most they are pseudonymized, and thus they continue to be personal data. For instance, in the case of a purchase prediction model, the training data might include a pattern of purchases unique to one customer. In this example, if a customer were to provide a list of their recent purchases as part of their request, the organization may be able to identify the portion of the training data that relates to that individual.

Under such circumstances, AI developers should respond to data subjects’ requests to gain access to their personal data, assuming they have taken reasonable measures to verify the identity of the data subject, and no other exceptions apply. And, as the ICO states, “requests for access, rectification or erasure of training data should not be regarded as manifestly unfounded or excessive just because they may be harder to fulfil or the motivation for requesting them may be unclear in comparison to other access requests an organization typically receives.” [1]

However, it is clear that organizations do not have to collect or maintain additional personal data to enable identification of data subjects in training data for the sole purpose of complying with the Regulation. If the AI developers cannot identify a data subject in the training data and the data subject cannot provide additional information that would enable their identification, AI developers are not obliged to fulfil a request that is impossible to satisfy.
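To make the access scenario above concrete, the sketch below shows one way an organization might match a purchase list supplied by a requester against pseudonymized training records. It is only an illustration: the pandas-based layout, the column names ("record_id", "item_id", "date") and the matching threshold are assumptions rather than anything prescribed by these Guidelines, and any match would still require identity verification and human review.

```python
import pandas as pd

def find_candidate_records(training_df: pd.DataFrame,
                           provided_purchases: list[dict],
                           min_matches: int = 5) -> pd.DataFrame:
    """Return training rows whose purchases overlap strongly with the
    purchase history supplied by the data subject."""
    provided = {(p["item_id"], p["date"]) for p in provided_purchases}

    # Flag each training row whose (item, date) pair appears in the list
    # supplied by the requester, then count matches per pseudonymous record.
    is_match = training_df.apply(
        lambda row: (row["item_id"], row["date"]) in provided, axis=1)
    matches_per_record = is_match.groupby(training_df["record_id"]).sum()

    # Only records with enough overlapping purchases are treated as plausibly
    # relating to the requester; borderline cases call for human review.
    candidate_ids = matches_per_record[matches_per_record >= min_matches].index
    return training_df[training_df["record_id"].isin(candidate_ids)]
```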

b) Right to rectification (See the “Right to rectification” section in the Main Rights part of these Guidelines)

In the case of the right to rectification, the controller must guarantee the rectification of the data, especially of those generated through the inferences and profiles drawn up during AI development.

Even though the purpose of training data is to train models on general patterns in large datasets, so that individual inaccuracies are less likely to have any direct effect on a data subject, the right to rectification cannot be limited. At most, the controller may extend the period to respond by two further months where the technical procedure is particularly complex (Article 12(3)).

Box 19: Rectification

As an example: it may be more important to rectify an incorrectly recorded customer delivery address than to rectify the same incorrect address in training data. This is because the former could result in a failed delivery but the latter would barely affect the overall accuracy of the model.[2]
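A minimal sketch of how the prioritisation in Box 19 might be applied in practice is shown below. The record structures and the idea of a retained training snapshot are hypothetical assumptions for this example; the point is simply that the operational record is corrected with priority while the same rectification is still propagated to the training data.

```python
def rectify_address(customer_id: str, corrected_address: str,
                    customers: dict, training_snapshot: list[dict]) -> None:
    """Propagate a rectification to both operational and training data."""
    # Operational data first: an incorrect delivery address has an immediate
    # effect on the data subject (e.g. a failed delivery).
    customers[customer_id]["delivery_address"] = corrected_address

    # Training data as well: the right to rectification is not limited just
    # because a single wrong address barely affects the model's overall accuracy.
    for row in training_snapshot:
        if row.get("customer_id") == customer_id:
            row["delivery_address"] = corrected_address
```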

c) Right to erasure (See the “Right to erasure” section in the Main Rights part of these Guidelines)

Data subjects hold a right to request the deletion of their personal data. However, this right might be limited if certain concrete circumstances apply. According to the ICO, “organisations may also receive requests for erasure of training data. Organisations must respond to requests for erasure when data subjects provide appropriate grounds, unless a relevant legal exemption applies. For example, if the training data is no longer needed because the ML model has already been trained, the organisation must fulfil the request. However, in some cases, where the development of the system is ongoing, it may still be necessary to retain training data for the purposes of re-training, refining and evaluating an AI tool. In this case, the organisation should take a case-by-case approach to determining whether it can fulfil requests. Complying with a request to delete training data would not entail erasing any ML models based on such data, unless the models themselves contain that data or can be used to infer it.”[3]
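As a rough illustration of the last point, the sketch below removes a requester's rows from a retained training dataset before any future re-training run, while the already-trained model is kept unless it contains or can be used to infer the erased data. The "record_id" column and the pandas layout are assumptions carried over from the access example above, not a prescribed implementation.

```python
import pandas as pd

def erase_training_records(training_df: pd.DataFrame,
                           record_ids_to_erase: set) -> pd.DataFrame:
    """Drop all rows linked to the data subject from the retained dataset."""
    cleaned = training_df[~training_df["record_id"].isin(record_ids_to_erase)]
    # The cleaned dataset is what feeds any future re-training, refinement or
    # evaluation; the existing model is only erased if it memorizes or can be
    # used to infer the deleted data.
    return cleaned
```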
 

References


[1] ICO (2019) Enabling access, erasure, and rectification rights in AI systems. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/ (accessed 15 May 2020).

[2] ICO (2019) Enabling access, erasure, and rectification rights in AI systems. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/ (accessed 15 May 2020).

[3] ICO (2019) Enabling access, erasure, and rectification rights in AI systems. Information Commissioner’s Office, Wilmslow. Available at: https://ico.org.uk/about-the-ico/news-and-events/ai-blog-enabling-access-erasure-and-rectification-rights-in-ai-systems/ (accessed 15 May 2020).
