GDPR provisions

Accountability

According to Article 5(2) of the GDPR, the controller shall be responsible for, and must be able to demonstrate, compliance with all principles of the GDPR mentioned at Article 5(1). This includes the principle of accountability (see “Accountability principle” within Part II section “Principles” of these Guidelines).

The accountability principle in the GDPR is risk-based: the higher the risk that data processing poses to the fundamental rights and freedoms of data subjects, the greater the measures needed to mitigate those risks.[1] The accountability principle rests on several compliance duties for data controllers, including: transparency duties (Articles 12-14); guaranteeing the exercise of data-protection rights (Articles 15-22); keeping records of data-processing operations (Article 30); notifying personal data breaches to a national supervisory authority (Article 33) and to the data subjects (Article 34); and, in cases of higher risk, appointing a DPO (Article 37) and carrying out a DPIA (Article 35).

Since the processing of personal data in AI systems may often be considered high risk,[2] the developer of an AI system will often need to appoint a DPO and perform a DPIA. The next two sections address these two specific accountability duties.

Risk assessment and DPIAs

A DPIA is a process in which the data controller, before starting a data-processing procedure with high risk to the fundamental rights and freedoms of data subjects, assesses the impact of the envisaged processing operations on the protection of personal data (Article 35(1)).

Determining whether the data processing is of high risk is not an easy task, however. Article 35(3) lists three cases: (1) a systematic and extensive evaluation of personal aspects relating to natural persons, which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person; (2) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; and (3) systematic monitoring of a publicly accessible area on a large scale.
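The three Article 35(3) triggers can be summarized in a minimal sketch. This is an illustration only, not a legal test: the function name, its boolean inputs, and the reduction of each trigger to a single flag are assumptions for the example, and a real assessment requires case-by-case legal analysis.

```python
# Illustrative only: the three (non-exhaustive) DPIA triggers of Article 35(3)
# GDPR, reduced to boolean flags for the sake of the example.

def dpia_required(
    systematic_evaluation_with_legal_effects: bool,  # Art. 35(3)(a)
    large_scale_special_categories: bool,            # Art. 35(3)(b)
    large_scale_public_monitoring: bool,             # Art. 35(3)(c)
) -> bool:
    """Return True if any listed trigger applies; one trigger suffices."""
    return (
        systematic_evaluation_with_legal_effects
        or large_scale_special_categories
        or large_scale_public_monitoring
    )
```

Note that the Article 35(3) list is explicitly non-exhaustive, so a `False` result from such a check would not mean a DPIA is unnecessary; Member State lists and supervisory-authority guidance must also be consulted.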

With regard to innovative technologies, the Article 29 Working Party clarified some examples, such as "combining use of fingerprint and face recognition for improved physical access control" and certain "Internet of Things" applications. These data-processing operations are considered high risk "because the use of such technology can involve novel forms of data collection and usage, possibly with a high risk to individuals' rights and freedoms. Indeed, the personal and social consequences of the deployment of a new technology may be unknown."[3]

If the processing is high risk, then a DPIA should be conducted following Article 35(7) of the GDPR. Recital 90 of the GDPR further clarifies that the assessment should be made using two parameters: the likelihood and severity of the risk, taking into account the nature, scope, context and purposes of the processing and the sources of risk. Several national supervisory authorities have issued guidance on how to assess these risks, such as the Agencia Española de Protección de Datos Personales, the Information Commissioner's Office, the Irish Data Protection Commission and the Commission Nationale de l'Informatique et des Libertés, among others (see "DPIA" within Part II section "Main Tools and Actions" of these Guidelines).
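The likelihood/severity framing of Recital 90 is often operationalized as a risk matrix. The sketch below is a hedged illustration: the three-level scales, the additive scoring and the thresholds are assumptions for the example, not a methodology endorsed by the GDPR or any supervisory authority, each of which publishes its own approach.

```python
# Illustrative risk matrix per the likelihood/severity framing of Recital 90.
# Scales and thresholds are assumptions chosen for this example only.

LEVELS = ("low", "medium", "high")

def risk_level(likelihood: str, severity: str) -> str:
    """Combine likelihood and severity into an overall level (illustrative)."""
    score = LEVELS.index(likelihood) + LEVELS.index(severity)
    if score >= 3:
        return "high"
    if score >= 2:
        return "medium"
    return "low"
```

Under such a scheme, a "high" outcome would point towards the prior-consultation duty of Article 36 discussed below.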

In certain situations, if the result of the DPIA is that the intended processing activity poses a high risk of harm to the fundamental rights and freedoms of data subjects, the controller should request the opinion of the national supervisory authority, as prescribed by Article 36 of the GDPR. Some Member States have issued lists containing examples of data-processing activities that would trigger this mandatory consultation; among those examples, we can identify situations that match AI techniques and, in some cases, go as far as expressly including AI. Supervisory authorities can require the adoption of certain measures to mitigate the risk, where possible, or forbid the use of AI where mitigation is not possible.

Checklist: is a DPIA necessary?

 The controller determined the jurisdictions where data-processing activities will take place.

 The controller checked if those jurisdictions have enacted lists indicating the processing operations that require a DPIA, and verified whether the intended data-processing activities that involve AI are covered by those provisions.

 If unsure of the necessity of carrying out a DPIA, the controllers consulted the DPO or, in their absence, the controller's legal department.

 If necessary, the controller carried out a DPIA.

 If necessary, the controller filed a prior consultation with the appropriate supervisory authority.

 If changes were suggested, the controller followed the advice of the supervisory authority.

Processor due diligence

The accountability principle (see "Accountability principle" within Part II section "Principles" of these Guidelines) is also present when a controller chooses to engage the services of a processor. In this regard, Article 28(1) of the GDPR[4] requires controllers to perform certain due diligence actions prior to providing processors with access to the personal data for the performance of data-processing activities. As with other provisions of the GDPR, it is not stated which specific actions a controller should carry out when evaluating processors. The only criterion provided by the GDPR is that controllers should judge processors on the basis of their ability to demonstrate that they can carry out processing activities in compliance with the GDPR.

Therefore, a researcher conducting AI development that needs to hire a third party for certain processing activities would need to ask two questions: (1) what type of conduct is expected to demonstrate compliance with this obligation; and (2), if some form of positive action is expected, how should controllers proceed to carry such due diligence?

For the first question, the GDPR indicates that controllers intending to remain compliant may only retain a processor that is able to demonstrate its compliance with the GDPR. Therefore, controllers need to request information to make this assessment. In other words, the GDPR expects controllers to actively question their potential processors; it is not sufficient to rely on a representations-and-warranties clause in the data-processing agreement (see "Integrity and confidentiality principle" within Part II section "Principles" of these Guidelines).

As for how controllers should carry out this due diligence, again the GDPR does not specify concrete issues to analyze. Nevertheless, certain national supervisory authorities have proposed topics to consider, such as whether the processor follows industry standards, whether it can provide both legal and technical information about how it processes personal data, whether it adheres to a code of conduct, or whether it has completed a certification scheme.[5]

Besides these general considerations, and depending on how the processing entrusted to this third party fits within the framework of the AI under development, further questions should be asked. In this regard, any question that the controllers would ask themselves when developing the AI should also be put to the processor. We defer to the issues posed in the Checklist for further guidance.

Checklist: processor due diligence

 The controllers required information regarding where the data-processing activities will take place, and: (1) carried out the case-law review suggested below; and (2) assessed whether the jurisdictions, in the case of non-EU countries, are deemed adequate by the EU Commission.

 The controllers reviewed case law from the national supervisory authorities where the processor operates to check for potential sanctions.

 The controllers required proof of adherence to a code of conduct or certification.

 The controllers required proof of relevant ISO certification.

 The controllers required a copy of records of processing activities.

 The controllers enquired about the development process of the AI, in particular what kinds of data were used for training the AI and what data the AI needs to operate and deliver a useful result.
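The evidence gathered under the checklist above also serves the controller's Article 5(2) duty to demonstrate compliance, so it is worth recording in a structured form. The sketch below is illustrative only: the class and field names are assumptions invented for the example, not terms from the GDPR or any supervisory-authority guidance.

```python
# Illustrative record of processor due diligence evidence. Field names are
# assumptions for this example; they mirror the checklist items above.

from dataclasses import dataclass, field

@dataclass
class ProcessorDueDiligence:
    processor: str
    jurisdictions: list[str] = field(default_factory=list)
    adequacy_checked: bool = False           # EU Commission adequacy (non-EU)
    sanctions_reviewed: bool = False         # supervisory-authority case law
    conduct_code_or_certification: bool = False
    iso_certification: bool = False
    processing_records_obtained: bool = False
    ai_training_data_documented: bool = False

    def complete(self) -> bool:
        """True once every evidence item has been collected (illustrative)."""
        return all((
            self.adequacy_checked,
            self.sanctions_reviewed,
            self.conduct_code_or_certification,
            self.iso_certification,
            self.processing_records_obtained,
            self.ai_training_data_documented,
        ))
```

Keeping such a record per processor gives the controller a documented trail to show a supervisory authority on request.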

DPOs

DPOs play a crucial role when designing and implementing data-processing activities in a GDPR-compliant manner. They are another safeguard that the GDPR mandates in certain cases and, in general, it is recommended to appoint one. The Article 29 Working Party considers that this "is a cornerstone of accountability and that appointing a DPO can facilitate compliance".[6]

Article 37(1) of the GDPR[7] outlines when controllers and processors must appoint a DPO. In the case of AI development, and as explained previously, the appointment of a DPO is (almost) certainly necessary, as many AI systems process personal data in ways that, in most situations, fall under the conditions described in Article 37(1)(b) and (c). This opinion is shared by, for example, the Spanish supervisory authority.[8] However, neither the Article 29 Working Party nor the EDPB has specifically stated that a DPO is mandatory when a controller or processor engages in data-processing activities that involve AI. Nevertheless, the Article 29 Working Party has pointed out that profiling activities can trigger the mandatory appointment of a DPO,[9] and, as pointed out above, such profiling activities frequently involve AI.

It would be useful if each Member State's regulations on the need for DPOs expanded the list of activities that demand the appointment of a DPO or, at least, provided clear examples that could help to interpret which data-processing activities carried out by controllers and processors demand such an appointment.

If a DPO has to be appointed, for any of the reasons mentioned above, their participation in the DPIA is necessary (as required by Article 39(1)(c)), as is their involvement in any other issue related to data protection within the entity (as prescribed by Article 39(1)(a)). This may include reviewing a potential processor, as described in the previous section. Therefore, the researchers involved in the development of the AI should consult the DPO on the data-protection issues that might arise during development. For example, in connection with AI systems, the DPO also plays a relevant role in collaborating on the drafting of an appropriate notice, as required by Articles 13 and 14 as applicable, to properly communicate to data subjects how the AI operates and what consequences it might have for them.

Checklist: DPOs

 The controllers checked if the institution has already appointed a DPO.

 If not, they checked with the legal department whether the intended data-processing activities trigger the appointment of a DPO, by looking at European authoritative interpretations, local regulations, local authoritative interpretations, case law (both local and European) and, finally, academic interpretations.

 The controllers required the appointment of a DPO if necessary, and their involvement in the AI development process as appropriate.

 As a general rule, the DPO should be aware of every step taken to allow room for their intervention if deemed relevant.

Additional information

Agencia Española de Protección de Datos Personales (2020) Adecuación al RGPD de tratamientos que incorporan Inteligencia Artificial. Una introducción, p. 35. Agencia Española de Protección de Datos Personales, Madrid. Available at: www.aepd.es/sites/default/files/2020-02/adecuacion-rgpd-ia.pdf

Article 29 Working Party (2010) Opinion 3/2010 on the principle of accountability. European Commission, Brussels. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp173_en.pdf

Article 29 Working Party (2017) Guidelines on the Data Protection Impact Assessment (DPIA), pp. 9-10. European Commission, Brussels. Available at: https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611236

 

References


1. See Articles 24, 25 and 32 of the GDPR, which require controllers to take into account the "risks of varying likelihood and severity for the rights and freedoms of natural persons" when adopting specific data-protection measures.

2. See, in particular, Article 35(3)(a), according to which data processing is considered high risk in cases of, inter alia, "a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person".

3. Article 29 Working Party (2017) Guidelines on the Data Protection Impact Assessment (DPIA), WP248, p. 10. European Commission, Brussels. Available at: https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611236 (accessed 20 May 2020).

4. Article 28(1) of the GDPR: "Where processing is to be carried out on behalf of a controller, the controller shall use only processors providing sufficient guarantees to implement appropriate technical and organizational measures in such a manner that processing will meet the requirements of this Regulation and ensure the protection of the rights of the data subject."

5. ICO (no date) Guide to the General Data Protection Regulation (GDPR), What responsibilities and liabilities do controllers have when using a processor? Information Commissioner's Office, Wilmslow. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/contracts-and-liabilities-between-controllers-and-processors-multi/responsibilities-and-liabilities-for-controllers-using-a-processor/ (accessed 20 May 2020).

6. Article 29 Working Party (2017) Guidelines on Data Protection Officers ('DPOs'), p. 4. European Commission, Brussels.

7. Article 37(1) of the GDPR: "The controller and the processor shall designate a data protection officer in any case where: (a) the processing is carried out by a public authority or body, except for courts acting in their judicial capacity; (b) the core activities of the controller or the processor consist of processing operations which, by virtue of their nature, their scope and/or their purposes, require regular and systematic monitoring of data subjects on a large scale; or (c) the core activities of the controller or the processor consist of processing on a large scale of special categories of data pursuant to Article 9 and personal data relating to criminal convictions and offences referred to in Article 10."

8. Agencia Española de Protección de Datos Personales (2020) Adecuación al RGPD de tratamientos que incorporan Inteligencia Artificial. Una introducción, p. 35. Agencia Española de Protección de Datos Personales, Madrid. Available at: www.aepd.es/sites/default/files/2020-02/adecuacion-rgpd-ia.pdf (accessed 20 May 2020).

9. Article 29 Working Party (2017) Guidelines on Data Protection Officers ('DPOs'), p. 4. European Commission, Brussels.

 
