Accountability and oversight

The accountability principle in the GDPR is risk-based: the higher the risk that data processing poses to the fundamental rights and freedoms of data subjects, the greater the measures needed to mitigate that risk.[1] (See the section “Accountability Principle” within “Principles” in Part II of these Guidelines.) The accountability principle underpins all compliance duties of data controllers, including: transparency duties (Articles 12-14); guaranteeing the exercise of data protection rights (Articles 15-22); keeping records of data-processing operations (Article 30); notifying personal data breaches to the competent supervisory authority (Article 33) and to the data subjects (Article 34); and, in cases of higher risk, appointing a DPO (Article 37) and carrying out a DPIA (Article 35).
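By way of illustration only (the GDPR prescribes the content of an Article 30 record, not its format, and every field name below is an assumption), a record of processing activities for an IoT deployment could be kept in a structure as simple as the following Python sketch:

    from dataclasses import dataclass

    @dataclass
    class ProcessingRecord:
        """Illustrative Article 30 record of one processing activity.

        The GDPR mandates the content of the record, not its form;
        all field names here are assumptions.
        """
        controller: str                     # name and contact details of the controller
        purposes: list[str]                 # purposes of the processing
        data_categories: list[str]          # categories of personal data
        data_subject_categories: list[str]  # categories of data subjects
        recipients: list[str]               # recipients to whom data are disclosed
        retention_period: str               # envisaged time limits for erasure
        security_measures: list[str]        # general description of safeguards (Article 32)

    record = ProcessingRecord(
        controller="ExampleIoT Ltd, dpo@example.com",  # hypothetical controller
        purposes=["remote health monitoring"],
        data_categories=["heart rate", "device identifiers"],
        data_subject_categories=["device users"],
        recipients=["cloud analytics provider"],
        retention_period="12 months after contract termination",
        security_measures=["TLS in transit", "encryption at rest", "access logging"],
    )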

Since the processing of personal data in IoT systems may often qualify as high risk,[2] the developers of IoT systems will often need to appoint a DPO and perform a DPIA. Controllers should also create a Data Protection Policy that allows for the traceability of information, as sketched below. Finally, if approved codes of conduct exist, these could also be adhered to (see the “Economy of scale for compliance and its demonstration” subsection in the “Accountability” section of the “Principles” in Part II of these Guidelines).
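As a minimal sketch of what traceability can mean in practice (the function and field names below are assumptions, not a prescribed design), an append-only audit log can record who processed which category of data, when, for what purpose and on which legal basis:

    import json
    import time

    def log_processing_event(log_path, actor, data_category, purpose, legal_basis):
        """Append one processing event to an audit log (illustrative only)."""
        event = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "actor": actor,                  # system component or staff member acting
            "data_category": data_category,  # e.g. "heart rate"
            "purpose": purpose,              # purpose the processing serves
            "legal_basis": legal_basis,      # e.g. consent, contract, legal obligation
        }
        with open(log_path, "a", encoding="utf-8") as f:
            f.write(json.dumps(event) + "\n")

    log_processing_event(
        "audit.log",
        actor="analytics-service",
        data_category="heart rate",
        purpose="remote health monitoring",
        legal_basis="consent",
    )

Such a record can help the controller demonstrate compliance with the accountability principle of Article 5(2) and answer data subjects' questions about what was done with their data.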

Box 8: The difficulty of accountability in IoT development

Accountability is an essential requirement given the risks inherent in IoT, such as the “opaque nature of distributed data flows; inadequate consent mechanisms, and lack of interfaces enabling end-user control over the behaviors of Internet-enabled devices”[3].

Another particularly complex issue is that the IoT builds on many tools and technologies that carry their own data protection risks, notably AI, machine learning, big data and cloud computing, “with personal data collected by IoT devices typically being distributed to the cloud for processing and analytics”[4].
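One common mitigation for such cloud-bound data flows, sketched below on the assumption that a keyed hash of the device identifier is an acceptable pseudonym, is to pseudonymize identifiers on the device before readings are distributed to the cloud. Note that this is pseudonymization in the sense of Article 4(5) GDPR, not anonymization, since the controller holding the key can still re-identify the device:

    import hashlib
    import hmac

    def pseudonymize_device_id(device_id: str, secret_key: bytes) -> str:
        """Replace a device identifier with a keyed-hash pseudonym (HMAC-SHA256).

        The key stays with the controller, so the cloud provider receives
        a stable pseudonym rather than the raw identifier.
        """
        return hmac.new(secret_key, device_id.encode("utf-8"), hashlib.sha256).hexdigest()

    # Hypothetical payload sent to the cloud for analytics
    payload = {
        "device": pseudonymize_device_id("sensor-0042", secret_key=b"controller-held-key"),
        "reading": 72,  # e.g. heart rate in bpm
    }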

Standards relevant to the IoT are also being developed by CEN and CENELEC; the current list is available at:

https://standards.cen.eu/dyn/www/f?p=204:32:0::::FSP_ORG_ID,FSP_LANG_ID:2307986,25&cs=1F4A71C19873519CC81C4B2C031CF3CF5

References

[1] See Articles 24, 25 and 32 of the GDPR, which require controllers to take into account the “risks of varying likelihood and severity for the rights and freedoms of natural persons” when adopting specific data protection measures.

[2] See, in particular, Article 35(3)(a), according to which data processing is considered as high risk in cases of, inter alia, “a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person”.

[3] Urquhart, L., et al., Demonstrably doing accountability in the Internet of Things, International Journal of Law and Information Technology, 2019, 27, 1–27.

[4] Ibid.
