Minimization principle

The minimization principle states that personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (see “Data minimization”, in the “Principles” section in Part II of these Guidelines). In simple terms, this means reducing as much as possible the amount, categories and granularity of the personal data that are processed. It will therefore not be possible to collect personal data that are not going to be processed simply so that the controller can hold them and use them in the future, whether for the declared purposes or for new ones. Unfortunately, this principle is sometimes in tension with the logic of IoT technology. Inferring data and profiling are sometimes necessary for the purposes of the system, but they multiply the amount of data involved in the processing. In addition, most IoT systems exchange many personal data between devices that are often under the control of different processors and/or involve third parties.

There are some ways in which such pervasive scenarios might be avoided. If the purpose of the processing can be achieved without identifiable information, data must be made anonymous as soon as possible. In principle, IoT systems should promote the use of anonymized data, especially if those data are shared with other devices. “Since the possibility to build extensive personal profiles can be hardly avoided, data anonymization is important in the context of data sharing.”[1] In principle, this seems feasible, but in practice it might be hard to achieve. As the AEPD stated, “Linking IoT devices to unique identifiers, linked to the close links between certain devices and its users, make it virtually impossible to use such data anonymously, and the risk of re-identification skyrockets. For example, many devices require user registration or include advertisement unique identifiers, such as smart televisions. Linking unique identifiers in mobile devices is a proven fact, and such devices are widely used to interact with and operate IoT devices.”[2] Thus, controllers should not presume that their anonymization processes will suffice to preserve data subjects’ privacy. Rather, they should perform DPIAs and risk assessments to substantiate such a belief (see accountability in this part of the Guidelines).
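As a minimal sketch of “anonymizing as soon as possible”, a device could strip direct identifiers from each reading at the point of collection, before the data leave the device. This is an illustration only: the field names below are hypothetical, and removing direct identifiers is not, by itself, sufficient anonymization, as the re-identification risks described above show.

```python
# Hypothetical field names; real anonymization must also address indirect
# identifiers and the re-identification risks discussed in the text.
DIRECT_IDENTIFIERS = {"user_id", "device_serial", "mac_address", "ad_id"}

def strip_direct_identifiers(reading: dict) -> dict:
    """Return a copy of a sensor reading without direct identifiers."""
    return {k: v for k, v in reading.items() if k not in DIRECT_IDENTIFIERS}

reading = {
    "device_serial": "SN-0042",
    "temperature_c": 21.5,
    "timestamp": "2024-01-01T10:00:00Z",
}
cleaned = strip_direct_identifiers(reading)
# cleaned no longer contains "device_serial"
```

Applying this at collection time limits what is stored, but the stored data may still be identifiable through the remaining fields, which is why a DPIA remains necessary.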

In this regard, controllers must be aware that capturing personal data that are then anonymized constitutes processing of personal data until the anonymization process is complete, no matter how short that time lapse is. Moreover, anonymization is itself processing, which means that it can only be lawful if a legal basis applies; legitimate interest and consent are the most promising candidates. Once the data have been made anonymous, the processing no longer needs to comply with the personal data protection requirements. It may still fall under e-Privacy rules, although in a more diluted way.

An alternative to anonymization as such is the use of aggregated data (sometimes aggregation is considered an anonymization tool). This should be achievable in most IoT systems, since most of the data values we deal with are a form of aggregation, even if this may not be evident because it is done “invisibly” by some sensor or data collection method. Aggregation is a way of substituting several data elements with a single one. Prime examples come from statistics and include the average, median, minimum, and maximum. In the context of data protection, two kinds of aggregation have to be distinguished (see “Data minimization” within “Principles”, Part II of these Guidelines):

  • Aggregation of data elements pertaining to a single person: Taking for example a person’s average monthly income over a year reduces the information content pertaining to that person.
  • Aggregation of data elements pertaining to a multitude of persons: Taking for example the average yearly income over a group of people also reduces the overall information content (data minimization). In addition, it weakens the degree of association between a data element and a given person. This kind of aggregation is therefore also pertinent to storage limitation.

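The two kinds of aggregation can be sketched as follows, using hypothetical income figures purely for illustration:

```python
from statistics import mean

# Kind 1: aggregating data elements of a single person
# (twelve monthly incomes reduced to one yearly average).
monthly_income_a = [2900, 3100] * 6   # hypothetical figures
avg_income_a = mean(monthly_income_a)

# Kind 2: aggregating across a multitude of persons.
# The group average reduces information content further and also
# weakens the link between the value and any individual.
yearly_averages = {"person_a": avg_income_a, "person_b": 4100.0, "person_c": 2750.0}
group_average = mean(yearly_averages.values())
```

In the first case twelve data elements about one person collapse into one; in the second, the resulting value no longer pertains to any single data subject, which is why this kind of aggregation is also relevant to storage limitation.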
When the purpose can be achieved using aggregated data, this should be implemented. Under such circumstances, no one but the data subject should access the raw data, unless a relevant reason applies. The transformation of raw data into aggregated data should be possible within the IoT tool itself, so that the raw data leaving the device remain the strict minimum needed. These aggregated data should be in a standardized format.[3] In any case, controllers must be aware of the fact that collecting data and deleting them after a short period of time, even milliseconds, still constitutes processing of personal data, and full compliance with data protection rules is required.

If the purpose of the processing can only be achieved by processing personal data, such data can still be pseudonymized. Pseudonymized data still fall under data protection and privacy rules, but pseudonymization counts as a good security measure and enhances accountability.
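One common pseudonymization technique, sketched here under assumed names, is to replace a direct identifier with a keyed hash. Because whoever holds the key can re-link the pseudonym to the identifier, the result remains personal data, exactly as the paragraph above states, but the dataset itself no longer exposes the identifier.

```python
import hashlib
import hmac

# Hypothetical key: in practice it must be stored separately from the data
# and protected, since it enables re-identification.
SECRET_KEY = b"example-key-kept-apart-from-the-data"

def pseudonymize(identifier: str) -> str:
    """Replace an identifier with a keyed hash (HMAC-SHA256 hex digest)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"user_id": "alice@example.com", "steps_today": 8412}
record["user_id"] = pseudonymize(record["user_id"])
```

The same identifier always maps to the same pseudonym, so records can still be linked for the declared purpose, while the raw identifier no longer travels with the data.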

An important point to highlight is the fact that a controller must not create, collect and store data “just in case”, that is, data stored in case a future idea comes along that requires such data for a different project. Storing data for longer than necessary and repurposing data in unlawful ways can trigger the highest fines.

Last but not least, IoT developers “should enable local controlling and processing entities (the so-called personal privacy proxies) allowing users to have a clear picture of data collected by their devices and facilitating local storage and processing without having to transmit the data to the device manufacturer.”[4]
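A local privacy proxy of the kind described in the quotation above might be sketched as follows. The class and method names are hypothetical; the point is the design: raw readings stay on the device where the user can inspect them, and only a minimized aggregate is ever released to the manufacturer.

```python
from statistics import mean

class LocalPrivacyProxy:
    """Sketch of a local controlling and processing entity: raw readings
    are stored and processed on the device; only an aggregate summary
    leaves it."""

    def __init__(self):
        self._raw = []  # kept locally, never transmitted

    def record(self, value: float) -> None:
        self._raw.append(value)

    def inspect_raw(self) -> list:
        # Gives the data subject a clear picture of the data collected.
        return list(self._raw)

    def export_summary(self) -> dict:
        # Only the minimized aggregate is released off-device.
        return {"count": len(self._raw), "mean": mean(self._raw)}

proxy = LocalPrivacyProxy()
for v in (21.0, 22.0, 23.0):
    proxy.record(v)
summary = proxy.export_summary()
```

This pattern combines the minimization and aggregation measures discussed earlier: the device transmits a two-field summary instead of the full series of readings.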



[1] Rolf H Weber, ‘Internet of Things: Privacy Issues Revisited’ (2015) 31 Computer Law & Security Review 618.

[2] AEPD, “IoT (I): What is IoT and which risks does it entail”, at:

[3] Article 29 Data Protection Working Party, Opinion 8/2014 on Recent Developments on the Internet of Things (16 September 2014).

[4] Article 29 Data Protection Working Party, Opinion 8/2014 on Recent Developments on the Internet of Things (16 September 2014).
