


The profiling of individuals is increasingly used, including in the law-enforcement sector. Given that profiling can affect large numbers of data subjects, most of whom will have done nothing unlawful, and because it can be very invasive, it should be tightly regulated.

(1) Our analysis:

Profiling is fraught with problems. The first is what statisticians call the “base rate fallacy”. It refers to the mathematically unavoidable fact that if you are looking for very rare instances in a very large data set, then no matter how well you design your algorithm, you will always end up with either excessive numbers of “false positives” (cases or individuals that are wrongly identified as belonging to the rare class), or “false negatives” (cases or individuals that do fall within the rare, looked-for category but are not identified as such), or both. This limitation is inherent in profiling and should be remembered every time it is used.
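The arithmetic behind the base rate fallacy can be sketched with hypothetical numbers (the figures below are illustrative, not taken from any real system): even a screening algorithm that is 99% accurate in both directions will flag overwhelmingly more innocent people than genuine targets when the looked-for class is very rare.

```python
# Illustrative base-rate-fallacy calculation with hypothetical numbers.
population = 1_000_000        # people screened
targets = 100                 # actual instances of the rare class (0.01%)
sensitivity = 0.99            # share of real targets correctly flagged
false_positive_rate = 0.01    # share of innocents wrongly flagged

true_positives = round(targets * sensitivity)                           # 99
false_positives = round((population - targets) * false_positive_rate)   # 9999

# Of everyone flagged, what fraction is actually a real target?
precision = true_positives / (true_positives + false_positives)
print(f"{true_positives + false_positives} people flagged, "
      f"of whom only {precision:.1%} are real targets")
# → 10098 people flagged, of whom only 1.0% are real targets
```

In other words, roughly 99 out of every 100 people flagged by this hypothetical system would be false positives; tightening the algorithm to reduce false positives would instead drive up the number of real targets it misses.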

The second problem is that profiling tends to reproduce societal discrimination against “out-groups”, even if characteristics such as ethnic or racial origin are not overtly part of the profiling algorithm. Discrimination can still creep in and perpetuate existing discrimination in the guise of scientifically defensible profiles. In this regard, it should be noted that under international human rights law, “unintentional” discrimination is outlawed just as much as intentional discrimination.

The third problem is related to technological advances: as data mining and analysis techniques become ever more sophisticated, it can become difficult to understand the logic behind the profiling – even for the authorities using it. This reinforces the first two problems mentioned above and also makes decisions based on profiles harder to challenge, since even the authorities making the decision might often not be able to supply a better explanation than “because the computer said so”.

These problems make it abundantly clear that the use of profiling should be tightly regulated. Article 9 of the proposed Directive sets out rules on profiling and contains some protection against measures based on profiling. However, unlike under the proposed Regulation, individuals are given no right not to be subject to such measures. Moreover, a clear definition of “profiling” itself is missing: the term is defined neither in Article 9 nor in Article 4.

It is not clear from the Commission proposal how the main criterion for recognising an activity as profiling, namely that it “produce[s] an adverse legal effect for the data subject or significantly affect[s] them”, should be interpreted. In our opinion, this formulation is too narrow and could lead to a situation where the prohibition of measures based on profiling and automated processing, adopted as a general principle, is unreasonably diluted. This is another reason for having a clear definition.

While the starting point of prohibiting profiling as a general rule and then allowing certain exceptions is the right one, we are also concerned about the scope of the exceptions. Under the Commission proposal, profiling would be allowed if it is “authorised by a law which also lays down measures to safeguard the data subject’s legitimate interests”. This formulation is too general and does not guarantee an adequate level of protection of the rights of the data subject. In every case, profiling should be accompanied by specific safeguards. This applies especially to the use of sensitive data (such as data on race or ethnic origin, political opinions and religious beliefs). The proposed Directive simply says that profiling shall not be based “solely” on such characteristics. This is too weak: it does not offer adequate protection against discrimination via profiling, nor does it guarantee the data subject’s right to information on the logic behind the profile.

(2) Our recommendations:

  • The definition of profiling should be brought in line with the Council of Europe Recommendation CM/Rec(2010)13. Such an explicit definition should be included in Article 4.
  • The proposed Directive should require that laws allowing profiling shall be subject to a strict test of necessity and proportionality, which would have to demonstrate that profiling is necessary in a given situation and does not affect the vital interests of a person concerned.
  • It should be clarified that profiling shall not be based on or generate any of the special categories of personal data specified in Article 8 of the proposed Directive.
  • Profiling that leads – intentionally or unintentionally – to discrimination based on such special categories should be completely prohibited.
  • When informing data subjects about the profiling measure, they should also be given information on the logic behind the profiling. This mirrors recommendations EDRi made to the provisions on profiling in the proposed General Data Protection Regulation.
  • To strengthen the ability of individuals to enforce the prohibition of profiling, Article 9 should be revised to give individuals a right not to be subject to such measures.
The launch and upkeep (until December 31, 2013) of this website received financial support from the EU's Fundamental Rights and Citizenship Programme.