
Strengths and Weaknesses of the Proportionality Principle for Biometric Applications

Chapter in: Privacy and Data Protection Issues of Biometric Applications

Part of the book series: Law, Governance and Technology Series (LGTS, volume 12)

Abstract

The analysis of practice in the previous Chapter allows the particular criteria used by the DPAs for biometric applications to be distilled. This Chapter describes these criteria, as well as how the DPAs cope with the interference of such applications with fundamental rights. The author also defends that biometric data processing shall be made subject to a systematic double review of proportionality, both under data protection regulation and under fundamental rights. This double check is often not made, and the Reform proposals seem to further adhere to a narrow view. The Chapter then lists the strengths and weaknesses of the proportionality principle, which is applied with difficulty, for biometric data processing altogether. The author argues that while the principle allows the flexibility to cope with new technology and varying situations, it proves to result in unpredictable outcomes: the authorizations and decisions of the DPAs provide insufficient legal certainty. Legislation therefore needs to set forth the appropriate safeguards for the processing of biometric data.


Notes

  1.

    See also H. Winter and A. Sibma, Sanctionering van privacyschendingen. Een vergelijkend onderzoek in België, Duitsland en Oostenrijk, Wetenschappelijk Onderzoek- en Documentatiecentrum (WODC) (ed.), Den Haag, 2009, 64 p. (‘Winter and Sibma, Sanctionering van privacyschendingen, 2009’.)

  2.

    See ECJ, Rechnungshof v. Österreichischer Rundfunk 2003, § 39. For completeness, we need to mention, however, that a debate is still going on about the required harmonization imposed by EU Directives in general.

  3.

    M. Kumm, ‘Internationale Handelsgesellschaft, Nold and the New Human Rights Paradigm’, in M. Maduro and L. Azoulai (eds.), The Past and Future of EU Law, Oxford and Portland, Hart, 2010, (106), p. 7 and p. 110. (‘Kumm, New Human Rights Paradigm, 2010’). Kumm further states: ‘If, all things considered, there are good reasons that support a regulatory measure, it will be proportional’.

  4.

    WP 29 Opinion 2/2005 on VIS and exchange of data (WP110), pp. 7–14. Article 8 §2 ECHR and the proportionality principle are also applicable in relations amongst private parties, and the test will therefore also have to be made even if the actors involved are private entities. About the need for such review for data protection in general, see also the Motion Franken in the Netherlands, as mentioned in Chap. 5, § 268. For a practical case, see Hoge Raad, 9.09.2011, as discussed and referenced above.

  5.

    Advocate General’s Opinion, Scarlet v. Sabam, 2011, § 109. The European Parliament stated it in its report on the implementation of the Directive 95/46/EC and in the section relating to data retention as follows: ‘Believes that Member States’ laws providing for the wide-scale retention of data related to citizens’ communications for law-enforcement purposes are not in full conformity with the European Convention on Human Rights and the related case law, since they constitute an interference in the right to privacy, falling short of the requirements of: being authorized by the judiciary on a case-by-case basis and for a limited duration, distinguishing between categories of people that could be subject to surveillance, respecting confidentiality of protected communications (such as lawyer-client communications), and specifying the nature of the crimes or the circumstances that authorize such an interference; believes, furthermore, that serious doubts arise as to their necessity within a democratic society and – as specified by Article 15 of Directive 2002/58/CE – as to their appropriateness and proportionality’ (European Parliament, Report on the First Report on the implementation of the Data Protection Directive (95/46/EC) COM(2003)265, A5-0104/2004 final, 24.02.2004, pp. 9–10). See also the national court decisions on data retention legislation adopted by Member States, as explained in Chap. 5, § 329.

  6.

    Advocate General’s Opinion, Scarlet v. Sabam, 2011, § 113.

  7.

    The question could therefore be raised whether this means that even a triple review of proportionality is imposed.

  8.

    See also JRC, Large-scale Biometrics Deployment, 2008, p. 82.

  9.

    Such a safeguard could be, as we argue in Part III, for example, the use of protected templates.
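    For illustration only, a minimal sketch in Python of the idea behind such protected templates: the controller stores a salted one-way transformation of coarsely quantized features rather than the biometric sample itself. All names and values here are hypothetical, and real template protection schemes (discussed in Part III) use error-correcting codes rather than simple quantization to absorb measurement noise.

```python
import hashlib
import secrets

def quantize(features, step=8):
    """Coarsely quantize features so small measurement noise maps to the same codeword."""
    return bytes(int(f) // step for f in features)

def enroll(features):
    """Store only a salted one-way hash of the quantized features, never the raw template."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + quantize(features)).hexdigest()
    return {"salt": salt, "digest": digest}  # reference record kept by the controller

def verify(record, features):
    """Re-derive the protected value from a fresh sample and compare."""
    candidate = hashlib.sha256(record["salt"] + quantize(features)).hexdigest()
    return candidate == record["digest"]

record = enroll([103, 55, 210, 9])
print(verify(record, [101, 54, 212, 8]))   # True: small noise absorbed by quantization
print(verify(record, [40, 180, 66, 250]))  # False: different person; raw data not recoverable from the record
```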

  10.

    WP 29 Opinion 2/2005 on VIS and exchange of data (WP110), p. 9.

  11.

    Art. 3.2 §1 Directive 95/46/EC. The alphanumeric divisions are added by us. See also Part I, Chap. 3, §§ 395–396. This remains relevant for the processing of biometric data which falls within the above-mentioned activities: the processing of biometric data for such data processing operations, such as the fingerprint databases held by the national police, will not be subject to the principles of the Directive 95/46/EC, unless national (data protection) law determines otherwise and imposes similar principles. Note also that some of these activities to which the Directive 95/46/EC does not apply resemble the conditions under which interference is allowed under Article 8 §2 ECHR.

  12.

    In other cases, the processing may also fall outside the territorial scope of the Directive 95/46/EC (see, e.g., the issues relating to the Swift case and currently the PNR exchange with the United States). We do not, however, go deeper into this hypothesis.

  13.

    See, e.g., some recent studies, inter alia, by Korff: D. Korff, Comparative Study on Different Approaches to new privacy challenges, in particular in the light of technological developments, Comparative Chart: Divergencies between data protection laws in the EU, 20 January 2010, Brussels, European Commission, p. 4 (‘Korff, Comparative Chart, 2010’), available at http://ec.europa.eu/justice/policies/privacy/docs/studies/new_privacy_challenges/final_report_comparative_chart_en.pdf In this chart, the importance of the extension of national data protection laws to (former) Third Pillar matters is rated as having a very serious impact in terms of data protection, with considerable divergences. See on this issue also Hijmans and Scirocco, Shortcomings in the EU Data Protection, 2009, pp. 1485–1525.

  14.

    Article 13 (1) Directive 95/46/EC.

  15.

    About other fundamental problems in our society mentioned by the CNIL, see also CNIL, the Panthéon-Assas-Paris II University and the French Senate, Informatique : servitude ou libertés?, Report, Colloquium, 7–8 November 2005, p. 4 (‘CNIL Panthéon-Assas-Paris II University and the French Senate, Servitude ou libertés, 2005’), available at http://www.senat.fr/colloques/colloque_cnil_senat/colloque_cnil_senat.html The CNIL mentions ‘biometrics, geolocalisation, videosurveillance, ethical issues, databases with debt information of private parties, criminal records and also the transatlantic relation (United States/Europe) regarding the protection of data’ (free translation).

  16.

    Article 6.1(b) Directive 95/46/EC.

  17.

    See also the notification obligation in many national data protection legislations, which has to be complied with before the start of the processing.

  18.

    We discussed these legitimate grounds above, Chap. 5, § 279 et seq.

  19.

    CBPL, Opinion N°17/2008 biometric data, §§ 42–43; see also above.

  20.

    About the difficulty of defining the interests of companies and employers, colleagues and third persons, see, e.g., Hendrickx, Privacy en Arbeidsrecht, 1999, pp. 48–49; R. Blanpain, Labour Law in Belgium, Alphen aan den Rijn, WoltersKluwer, 2010 (‘Blanpain, Labour Law in Belgium, 2010’).

  21.

    See and compare with the Ship and Port Facility Security (ISPS) code developed for the United Nations as an amendment to the SOLAS Convention, imposing the obligation to ‘detect security threats and take preventive measures against security incidents affecting ships or port facilities used in international trade’, in force since 2004; the European Regulation (EC) No 725/2004 of the European Parliament and of the Council of 31 March 2004 on enhancing ship and port facility security (requiring the security authorities of ports to take the necessary port security measures); and European Directive 2005/65/EC of the European Parliament and of the Council on enhancing port security, to be implemented in the Member States by 15 June 2007 and requiring ports to develop security assessments and security plans (such as the conditions of access to ports), as implemented in national legislation.

  22.

    For example, health related information. See also above.

  23.

    For example, legislation to combat money laundering. See also Chap. 4, § 26.

  24.

    See also the Belgian DPA, which explicitly refers in this context to the interest to combat fraud: CBPL, Opinion N°17/2008 biometric data, §44. The DPA refers to the risk that passwords or badges are intentionally transmitted to third parties or are abused, which, according to the DPA, is no longer possible where biometric authentication mechanisms are used.

  25.

    It is rather exceptional that the use of biometric systems is explicitly mentioned in legislation; an explicit legal basis, as required for interference with fundamental rights, therefore remains problematic for biometric applications.

  26.

    See, e.g., for Belgium, conclusions after the hearing of the representative of the federal police, Verslag Noreilde, p. 51.

  27.

    See e.g., in Belgium, the Act of 21 March 2007 (as modified) and the collective labor agreement No 68.

  28.

    CBPL, Opinion N°17/2008 biometric data, §57. The use of ‘and’ instead of ‘or’ would have been expected. Compare also with Art. 5 (f) Law of 1992 as modified. For this reason, this requirement could at first sight also be called ‘circular reasoning’. Later on, however, the proportionality in terms of necessity is further explained (see §§ 67–77). See also §§ 54–57, where both necessity and proportionality are mentioned as requirements.

  29.

    The Belgian DPA refers not only to the general obligation to limit the processing of data to these situations where the processing of personal data is required, but also further explains this. See CBPL, Opinion N°17/2008 biometric data, §§ 42–43.

  30.

    However, at the same time, a variety of criteria were used, such as for border control.

  31.

    See Chap. 4, § 19. It may well be possible that the CNIL has observed these provisions in determining its position regarding biometric applications, although this remains uncertain.

  32.

    See Chap. 5, §§ 508–509.

  33.

    Such a limited database, containing only the biometric data of persons who have disturbed order, where this has been decided and the persons concerned have been informed, including about the storage in a database, avoids all persons being suspect from the start. See also Part III.

  34.

    See also Chap. 5, §§ 351–353.

  35.

    See, e.g., the Dutch DPA in its opinion on VIS 2000. See Chap. 5, §§ 508–509; this will also be further discussed in Sect. 6.2.4 below.

  36.

    See, e.g., Unique Authorization N° AU 009 for hand geometry for access to the school restaurant.

  37.

    See, e.g., the Dutch DPA in the Discopas opinion. See Chap. 5, §§ 508–509.

  38.

    Another exception to this prohibition, besides explicit consent, is the necessity of the processing for a right or obligation under employment law, which allows the processing.

  39.

    See and compare with WP 29 Opinion Consent 2011 (WP187), p. 5, which paid only limited attention to consent in Art. 8 EU Charter.

  40.

    See also Chap. 4, § 283.

  41.

    E. Kosta, Unravelling consent in European data protection legislation. A prospective study on consent in electronic communications, Leuven, Law faculty, 2011, 364 p. (‘Kosta, Unravelling consent, 2011’).

  42.

    See, e.g., CBPL, Opinion N°17/2008 biometric data, §38. See also the CBP in, for example, the Discopas opinion (see Chap. 5, §§ 508–509). The importance of reviewing consent with care was already argued and discussed in Westin, Privacy and Freedom, 1970, pp. 373–377.

  43.

    CBPL, Opinion N°17/2008 biometric data, §§ 38–39. We would assume that this is valid for biometric systems which are not deemed by the CBPL to be proportionate per se, but this is uncertain as the ‘absolute necessity’ is hardly mentioned further in the Advice.

  44.

    See the At Face Value report.

  45.

    CNIL, 26 ième rapport d’activité 2005, p. 50, available at http://lesrapports.ladocumentationfrancaise.fr/BRP/064000317/0000.pdf

  46.

    See, e.g., in Belgium, the Royal Decree of 13 February 2001 for the execution of the Act of 8 December 1992 states that the written consent of an employee may not be used to allow the processing of sensitive data, including medical data, by a present or future employer. See also Part I.

  47.

    For reasons, see also Chap. 5, § 283. See also art. 7 of the Reform Proposal on data protection.

  48.

    See CBPL, Opinion N°17/2008 biometric data, §38. About the CNIL, see below § 597.

  49.

    In the UAs of the French DPA, the requirements relating to technical characteristics are very limited and only pertain to rather well-established criteria, in particular the requirement to use a template instead of the image, which was already recommended and addressed in the Article 29 Data Protection Working Party Opinion on biometrics.
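    A toy illustration of this template-instead-of-image requirement: only a small feature vector derived from the image is retained, and the image cannot be reconstructed from it. The block-averaging ‘extractor’ below is a deliberately naive stand-in for a real biometric algorithm; all names and thresholds are invented for the example.

```python
import numpy as np

def extract_template(image, grid=4):
    """Reduce an image to a small feature vector (here: mean intensity per block).
    The original image cannot be reconstructed from this template."""
    h, w = image.shape
    bh, bw = h // grid, w // grid
    blocks = image[:bh * grid, :bw * grid].reshape(grid, bh, grid, bw)
    return blocks.mean(axis=(1, 3)).ravel()

def match(t1, t2, threshold=10.0):
    """Compare two templates with a simple distance measure."""
    return float(np.linalg.norm(t1 - t2)) < threshold

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(128, 128)).astype(float)
template = extract_template(image)
print(template.size, "values are stored instead of", image.size, "pixels")
# a fresh, slightly noisy capture of the same 'characteristic' still matches
print(match(template, extract_template(image + rng.normal(0, 2, image.shape))))
```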

  50.

    See above, Chap. 5.

  51.

    About the position of the Belgian and Dutch DPAs on this issue, see also Part I.

  52.

    See the At Face Value report which it published on its site. The reasoning in the report is, however, difficult to follow because no defined vocabulary is used. The report was also one of the forerunners compared with other reports and opinions of DPAs, and a biometric vocabulary had not yet been developed (see on the importance of a biometric vocabulary also above).

  53.

    See the report At Face Value of 1999. The discussion by the DPAs of the use of ‘protected templates’, as recently developed in the research community, remains generally very limited. This is in contrast with the DPA of Ontario, Canada, as we will further describe in Part III.

  54.

    See Article 6, 1(c) Directive 95/46/EC. In addition, these limited conditions specified by the CNIL could also have as an effect that the risks for the data subjects may be considerably reduced. Because the risks are reduced, the use of the biometric data for the increased security finality of the processing could be defended as proportional, i.e. relevant and sufficient and in proportion with the risks for the data subjects.

  55.

    E.g., hand geometry, which may limit identification and re-use.

  56.

    Registratiekamer, Discopas opinion 2001, pp. 9–10; see also CBP, Biometrische gegevens vreemdelingen, 2009, in which the CBP pleaded for guarantees for due removal of biometric data of individuals who are no longer aliens, improved specification of the finalities of access for the investigation of crimes and guarantees in case of improper storage and use for law enforcement purposes.

  57.

    Whether this type of use is required under the Directive 95/46/EC or under Article 8 ECHR is not mentioned explicitly. About anonymous verification, see also Part III.

  58.

    This was, e.g., applied in a more express way by the CNIL in its Communication of 2007 on central storage of fingerprint.

  59.

    See CBP Gezichtsherkenning 2004, p. 4: ‘Het CBP is in het algemeen geen tegenstander van het gebruik van biometrie bij toegangscontrole, onder meer omdat daarmee de onnodige verwerking van persoonsgegevens voorkomen kan worden’ (emphasis added). But: CBP Gezichtsherkenning 2004, p. 4, where the subsidiarity of the biometric processing is retained and reasoned in detail.

  60.

    See CNIL, 28 ième Rapport d’Activité 2007, Paris, 2008, pp. 97–120 (‘CNIL Rapport d’Activité 2007’).

  61.

    CNIL Rapport d’Activité 2007, p. 102. See also, for results that are in our view conflicting, e.g., the decisions of the CNIL in 2000, mentioned in Chap. 5, §§ 427–428. For the positive opinion allowing central storage of fingerprint, the CNIL additionally took the split of the database and the encryption of the data into account, while for the same type of biometric system, equally involving fingerprint and central storage, it rendered a negative opinion. See also the decisions of the CNIL of 2004, described in Chap. 5, §§ 431–432 above, which in some sense could be considered conflicting. See and compare also the Discopas opinion of the CBP and the refusal of the CBPL for a seemingly similar system, as mentioned in the CBPL’s annual report and above.
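    A minimal sketch of the two measures the CNIL took into account in the positive opinion, i.e. splitting the database and encrypting the biometric data. It assumes the third-party Python package `cryptography`; all store and field names are hypothetical. The point is that neither store alone links a template to an identity.

```python
import secrets
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice managed in secure hardware, not in the application
cipher = Fernet(key)

identity_store = {}   # pseudonym -> administrative identity (first store)
template_store = {}   # pseudonym -> encrypted biometric template (second, separate store)

def enroll(name, template_bytes):
    pseudonym = secrets.token_hex(8)                  # random link between the two stores
    identity_store[pseudonym] = {"name": name}
    template_store[pseudonym] = cipher.encrypt(template_bytes)
    return pseudonym

def get_template(pseudonym):
    # requires access to the second store, the link, and the decryption key
    return cipher.decrypt(template_store[pseudonym])

p = enroll("J. Doe", b"\x0c\x06\x1a\x01")
print(get_template(p))  # only recoverable by combining store, pseudonym and key
```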

  62.

    See above.

  63.

    For characteristics which do not leave traces, the CBPL refers to characteristics such as iris or hand geometry. See, for Belgium, on this issue, also Pres. Hasselt, 19 February 2009 and Pres. Hasselt, 16 April 2009, as discussed in Chap. 5, § 380.

  64.

    See also CNIL, Guide pour les employeurs et les salaries, Paris, CNIL, 2008, p. 34.

  65.

    The CNIL opts in particular for hand geometry and finger vein pattern analysis for its UAs, as these characteristics are presumed not to leave traces according to the CNIL, based on the present state of the art. See also above.

  66.

    Part III, Chap. 7, § 49 et seq.

  67.

    There seems to be no objective reason why the CNIL would not have taken the state of the art of the hand geometry into account for the N°AU-009 for access control to the school restaurant. The CNIL however omits explicit reference thereto.

  68.

    It is relevant in this context to know that the French DPA also has several specialist engineers among its staff.

  69.

    The CNIL did, however, commission three research programs in the domain of biometric data processing in 2007.

  70.

    See also above.

  71.

    See also Chap. 4, § 40.

  72.

    DNA, however, as explained above, is for the purposes of this research not further taken into account as a biometric characteristic.

  73.

    See and compare with Part I, Chap. 2, § 164 and § 167.

  74.

    See International Biometric Group, Biometrics Market and Industry Report 2009–2014, 2008, 229 p. One of the key aspects in the report is that fingerprint is expected to gain 45.9 % of the ‘non-AFIS biometrics market’ (sic), followed by face recognition at 18.5 % (and iris recognition at 8.3 %), as mentioned and summarized in X., International Biometric Group (IBG) Announces November 13 Webcast and Release of the Biometric Market and Industry Report 2009–2014, 11.11.2008, available at http://www.findbiometrics.com/articles/i/6060/; for other market studies, see, e.g., Frost & Sullivan, Biometrics in Europe – Future Technologies and Applications, 2009, available (for purchase) at http://www.frost.com/sublib/display-report.do?id=9834-00-04-00-00&bdata=bnVsbEB%2BQEJhY2tAfkAxMzc4OTA4NTY0NjM3; see further W. Lusoli, R. Compañó, and I. Maghiros, Young People and Emerging Digital Services. An Exploratory Survey on Motivations, Perceptions and Acceptance of Risks, Sevilla, European Commission, JRC, 2009, p. 34 and table 14 in particular, with forecast of future uses of eID technologies, mentioning fingerprint and eye recognition, also available at http://ftp.jrc.es/EURdoc/JRC50089.pdf

  75.

    Advances in biometric and smart card technology, however, may diminish this advantage of fingerprint, as the size of the templates of other characteristics, allowing comparison with acceptable results, will also permit local storage of these other characteristics.
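    To illustrate why such local storage is considered privacy-friendlier: a sketch of match-on-card style verification in which the reference template never leaves the token held by the data subject, and only the yes/no result crosses the card boundary. The class, method and threshold below are invented for the example.

```python
class SmartCard:
    """Simulates a token that stores the reference template internally.
    Only the comparison result leaves the card; the template is never exported."""

    def __init__(self, reference_template):
        self._reference = reference_template  # written once at enrolment, kept on the card

    def match(self, probe_template, max_distance=6):
        distance = sum(abs(a - b) for a, b in zip(self._reference, probe_template))
        return distance <= max_distance       # the card answers only yes or no

card = SmartCard([12, 6, 26, 1])     # enrolment writes the template onto the card
print(card.match([11, 6, 27, 1]))    # True: fresh, slightly noisy sample from the holder
print(card.match([30, 2, 14, 9]))    # False: someone else's sample
```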

  76.

    See CNIL, Biométrie: des dispositifs sensibles soumis à autorisation de la CNIL, 7.04.2011, available at http://www.cnil.fr/en-savoir-plus/fiches-pratiques/fiche/article/biometrie-des-dispositifs-sensibles-soumis-a-autorisation-de-la-cnil/?tx_ttnews%5BbackPid%5D=91&cHash=33c56bf40f

  77.

    See about this risk and the example of VIS, Chap. 4, § 185.

  78.

    See also Part I.

  79.

    See Chap. 5, § 397.

  80.

    A fair collection and use of biometric data implies that the data subject is always informed of the elements imposed by existing data protection legislation (in particular information about the identity of the controller and the purposes).

  81.

    Whether this information is a particular condition for the fair processing of biometric data, is not made explicit.

  82.

    See and compare also with the United States Federal Trade Commission (FTC) complaint over Facebook’s use of facial recognition technology, where one of the aspects is that individuals are not adequately informed of biometric information being collected about them (pp. 9–10).

  83.

    An example of such decree is the decree of 30 April 2008 whereby the new French biometric passport was introduced.

  84.

    See also above.

  85.

    See also the Report Cabal 2003.

  86.

    See Part I.

  87.

    See CNIL, Délibération n° 2007-368 du 11 décembre 2007 portant avis sur un projet de décret en Conseil d’Etat modifiant le décret n° 2005-1726 du 30 décembre 2005 relatif aux passeports électroniques, p. 3 (‘CNIL, Délibération n° 2007-368 passports électroniques’); see also CNIL Rapport d’Activité 2007, p. 19 : ‘L’ampleur de la réforme et l’importance des questions justifieraient sans doute le dépôt d’un project de loi, lequel permettrait l’engagement d’un vaste débat public’; for Belgium, a regulatory act of the government would in principle not be sufficient in case of interference with fundamental rights. See AH (Belgium), N° 131/2005, 19.07.2005, B.5.1, (B.S. 08.08.2005, p. 34462 (‘Het wettelijkheidsbeginsel vloeit voort uit de WVP en het artikel 22 van de Grondwet. Volgens het reeds aangehaalde arrest van het Grondwettelijk Hof houdt artikel 22 van de Grondwet in dat « dat elke overheidsinmenging in het recht op eerbiediging van het privé-leven en het gezinsleven wordt voorgeschreven in een voldoende precieze wettelijke bepaling, beantwoordt aan een dwingende maatschappelijke behoefte en evenredig is met de nagestreefde wettige doelstelling’) and reference thereto in the Opinion of the Belgian DPA in: CBPL, Advies nr. 23/2008 betreffende voorontwerp van wet houdende oprichting van de authentieke bron voertuiggegevens (A/2008/011), 11.06.2008, p. 18 (‘CBPL, Advice No 23/2008 Authenticated Source vehicle data’). It is in this advice stated by the Belgian DPA as follows: ‘In essentie is het dus enkel de bevoegdheid van de wetgever om een algemeen systeem op te stellen die op grote schaal persoonsgegevens beoogt’).

  88.

    These questions, however, are not relevant where the processing, under the ‘safeguards’ set out in the UAs, is not to be considered a risk. In these UAs, conditions are specified relating to finality, technical characteristics, the kind of data processed, the recipients of the data and the term of storage, as well as some conditions relating to security measures and the rights of the data subjects, but there is no reference to the pressing need.

  89.

    The same can be said for CNIL, Délibération n° 2007-368 passports électroniques, where the CNIL considers the central storage and retention of biometric data as posing ‘risks of serious interference with privacy and individual freedom’ and that such processing ‘would seriously interfere with the individual liberty’. (p. 2).

  90.

    See, e.g., CBP Gezichtsherkenning 2004, p. 4.

  91.

    See Corr. Brussel, 14.08.1991, Rev. dr. Pén 1992, p. 133; see also Verslag Noreilde, p. 38.

  92.

    See Verslag Noreilde.

  93.

    See also Part I, Chap. 3, § 289 and footnote 269. See also Verslag Noreilde, p. 21.

  94.

    See Article 7 Act of 21 March 2007 on camera surveillance, as modified.

  95.

    More particularly, it is specified that mobile cameras can be used ‘for major crowd assemblies’ as defined in the Act of 5 August 1992 (Article 5).

  96.

    Article 7/2 states that mobile cameras shall only be used for non-permanent tasks which are limited in time.

  97.

    Depending on the place where mobile cameras will be used and any urgency, the decision is taken either by an officer of the administrative police responsible for the operational matters as determined by the Act of 5 August 1992, or by the mayor. The Belgian DPA should also be informed.

  98.

    See, e.g., the projects of particular police zones, including the use of ‘intelligent’ surveillance cameras, including for face recognition, as described during the parliamentary discussion. Verslag Noreilde, pp. 29–33. These projects are being further implemented. Verslag Noreilde, p. 31.

  99.

    During the parliamentary discussions for the adoption of the initial Act, reference was made to a regulation (of which it is not clear whether it is internal or external) adopted by the police; see and compare also with EDPS, The EDPS Video-surveillance Guidelines, Brussels, March 2010, p. 30 (‘EDPS, Video-surveillance Guidelines, 2010’). The EDPS therein stated that ‘high-tech video-surveillance tools’ or ‘intelligent video-surveillance systems’ are permissible ‘only subject to an impact assessment’ and that they are subject to prior checking with the EDPS, who will ‘assess, case by case, the permissibility of the technique used and may impose, as necessary, specific data protection safeguards’. See also CBP, Recommendation 4/2012 (CO-AR-2011-011), 29.2.2012, 24 p., about the various uses of surveillance cameras.

  100.

    Verslag Noreilde, p. 28.

  101.

    The initial proposal for the modification of the Act of 2007 contained an explicit reference to the use of ‘smart cameras’ (see Parl. St. Senaat, 2008–09, no 4-1149/1, Art. 10), but it was amended and smart cameras were no longer mentioned (see Parl. St. Senaat, 2008–09, no 4-1149/2). The legislator stated that this was not required since smart cameras would fall under the definition of surveillance camera. We do not agree with this point of view, since no explicit reference is made in this definition to the use of biometric techniques. The legislator also stated that ‘if it is about systems which measure other parameters, they are no longer surveillance cameras’ [free translation], but what is meant is not clear. See on the use of smart surveillance cameras in general also F. Coudert, ‘When video cameras watch and screen: Privacy implications of pattern recognition technologies’, Computer Law & Security Review 2010, pp. 377–384 (‘Coudert, Pattern recognition technologies, 2010’); F. Coudert and J. Dumortier, ‘Intelligent video surveillance networks: data protection challenges’, in Proceedings of the third international conference on availability, reliability and security, IEEE Computer society, 2008, pp. 975–981.

  102.

    See also Chap. 5, §§ 319–339.

  103.

    Registratiekamer, Discopas opinion 2001, pp. 9–10: ‘[i]n het algemeen kan het als een gerechtvaardigd belang van de exploitant worden beschouwd om gegevens over bezoekers te verzamelen met het oog op de handhaving van de orde en veiligheid in de horeca – of sportgelegenheid’.

  104.

    It should further be reviewed whether it would fit the need to ‘genuinely meet objectives of general interest’ in the Union, as required by Article 52 EU Charter.

  105.

    E.g., there is no requirement of evidence of previous incidents, nor a limitation of the central storage of biometric data to troublemakers. At the same time, an obligation not to process excessive data is linked with the need for security measures (p. 12).

  106.

    See also Chap. 5, §§ 319–324.

  107.

    For the goods and installations, e.g., ‘serious and irreversible damages’ are to be taken into account, insofar as they exceed the strict interests of the controller, taking into account the need to render services to the public with such goods and installations. An example of sensitive places for firemen is given (see CNIL, Communication central storage fingerprint 2007, p. 8).

  108.

    CNIL, Communication central storage fingerprint 2007, p. 3, p. 5 and pp. 7–8.

  109.

    See about this requirement of the need for a decision or legislation for the processing of sensitive data for a ‘substantial public interest’, below § 651.

  110.

    CNIL, Communication central storage fingerprint 2007, p. 7.

  111.

    See also above.

  112.

    See Chap. 5, §§ 508–509.

  113.

    See, e.g., in Belgium, Wet 21 December 1998 betreffende de veiligheid bij voetbalwedstrijden (B.S., 3.2.1999); see also Verslag Noreilde, stating that the legality and legitimacy (‘wettigheid’ in the Flemish text, translated (erroneously) to or from ‘légitimité’ in the French text) shall be determined based on the circumstances in which the images are recorded, such as for the prevention or recording of interferences with the safety of persons or goods in places which are particular risk areas, mentioning inter alia the example of soccer stadia (p. 51).

  114.

    In particular, starting the section on ‘legitimacy and proportionality’ (‘Rechtmatigheid en proportionaliteit’/‘La légitimité et la proportionnalité’), it analyses the risks for the data subjects on pp. 12–13, makes a (short) reference to the essential requirement that the proportionality and justification of biometric data systems shall be reviewed in a strict manner on p. 14, and gives examples and introduces additional criteria for this proportionality review on pp. 17–19 (see also above). Unfortunately, the use of wrong terms (or translations), for example in the Flemish text (see, e.g., § 36 ‘Om gerechtvaardigd te zijn’, which should be ‘Om rechtmatig te zijn’, and ‘Om legitiem te zijn’, which should be ‘gerechtvaardigd’ and which refers in our opinion to the proportionality principle sensu stricto (see pp. 17–19 of the Opinion)), adds to the confusion.

  115.

    See CBPL, Opinion N°17/2008 biometric data, § 39.

  116.

    See also CBPL, Opinion N°17/2008 biometric data, § 67, where the CBPL requires that the biometric system is compared with other similar systems on the market.

  117.

    For a similar conflict of interests involving Article 6 ECHR (right to a fair trial), where the Court opted not to ‘give up’ the rights of the defendant for reasons of opportunism, especially not in a society governed by the rule of law, and the case there mentioned, see P. Lemmens, ‘Article 6 E.V.R.M., in het licht van de algemene rechtsbeginselen die het huldigt en van de doelstellingen die eraan ten grondslag liggen’, in W. Van Gerven, P. Van Orshoven and R. De Corte, et al., De norm achter de regel: wet, doel en algemene rechtsbeginselen, Gent, Story-Scientia, 1995, (160), p. 179.

  118.

    CBP Gezichtsherkenning 2004, p. 4; but: see its advice on the biometric passport in 2001 (see Chap. 5, § 510 et seq.).

  119.

    As explained above, Article 52 also requires that any limitations may only be made if they are necessary and genuinely meet objectives of general interest of the Union.

  120.

    See and compare also with the additional measures for video surveillance specified by the EDPS in EDPS, Video-surveillance Guidelines, 2010, 63 p.

  121.

    See Chap. 5, §§ 473–478.

  122.

    The CNIL will only request documentation in which, inter alia, the error rates are mentioned, without however specifying any acceptable rates (see p. 10). See also about the ‘weak link’ doctrine, Part III, Chap. 7, § 73.
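    For readers unfamiliar with such error rates, a short sketch computing a false acceptance rate (FAR) and false rejection rate (FRR) at a given decision threshold, using made-up comparison scores; it also shows why a single threshold trades the one rate off against the other.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FAR: fraction of impostor attempts wrongly accepted (score >= threshold).
    FRR: fraction of genuine attempts wrongly rejected (score < threshold)."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.85, 0.78, 0.96, 0.66]   # comparison scores for true matches
impostor = [0.12, 0.45, 0.72, 0.30, 0.05]  # comparison scores for non-matches

for t in (0.5, 0.7, 0.9):
    far, frr = far_frr(genuine, impostor, t)
    print(f"threshold {t}: FAR={far:.0%} FRR={frr:.0%}")  # raising t lowers FAR but raises FRR
```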

  123.

    See also Pres. Hasselt, 19 February 2009 and Pres. Hasselt, 16 April 2009, as discussed in Chap. 5, § 380. We will further explain that local storage on an object held by the data subject also permits the data subject to keep control over his or her biometric data.

  124.

    CBPL, Opinion N°17/2008 biometric data, §59.

  125.

    It was however not very clear from the opinion to what extent central storage and the identification functionality would be used for all visitors (e.g., to identify ‘troublemakers’ post factum). See also above.

  126.

    See on this possibility above, e.g., in Chap. 5, § 456.

  127.

    See, e.g., the Dutch DPA in its opinion on VIS 2000. See Chap. 5, §§ 508–509.

  128.

    Only a ban of the use and storage of biometric data for a specific aim, i.e. marketing purposes, was further imposed by the Dutch DPA.

  129.

    CBP Gezichtsherkenning 2004, p. 6.

  130.

    See also Part I.

  131.

    See Chap. 5, § 504.

  132.

    See Chap. 5, § 501. This problem can in the meantime be solved, as we will explain in Part III.

  133.

    These advantages include the control over his or her biometric data. See also Part III.

  134.

    See also above.

  135.

    See and compare with the recommended privacy-friendly technology as specified by the EDPS in EDPS, Video-surveillance Guidelines, 2010, pp. 12–13.

  136.

    CBPL, Opinion N°17/2008 biometric data, § 7.

  137.

    CNIL, Biométrie: des dispositifs sensibles soumis à autorisation de la CNIL, 7.04.2011, available at http://www.cnil.fr/en-savoir-plus/fiches-pratiques/fiche/article/biometrie-des-dispositifs-sensibles-soumis-a-autorisation-de-la-cnil/?tx_ttnews%5BbackPid%5D=91&cHash=33c56bf40f

  138.

    The societal issues raised and decided under Article 8 ECHR cover many domains, e.g., issues on rights of suspects, discrimination of unmarried mothers, implantation of embryos, treatment of homosexuals, etc.

  139.

    The use of surveillance cameras is another example which caught the attention of the Court in Peck.

  140.

    The crucial role of the DPAs in data protection was also recognized by the EDPS. See P. Hustinx, The European Approach: Regulation through Protection Authorities, 8 November 2005, p. 1, addressing the Colloquium ‘Information technologies: Servitude or Liberty?’ in Paris, available at https://secure.edps.europa.eu/EDPSWEB/edps/cache/off/EDPS/Publications/SpeechArticle/pid/102 (‘Hustinx, Regulation through Protection Authorities, 2005’): ‘The way in which data protection authorities fulfill their tasks in that context, is a key factor in the success of the European model and has allowed, for instance, dealing with competing interests and new developments in a flexible way’ (emphasis added).

  141.

    This may be different for other courts, for example the Constitutional Court in Germany or in Belgium. The latter has annulled legislation which it found in breach of the legality principle as required under the fundamental right to privacy laid down in Article 22 of the Belgian Constitution. See also, e.g., GwH (Belgium), N° 59/2010, 27 May 2010, as discussed above in Chap. 4, at § 27.

  142.

    Von Hannover 2004, § 70.

  143.

    See Part I, Chap. 3, § 437.

  144.

    See, e.g., Th. Murphy and G. Ócuinn, ‘Works in Progress: New Technologies and the European Court of Human Rights’, Human Rights Law Review 2010, pp. 601–638. Some authors, however, were less enthusiastic or at least had mixed thoughts about the ‘wild interpretative method’ used by the Court. See De Hert, Balancing security and liberty, p. 74 et seq. See also below.

  145.

    See on this aspect, e.g., Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe 2005, pp. 69–115.

  146.

    Free translation from J. De Meyer, ‘Quelques aspects de l’action de la Cour européenne des droits de l’Homme’, E.D.C.E. 1989, p. 269. De Meyer was judge at the ECtHR from 1986–1998.

  147.

    See, e.g., Docquir, who apparently adheres to this view. Docquir, Vie Privée, 2008, p. 111.

  148.

    See Velu, Le droit au respect de la vie privée, 1974, p. 56.

  149.

    Doc. DH/Exp. (70) p. 15, § 37, mentioned and discussed by Velu, in particular Velu, Le droit au respect de la vie privée, 1974, p. 56, footnote 113. Whether the drafters of the Directive 95/46/EC have actually done so, however, is not clear. Preparatory documents relating to the Directive 95/46/EC need to be further researched and analyzed for this purpose.

  150.

    See also our recommendation in Part III for an obligation to provide alternative means if consent is relied upon.

  151.

    See, for example, also De Bot, e-government, 2005, p. 38, no 81. The author seems to introduce this check in a weakened form, i.e., only the proportionality check under the requirement that the purposes have to be legal or justified (‘wettig of gerechtvaardigd’). For another example of the influence of Art. 8 ECHR upon the Directive, see the provision stating that upon ‘reasons of substantial public interest’ exemptions from the prohibition to process sensitive data may be provided for by decisions of the supervisory authority (Art. 8(4) Directive 95/46/EC). This notion (‘substantial’) in the Directive includes in our view a reference to the need for proportionality. About this article and the Proposal for Regulation, see footnote 199 below. For an additional example of the influence of Art. 8 ECHR, see Article 13 Directive, also mentioned in Chap. 5, footnote 280 above.

  152.

    CNIL, Communication central storage fingerprint 2007, p. 7.

  153.

    See Art. 7 Act N° 78-17.

  154.

    See Chap. 4, §§ 18–19. See also Chap. 5, §§ 319–322.

  155.

    See CNIL, Guide pour les employeurs et les salaries, p. 36.

  156.

    CBPL, Advice N° 17/2008, §53.

  157.

    See also above. At the same time, the DPAs do not seem to seriously consider biometric data to be sensitive, and rather mention this in passing.

  158.

    We also refer to our discussion of the interests of the data controllers and of the data subjects which may be involved in particular biometric applications in cases discussed in Part III. However, our analysis remains a first estimation and an attempt to render some motives more explicit. That the involvement of stakeholders and the related interests in the introduction of new technologies and applications is often (very) complex is also shown, for the case study of the introduction of a chip card for public transportation in the Netherlands, including an overview of the parties involved, in Wetenschappelijke Raad voor het Regeringsbeleid, iOverheid, Amsterdam, Amsterdam University Press, 2011, p. 114 and Fig. 4.1.

  159.

    Van Gerven, Proportionality. National Viewpoints, in Ellis, The Principle of Proportionality, 1999, p. 61.

  160.

    See also Van Gerven, Proportionality. National Viewpoints, in Ellis, The Principle of Proportionality, 1999, pp. 60–61.

  161.

    See the case Scheichelbauer v. Austria, no. 2645/65 discussed by Velu, Preadvies, 1974, p. 67. The case involved the use in court of a recording as evidence and required balancing of the rights of the accused against the public interests in the elucidation of crime and the apprehension of offenders. This balance was required because of the absence of a (common) European ground for excluding unlawfully obtained evidence.

  162.

    In its commencement, it refers generally to the Convention N° 108 of 1981 (to be distinguished from the ECHR). See also a review by the CNIL of the processing of sensitive data, in which the CNIL relied directly on the aforementioned Convention to impose written consent: CNIL, Déliberation 88-125, 22.11.1988, Expertises 1988, p. 386.

  163.

    See also Part I, Chap. 3, § 430 and § 455.

  164.

    Cons. const. (France), n°2004-505 DC, 19 November 2004, Traité établissant une Constitution pour l’Europe. See also Burgorgue-Larsen, L’appréhension constitutionnelle de la vie privée en Europe 2005, pp. 105–107. Although the Act N° 78-17 states in Article 1 that informatics should not affect ‘human rights, private life, individual or public liberties’, references to the rights as laid down in the Convention in application of the Act N° 78-17 were rare.

  165.

    Y. Détraigne and A.-M. Escoffier, Rapport N° 441, Senate, 2008–09, p. 39, referring to A. Türk, president of the CNIL, explaining the four important principles, including the proportionality principle, which are not explicitly stated in the Act N° 78-17 (but ‘transparaissaient néanmoins en filigrane ‘) and which were clearly confirmed since the Directive 95/46/EC and the Act of 6 August 2004 (modifying the Act N° 78-17).

  166.

    See, on the (complex) legal system and constitutional review in France, Part I, Chap. 3, § 455.

  167.

    See above.

  168.

    See CBPL, Opinion N°17/2008 biometric data, § 40.

  169.

    See Article 29 Data Protection Working Party, Opinion 8/2001 on the processing of personal data in the employment context, WP 48, 13 September 2001, p. 23 (‘WP 29 Opinion employment context 2001 (WP48)’).

  170.

    See Šušnjar, Proportionality, 2010, p. 297 et seq. The author discusses various objectivity standards of legal reasoning combined with relevant constitutional requirements (such as e.g., the separation of powers).

  171.

    See also Emilou, The principle of proportionality in European Law, 1996, pp. 26–37 and pp. 62–63.

  172.

    For the speeches of the EDPS, see http://www.edps.europa.eu/EDPSWEB/edps/EDPS/Publications/SpeechArticle. The EDPS has repeated its concerns in relation to the Proposals for Reform 2012. See EDPS, Opinion reform package 2012, 75 p. The EDPS therein stated, with regard to the delegated acts by the Commission (see also Part III, Chap. 7, § 21), that it ‘should be ensured that the essential elements are sufficiently defined in the legislative act’ (p. 33). On this particular subject, see also WP 29 Opinion further input reform discussions 2012 (WP199), pp. 10–12 and Part III, Chap. 8, § 216.

  173.

    This criticism is reinforced by the fact that there is no further motivation for the particular criteria in these UAs.

  174.

    See also Kumm stating about the rationalist conception of human rights and the principle that this leads to ‘a massive empowerment of the judicial branch’ (see Kumm, New Human Rights Paradigm, 2010, p. 112); see, in relation more in particular with the principle of proportionality in Union law, Jacobs, Recent Developments Principle of Proportionality, in Ellis, the Principle of Proportionality, 1999, p. 20.

  175.

    Article 8(4) of the Directive 95/46/EC. This is an optional exemption. For critical comments of the EDPS on the broad use of the notion of public interest in the Proposals for Reform 2012, allowing for exemption to the main principles without a further definition of these public interests, see EDPS, Opinion reform package 2012, p. 14.

  176.

    See Chap. 5, §§ 304–318.

  177.

    See Chap. 5, § 305.

  178.

    Advocate General’s Opinion, Scarlet v. Sabam, 2011, § 113.

  179.

    Kumm, New Human Rights Paradigm, 2010, p. 110.

  180.

    But: see Kumm, discussing the proportionality principle in a changing context from legalism to rationalism in Union law, stating that the ECJ emphasizes that what counts as proportional shall be assessed in the light of the objectives of the Union, from which it follows that within the rationalist conception of human rights there is only limited space ‘for the kind of inspiration by Member States’ constitutional traditions’. Kumm, New Human Rights Paradigm, 2010, p. 113.

  181.

    See also Part III, Chap. 7, footnote 339. Similar diverging opinions were reported by controllers intending to install the same system (in particular an access control system) in different Member States, but this often remains undocumented.

  182.

    This is contrary to the French data protection requirements.

  183.

    See Chap. 5, §§ 427–428. See also the changed position of the French DPA with regard to the use of hand geometry biometric systems for time and attendance control, as discussed in Chap. 5, § 439 et seq.

  184.

    See also Kumm, New Human Rights Paradigm, 2010, p. 117, referring to overburdened courts.

  185.

    Kumm, New Human Rights Paradigm, 2010, pp. 106–110. He hereby states that ‘interests protected as rights and countervailing policy considerations compete on the same level and are subject to the same equation within proportionality analysis. There is no built-in priority for claims that involve an infringement of the scope of a right’.

  186.

    Kumm, New Human Rights Paradigm, 2010, p. 110 and p. 113.

  187.

    See G. González Fuster and P. De Hert, ‘PNR and Compensation’, in Lodge, J. (ed.), Are you who you say you are? The EU at Biometric Borders, Nijmegen, Wolf Legal Publishers, 2007, pp. 101–111 and referring to the opinion of the Advocate General Léger in ECJ, PNR case 2006 (see also Part I, Chap. 3, footnote 118).

  188.

    Van Drooghenbroeck states it as follows: ‘(…) certains ont pu laisser entendre que le jugement de proportionnalité se réduirait à une pure comparaison de faits évacuant toute subjectivité de celui qui le pose. (…) Il va de soi qu’une telle présentation est profondément naïve’. Van Drooghenbroeck, La proportionnalité. Prendre l’idée simple au sérieux, 2001, p. 15.

  189.

    E.g., the opinions of the Dutch DPA or the advice of the Belgian DPA on the necessity.

  190.

    Before, however, there was discussion amongst legal authors and in case law on the existence of such positive obligation.

  191.

    Storck 2005, § 149. The Court found in that case that the State failed to provide effective control over private psychiatric institutions at the relevant time and failed to protect the applicant, who was detained against her will in a private psychiatric institution and was given medication that harmed her health, against interferences with her private life as guaranteed by Article 8 §1 ECHR. The Court stated that private psychiatric institutions, in particular those where persons are held without a court order, needed not only a license, but also competent supervision on a regular basis as to whether the confinement and medical treatment is justified (§103).

  192.

    Ibid. § 101. It should be noted that some may argue that this positive obligation allows that individuals have a legal case against their State before the ECtHR, in case of non-respect of their fundamental rights by other private parties, without a further need for horizontal effect of Article 8 ECHR. We believe that such position however does not pay full respect to the fundamental human rights as core values in a democratic society (see Part I), and these rights are therefore also in our opinion applicable in relations between private parties. On this basis, we therefore defended above the need for a double proportionality review, also for relations between private parties.

  193.

    Storck v. Germany 2005, § 103.

  194.

    K.U. v. Finland 2008, § 43. In this case, unknown person(s) had placed an advertisement on a dating site on the Internet in the name of a minor of 12 years old, mentioning his age, year of birth, a detailed description of his physical characteristics and a link to a webpage of the minor with his picture and phone number (which was accurate save for one digit). The advertisement, stating that the boy was looking for an intimate relationship, was placed without the knowledge of the minor. See also Chap. 4, § 35. For other cases in which the Court found that the defending state omitted to take effective steps to protect the rights of the applicants, see, e.g., also ECtHR, I. v. Finland, no. 20511/03, 17 July 2008 (‘I. v. Finland 2008’) and ECHR, Costello-Roberts v. United Kingdom (no 13134/87 of 25.03.1993), where the State was held responsible for the act of a headmaster of an independent school on account of its obligation to secure to pupils their rights of inter alia Article 8 ECHR (§§ 27–28).

  195.

    On the margin of appreciation, see also above.

  196.

    W. Van Gerven, ‘Principe de proportionnalité, abus de droit et droits fondamentaux’, J.T. 1992, p. 309.

  197.

    E.g., freedom of expression and confidentiality of communications. About the diverging views on which interests have to be taken into account, see Chap. 5, § 321 and in particular footnote 269 and §§ 614–615 above.

  198.

    See also Part III, Chap. 7, §§ 163–168.

  199.

    Art. 8 (4) Directive 95/46/EC. Such reasons for substantial public interest seem to coincide with the aims mentioned in Art. 8 § 2 ECHR. See De Bot, Verwerking persoonsgegevens, 2001, pp. 151–152. This ‘general’ provision is no longer contained in European Commission, Proposal for General Data Protection Regulation COM(2012) 11 final. See also and compare with the comments and criticism of the EDPS, requesting ‘an additional, separate provision which exhaustively lists the grounds of public interest (…)’: EDPS, Opinion reform package 2012, pp. 53–54. Exemptions for ‘public interest’ by law (and no longer by decision of public authority) are recovered in part for personal data concerning health in the proposed Art. 81, 1 (b) and (c) for well determined purposes. See also and compare with Art. 9 (j) of the same proposal on data in relation to criminal convictions and referring to ‘important public interest reasons’. The well determined purposes in the proposed Art. 81, 1 (b) and (c) in our opinion however do not fit for biometric applications.

  200.

    The notion would hence include at least the notions under Art. 8 § 2 ECHR discussed above (see Chap. 5, §§ 329–332) but seems to be broader.

  201.

    Compare with the Communication of the CNIL relating to the central storage of fingerprint. The CNIL explains that the central storage of fingerprint can only be legitimate and justified if there is an ‘important necessity for security reasons’ (‘un fort impératif de sécurité’) which surpasses the strict interests of the organization. The illustrations given by the CNIL in this Communication, however, place the enforcement of access control to specific places of a nuclear power installation at the same level as the need to enforce the security of a room of an intellectual property advisor. This is in our view confusing, and the examples given dilute the previous criteria.

  202.

    E.g., reinforcement of the access control to specific areas in nuclear plants, after incidents have proven that the access control needs to authenticate authorized personnel in an improved way, may, upon further conditions, be in the substantial public interest of having secured access to nuclear power installations.

  203.

    E.g., the use of facial recognition systems could, for reasons of substantial public interest, be allowed in particular circumstances without consent for a limited time.

  204.

    See Part III.

  205.

    See also Cons. const. (France) n°2012-652, 22 March 2012 (Loi protection de l’identité), also mentioned in Part III, Chap. 7, § 186, § 6. See also Ergec, Les libertés fondamentales et le maintien de l’ordre dans une société démocratique, in Ergec, et al., Maintien de l’ordre et droits de l’homme, 1987, p. 31: ‘Enfin, la règle de la proportionnalité suppose que l’ingérence dans les libertés soit assortie de garanties adéquates contre les abus. (…) Bien que le contrôle juridictionnel offre la garantie la plus efficace, il peut, dans certaines circonstances, être supplanté par un contrôle parlementaire ou même administratif présentant un minimum de garanties. (…)’.

  206.

    CNIL, Panthéon-Assas-Paris II University and the French Senate, Servitude ou libertés, 2005, p. 4: ‘La CNIL ne se prononce pas seulement en légalité pure et encore moins en opportunité. Lorsque l’on a expliqué à la CNIL quelle était la finalité d’un projet, elle se prononce en termes de proportionnalité. Mais tous les juristes qui sont présents dans la salle savent à quel point la notion même de proportionnalité emprunte à la fois au concept de légalité et au concept d’opportunité’.

  207.

    The EDPS pleads in the discussion about the reform of the Directive 95/46/EC for clarifying this principle in explicit provisions and being ‘as precise as possible with regard to the core elements determining the lawfulness of data processing’ and further omitting this term. EDPS, Opinion 14.01.2011 on a Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions – “A comprehensive approach on personal data protection in the European Union”, p. 13 (‘EDPS, Communication on Personal Data Protection, 2011’); see also Part III.

  208.

    That this is relevant is all the more apparent because the fundamental right to the protection of personal data shall be considered in relation to its function in society. ECJ, Volker und Markus Schecke, 2010, § 48. The Court hereby referred to ECJ, C-112/00 Schmidberger, ECR 2003, p. I-5659, § 80.

  209.

    See also Gutwirth, De toepassing van het finaliteitsbeginsel, 1993, p. 1444: ‘De belangenafweging en de toepassing van het proportionaliteitsbeginsel zullen bijgevolg aan de orde zijn, en dat wel onverminderd het effect van de voorijking van de balans: er moet een doortastend belang aanwezig zijn om een inmenging in het fundamenteel recht op privacy van de burgers te rechtvaardigen’ (emphasis added) (in a previous version of the article by Gutwirth, however, the words ‘hoger belang’ were used). See also the position of the CNIL in CNIL, Communication central storage fingerprint 2007, discussed above.

  210.

    See also Kumm, New Human Rights Paradigm, 2010, p. 115.

  211.

    Ibid., p. 107.

  212.

    See Chap. 5, § 329.

  213.

    See, e.g., some opinions of the CNIL which we criticized.

  214.

    See, e.g., the opinions of the CBPL and the CBP on the use of a similar access control system VIS 2000 discussed above.

  215.

    For example, in relation with face recognition.

  216.

    See and compare also with Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, pp. 59–61.

  217.

    A restriction of the use of biometric applications to particular legitimate aims also implies that the use of biometric technology would always infringe fundamental rights. If appropriate safeguards are defined and respected (e.g., local storage under the control of the data subject, use of protected templates, etc.), this may no longer be the case. Only in the hypothesis that the safeguards would not be respected or do not fit (e.g., the controller requires a database for protecting the rights of others) is a law defining the legitimate aim(s) for interference with Art. 8 ECHR required. For practical examples, see Part III.

  218.

    See also Gutwirth, De toepassing van het finaliteitsbeginsel, 1993, p. 1456 et seq.

  219.

    Art. 8 (4) Directive 95/46/EC. About this article and the Reform proposals, see also footnote 199 above.

  220.

    For examples, see Part III, Chap. 7, section 7.3.

  221.

    See and compare, e.g., with legislation adopted in Slovenia (see also Part III, Chap. 8, § 218).

  222.

    About ambient intelligence environments and the role of the body, see, e.g., the research conducted in the EU-funded project ACTIBIO (2008–2011), also mentioned in Part I, Chap. 2, footnote 64. ACTIBIO researches the combined use of various biometric characteristics, including dynamic and soft biometric characteristics, in combination with “always on” networks and service infrastructures. The integration of biometric methods with this so-called Ambient Intelligence security infrastructure allows continuous verification of identity and identification and monitoring of individuals. This new ‘smart environment’ poses various legal and other issues. See, on this topic, e.g., M. Hildebrandt, ‘Privacy en identiteit in slimme omgevingen’, Computerrecht 2010, pp. 273–282.

  223.

    See also C. Lobet and Y. Poullet, ‘The Challenge of the Interdisciplinarity’, presentation at 30 years CRID, 20-22 January 2010; see also AFIS and Biometrics Consulting, Inc., Biometric identification on Cloud Computing. A solution for major government identification initiatives: Process terabytes of biometrics data rapidly using clusters of hundreds to thousands of nodes, available at www.afisandbiometrics.com/biometric_identification_on_cloud_computing

  224.

    JRC, Biometrics at the Frontiers, 2005, p. 98.

  225.

    Gutwirth, De toepassing van het finaliteitsbeginsel, 1993, pp. 1446–1447: ‘Voorbij die vastgestelde noodzakelijkheid, zal de verwerking nochtans alleen wettig zijn, indien zij de proportionaliteitstoets doorstaat: de verwerking mag – niettegenstaande zijn “interne” noodzaak – geen disproportionele inmenging betekenen in de privacy van de burgers (…)’ and ‘De materiële of inhoudelijke wettigheidsvereiste is ook toepasselijk in de private sector. De finaliteit van de verwerking zal hier in de eerste plaats moeten overeenstemmen met de finaliteit van de activiteit van de verantwoordelijke houder. (…) Daarenboven zal de (“externe”) proportionaliteitstoets sensu stricto ook doorgevoerd moeten worden: de verwerking en zijn finaliteit mogen niet op disproportionele wijze indruisen tegen de fundamentele vrijheid van de privacy van de burgers. (…)’ (footnotes mentioned by Gutwirth to Poullet and Léonard in the citation omitted). While we fully agree with the principle of the need for a double proportionality check, we do not, however, adopt the terminology of a ‘material’ or ‘internal’ and ‘external’ lawfulness or proportionality check.

  226.

    Gutwirth, De toepassing van het finaliteitsbeginsel, 1993, p. 1436, referring to Van Gerven, including to W. Van Gerven, Hoe blauw is het bloed van de prins, Antwerpen, Kluwer, 1983, p. 16, nos. 44–45.

  227.

    Hossein, Privacy as Freedom, 2006, p. 143. Hossein also states that our attitudes toward privacy change in the face of technology, as they have changed since the advent of cameras, the tabloid press, the telegraph, databases and the Internet. Hossein also gives the example of the VISIT system, whereby visitors to the U.S. grow accustomed to submitting their fingerprints. In this case, they are ‘less likely to be offended when their home governments require their fingerprints for more general purposes’. Ibid., p. 143.

  228.

    See Kumm above. However, biometric data processing should be considered in our view as seriously affecting fundamental rights.

  229.

    These legitimate grounds apply to the processing of data in both the private and the public sector.

  230.

    See Part I, Chap. 3, § 238.


Copyright information

© 2013 Springer Science+Business Media Dordrecht


Cite this chapter

Kindt, E.J. (2013). Strengths and Weaknesses of the Proportionality Principle for Biometric Applications. In: Privacy and Data Protection Issues of Biometric Applications. Law, Governance and Technology Series, vol 12. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7522-0_6
