The Criteria for the Correct ‘Balancing of Rights’

Privacy and Data Protection Issues of Biometric Applications

Part of the book series: Law, Governance and Technology Series (LGTS, volume 12)


Abstract

In this Chapter, the author lists – based on the research and results described in the previous Chapters – a first set of criteria that matter for biometric applications and for limiting interference with the fundamental rights and freedoms of the data subjects involved. Besides some criteria on which a consensus seems to have grown over the years, such as the use of templates, the author stresses the importance of the distinct use of the verification and identification functionalities and of control by the data subject. Other criteria apply as well. While the criterion of biometrics ‘which leave traces’ is criticized, the author also pays attention, for example, to the use of biometric data in an anonymous and pseudonymous way and to the need for effective systems.

This Chapter concludes with an application of the principles and criteria, distilled from the existing legal provisions and the practice of the data protection authorities, to day-to-day practical cases, including access control for employees, access to private clubs and for customers, the use of facial images on social networks, use at public events and biometric data processing for research.


Notes

  1.

    I.e., the review of the legitimate basis.

  2.

    About all (legal) requirements, see also Part I, Chap. 3, §§ 378–388.

  3.

    See and compare with Verslag Noreilde, in which the use of mobile cameras is discussed. These criteria were later incorporated into modifications to the camera surveillance legislation of 2007; see also Gutwirth, De toepassing van het finaliteitsbeginsel, 1993, discussing in depth the finality and purpose principle and the proportionality principle.

  4.

    See also Part II, Chap. 5, §§ 250–251.

  5.

    See also the comments of the German Constitutional Court in its decision of 2 March 2010 requesting more specific safeguards (e.g., security measures, detailed description of use, etc.) in relation to the data retention legislation (see Part II, Chap. 5, footnote 296).

  6.

    See EU Commission, Impact Assessment, available at http://ec.europa.eu/governance/impact/index_en.htm and the various initiatives in this regard. The Commission’s initiatives, however, are not so much directed at the use of IAs for assessing privacy and data protection. About the launch of the Impact Assessment in the Union, see also the Communication of the Commission in this regard, COM(2002)276 of 5 June 2002, available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:52002DC0276:EN:NOT

  7.

    See, e.g., in relation to another identification technology, the Industry Proposal for an IA for RFID Applications, as commented by the Article 29 Working Party: Article 29 Data Protection Working Party, Opinion 9/2011 on the revised Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications, 11.02.2011, 7 p. and Annex, 24 p. (‘WP 29 Impact Assessment Framework for RFID Applications (WP180)’). The criteria discussed therein aim mainly at listing the risks and at limiting the interference with privacy rights by applying specific measures.

  8.

    See and compare with EDPS, Video-surveillance Guidelines, 2010, p. 30. The likelihood of impostors could also be included in this assessment (see on this particular aspect, Part I, Chap. 2, footnote 139).

  9.

    See EPEC, Study for the Extended Impact Assessment of the Visa Information System. Final Report, 2004, 95 p.; see also and compare with the Impact Assessment Framework developed for RFID Applications. See also the EDPS deploring the lack of an IA for the modification of the Regulation No 2252/2004: EDPS, Opinion of 26 March 2008 on the proposal for a Regulation of the European Parliament and of the Council amending Council Regulation No 2252/2004, O.J. C 200, 6.08.2008, pp. 1–5. For the Opinion of the Article 29 WP, see Article 29 Data Protection Working Party, Opinion 9/2011 on the revised Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications, WP180, 11.2.2011, 24 p. (‘WP 29 Opinion Revised Impact Assessment Framework RFID 2011 (WP180)’).

  10.

    See below, § 21. About privacy impact assessment in general, see D. Wright and P. De Hert, (eds.), Privacy Impact Assessment, Springer, 2012, 519 p. See also the reports of the PIAF project, available at http://www.piafproject.eu/

  11.

    About the debate whether the balancing should review instruments and/or values or rights, see, Van Gerven, Het proportionaliteitsbeginsel, 1995, (1), pp. 8–14.

  12.

    This is also expressly stated in Article 8 § 2 ECHR and in Articles 7 and 8 EU Charter juncto Article 52 EU Charter.

  13.

    See also R. Drulàkovà, Post-democracy within the EU: Internal security vs. human rights – unavoidable conflict? Paper prepared for the CEEISA 4th Convention, 25–27.06.2006, 23 p., available at http://www.ceeisaconf.ut.ee/109100. The author uses the term ‘post-democracy’, hereby referring to a society where ‘the public opinion and influence on public affairs are reduced by the powerful minority of political elites’, whereby the latter ‘do not keep in line with the public opinion and expertise more and more issues’, until the public audience ‘gives up on a possibility to influence the public affairs at all’. About the ‘right’ to security, see also Part II, Chap. 5, § 335.

  14.

    See, e.g., B. Hayes, ‘There is no “balance” between security and civil liberties – just less of each’ in Essays for civil liberties and democracy in Europe, 2006, available at http://www.ecln.org/essays/essay-12.pdf; see also A. Cavoukian, A. Stoianov, and F. Carter, ‘Biometric Encryption: Technology for Strong Authentication, Security AND Privacy’ in E. de Leeuw, Fischer-Hübner, S., Tseng, J., Borking, J. (eds.), IFIP. Policies and Research in Identity Management, Boston, Springer, 2008, pp. 57–77 (‘Cavoukian, Stoianov and Carter, Biometric Encryption, 2008’); see also Cavoukian and Stoianov, Biometric encryption, 2007.

  15.

    See Verslag Noreilde, pp. 57–61.

  16.

    Ibid., pp. 57–61. See also p. 26 and footnote 1.

  17.

    See (previous) Article 5 § 2 of the original Act of 21 March 2007. This requirement to obtain the positive opinion of the responsible police officer, referring to such a security and effectiveness report, has however been removed after modification of the legislation.

  18.

    See, e.g., in Alberta, Canada, where the privacy commissioner pleaded for switching off surveillance cameras in public places. See J. Van Rassel, ‘Crime cameras should go, says privacy czar’, 24.07.2010, Calgary Herald, available at http://www2.canada.com/calgaryherald/news/city/story.html?id=32a0a1bf-e32c-46ea-b99d-bf25e1dc68e0. In Germany, cameras have reportedly also been de-installed. There are other examples. Because speed cameras were deemed ineffective in convincing drivers to moderate their speed, such cameras were removed in the United Kingdom by the council of Swindon, as reported. One of the reasons, however, seems to have been changes in the financial streams relating to the cameras (both in terms of ‘gains’, which were transferred to the national level, and maintenance, for which the city remained responsible), and the decision was criticized. See also M. Weaver, ‘More councils expected to ban speed cameras’, Guardian.co.uk, 23.10.2008, available at http://www.guardian.co.uk/politics/2008/oct/23/localgovernment-motoring

  19.

    This question has to be distinguished from the security aspects of biometric systems in general, i.e. the issue as to how a biometric system shall be protected from unauthorized access and processing of the biometric data. On this issue, see below.

  20.

    For an example showing that citizens value privacy and biometric data as important, see Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, mentioned and discussed in § 70 below.

  21.

    See Coudert, Pattern recognition technologies, 2010, pp. 379–380. See also for a critical view on this aspect F. Dumortier, L’utilisation de la biométrie et des RFIDs, 2009.

  22.

    See, e.g., Cavoukian and Stoianov, Biometric encryption 2007, pp. 7–10.

  23.

    Some debate, however, has taken place in the U.K. in relation to the introduction of the Identity Card Bill and about some large-scale systems, such as SIS II, and to a limited extent in France and Germany. See also Kindt and Müller, Privacy legal framework for biometrics, Fidis, D.13.4, 2009, in which the debate (or lack thereof) for the countries studied is mentioned; on the need for debate, see also Kindt, Biometrie? Hoog tijd voor een debat, 2006, p. 80; see also, about the lack of debate when introducing the biometric ePassport in the Netherlands, in particular about the technical aspects, or more precisely the lack of review and discussion of the technical aspects, Snijder, Crash of zachte landing, 2010, pp. 118–126; about the adoption of the Identity Card Bill and the inclusion of biometric identifiers in the ePassports in general, see below §§ 178–189.

  24.

    See Part II, Chap. 4, § 204.

  25.

    The Madrid Privacy Declaration, Global Privacy Standards for a Global World, 3 November 2009, available at http://thepublicvoice.org/madrid-declaration/

  26.

    See Council of Europe, The need for a global consideration of the human rights implications of biometrics, 2011. About this call, see also Chap. 9.

  27.

    See also P. de Hert and A. Sprokkereef, ‘Biometrie en recht in Nederland’, in Computerrecht 2008, p. 300. The authors state that it is time to determine what constitutes legitimate use of biometric data, as follows (translated from the Dutch): ‘There is more need for clarifying what legitimate and illegitimate forms of biometric use are. Proportionality questions must, in other words, be central in legislative initiatives. (…) Is it time for a prohibition of central storage of biometric data, because in practice decentralized storage can deliver the same result? Should there be an obligation to work with templates rather than with the raw biometric material? Are there forms of biometric use, for example fingerprints, that should in principle be prohibited in favour of biometric alternatives? Those are the questions we should be concerning ourselves with.’

  28.

    See and compare also with R. Stone, Police powers in the context of terrorism, p. 10, available at http://eprints.lincoln.ac.uk/3144/1/terrorismarticle_(3).pdf

  29.

    As to the lawfulness of identification and identity control, see Part II, Chap. 4. For this reason, we argue in Part III that if the use of biometric samples is not necessary, a prohibition on storing such samples shall be adopted. See and compare also with the use and retention, in most cases, of DNA profiles rather than DNA samples in databases for identification purposes.

  30.

    LRDP Kantor and Centre for Public Reform, Comparative study on different approaches to new privacy challenges, in particular in the light of technological developments, Final report, Brussels, European Commission, 20.01.2010, p. 32 (‘LRDP and Centre for Public Reform, Comparative study, 2010’).

  31.

    The Hague Programme also introduced the idea of the use of biometric identifiers for passports and of a visa information system. See also, Part I, Chap. 2, § 145 and footnote 200.

  32.

    The Hague Programme, 2005, para. 2.1.

  33.

    See Commission, Proposal for a Council framework decision on the exchange of information under the principle of availability, 12.10.2005, COM(2005)490 final, 33 p. (see Annex II), also available at http://eurlex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:52005PC0490:EN:NOT

  34.

    See above, Part II, Chap. 2. It has been argued that the Treaty was not about availability, because the Treaty uses a system of indirect access through reference data and would not eliminate borders for police information (see House of Lords, Prüm: an effective weapon against terrorism and crime?, London, HL Paper 90, 9 May 2007, p. 10 (‘House of Lords, Prüm, 2007’), also available at http://www.publications.parliament.uk/pa/ld200607/ldselect/ldeucom/90/90.pdf), but this could in our view be doubted.

  35.

    See also above Part I, Chap. 3, § 396, where the Framework Decision 2008/977/JHA is also fully cited.

  36.

    For critical observations, however, also about the legality, see, e.g., House of Lords, Prüm, 2007, 98 p.

  37.

    See, e.g., Brouwer, De Hert and Saelens, Ontwerp-Kaderbesluit, 2007; De Hert and Riehle, Data Protection, 2010; see also about the need for regulation in the previously named Third pillar, Hijmans, Recent developments, 2010, p. 224. The author seems to hold that the reasoning that police data are ‘wholly different’, which might have been defended at the time of the adoption of Directive 95/46/EC, can no longer be upheld today. He states that ‘the exclusion of police cooperation and judicial cooperation in criminal matters in Directive 95/46/EC was the consequence of the pillar structure under the old regime of the Treaties, not of the fact that police and judicial are wholly different’.

  38.

    EU Council, The Stockholm Programme – An open and secure Europe serving and protecting citizens, O.J. C 115, 4.05.2010, p. 18 (‘Stockholm Programme’). The Stockholm Programme followed up the Hague Programme (from 2004 to 2009), which was preceded by the Tampere Programme (1999–2003). At the same time, it is recognized that there is a need ‘to consolidate, to take stock of the measures in place and to evaluate the current instruments, in order to assess whether the instruments function as originally intended and how they can be improved in order to lay ground for a coherent development of all existing and future information systems.’ See also Commission, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the regions, Delivering an area of freedom, security and justice for Europe’s citizens. Action Plan Implementing the Stockholm Programme, 20.4.2010, p. 6, COM(2010) 171 final (‘Commission, Action Plan implementing Stockholm programme, COM(2010) 171 final’).

  39.

    See also the message of the Council of Europe, in particular that it is possible to combat terrorism while respecting fundamental rights.

  40.

    This is also one of the advantages of the double proportionality test in case Directive 95/46/EC would not apply, such as in the previously named Third pillar matters.

  41.

    See the European DPAs Conference’s Common Position on the use of the availability principle adopted during the conference in Cyprus, 10–11 May 2007, 21 p., available at https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Cooperation/Conference_EU/07-05-11_Larnaca_availability_EN.pdf The declaration provides a checklist for assessing proposals in the area of law enforcement and combating terrorism as to necessity and proportionality.

  42.

    See about this observation and compare also with D. Wood, A report on the Surveillance Society, 2006, Surveillance Studies Network, p. 18, available at http://www.ico.gov.uk/upload/documents/library/data_protection/practical_application/surveillance_society_full_report_2006.pdf (‘Wood, Surveillance Society, 2006’): ‘There is evidence of a shift of military supply and arms companies towards exploiting the civilian market, and indeed of creating new markets for innovative products that are no longer purely military or civilian’. The same report gives examples there, e.g., of a major partner in US Defense contracts which became a leader in civilian biometrics. For similar critical observations, see also Korff, Automated processes of identification, behavioral analysis and risk detection, 2010, pp. 35–36.

  43.

    These applications are now less needed in the military for several reasons, including the transition to the post-Cold War period.

  44.

    Commission, Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions. A Comprehensive approach on personal data protection in the European Union, 4.11.2010, COM(2010) 609 final, p. 5 (‘Commission, Communication. Personal Data Protection, 2010’).

  45.

    LRDP and Centre for Public Reform, Comparative study, 2010, p. 33.

  46.

    See, e.g., various studies made by stakeholders, some of which are mentioned in Part II, Chap. 6, footnote 74.

  47.

    For a selection only and as already mentioned in distinct places throughout this treatise, see, e.g., Dijstelbloem and Meijer, De Migratiemachine, 2009; E. Brouwer, ‘The use of biometrics in EU data bases and identity documents. Keeping track of foreigner’s movements and rights’, in J. Lodge (ed.), Are you who you say you are? The EU and Biometric Borders, Nijmegen, Wolf Legal Publishers, 2007, pp. 45–66; De Leeuw, Biometrie en nationaal identiteitsmanagement, 2007, pp. 50–56; P. De Hert, W. Schreurs and E. Brouwer, ‘Machine-readable identity documents with biometric data in the EU: Overview of the legal framework’, Keesing Journal of Documents and Identity, Issue 21, 2006, pp. 3–10; E. Guild, ‘Chapter IV. Unreadable Papers? The EU’s first experiences with biometrics: Examining EURODAC and the EU’s Borders’, in J. Lodge (ed.), Are you who you say you are? The EU and Biometric Borders, Nijmegen, Wolf Legal Publishers, 2007, pp. 31–43.

  48.

    Meaning, in terms of organizational or shareholder control. In some cases, however, the State may be a minority shareholder of such actors and one could debate whether such actors should in these cases be excluded from our concept of ‘private actor’ or ‘controller in the private sector’.

  49.

    About public-private cooperation, see Part II, Chap. 5, § 331.

  50.

    For examples of cooperation between private and public sector controllers, see also Part II, Chap. 5, § 331 and our recommendations in §§ 459–461 below.

  51.

    Some schools may be organized and controlled by the State as well. If a regulation for the deployment of biometric systems in schools were adopted, it would indeed in principle not be necessary to make a distinction amongst schools.

  52.

    Some biometric applications may be promoted as being for convenience while surveillance or security purposes prevail. On the need for purpose binding, see also Council of Europe, Progress report of application of the Convention to biometric data, 2005, pp. 17–18.

  53.

    See also above.

  54.

    See also Part I, Chap. 3, § 304.

  55.

    See Part I, Chap. 3, §§ 261–263 and Part II, Chap. 4, §§ 87–91. In general, research on this topic should be continued. The reason for the limited research so far may be that the interests in knowing the results of the research diverge too much. Since more (interdisciplinary) research on these issues is needed, one can also say that presently it has not been confirmed either, with sufficient certainty, that templates still hold information concerning health or race and ethnic origin. We argue however that there are sufficient reasons for reasonable assumptions that at least particular characteristics and their templates contain such information, pressing for a cautious approach and for more research. Another aspect to be taken into account is that the differentiation between samples and templates is to some extent artificial, since there are several intermediary steps in between, from which it is not always clear where the sample ceases to be a sample and where the template starts (see the sketch below).
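    To illustrate why this boundary is gradual, the following toy pipeline may help; it is our own illustration, with invented stage names and an invented feature scheme, not any standard's reference processing chain:

```python
# Illustration only: a toy sample-to-template pipeline. Every stage discards
# information, so where the "sample" ends and the "template" begins is a matter
# of degree. All names and the feature scheme are invented for this sketch.

def pipeline(raw_image):
    """Return all intermediary representations of a toy grey-scale image."""
    stages = {"raw sample": raw_image}
    # 1. Pre-processing: normalise pixel intensities (still close to the sample).
    flat = [px for row in raw_image for px in row]
    lo, hi = min(flat), max(flat)
    norm = [[(px - lo) / (hi - lo or 1) for px in row] for row in raw_image]
    stages["normalised image"] = norm
    # 2. Feature extraction: keep only per-row averages (irreversible loss).
    features = [sum(row) / len(row) for row in norm]
    stages["feature vector"] = features
    # 3. Template: quantise the features for compact storage and comparison.
    stages["stored template"] = [round(f * 15) for f in features]  # 4-bit values
    return stages

for name, data in pipeline([[10, 200, 30], [40, 50, 250], [5, 60, 90]]).items():
    print(f"{name}: {data}")
```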

  56.

    See Part II, Chap. 4, § 90.

  57.

    See Chap. 8, §§ 315–320.

  58.

    See Part II, Chap. 5, § 425.

  59.

    See Part I, Chap. 2, § 97 and footnote 94.

  60.

    See Part I, Chap. 2, § 87, footnote 82.

  61.

    It was suggested that ‘positive identification’ should be replaced by the term ‘positive identity claim’, which refers in principle to being a source of a particular reference, because the former term creates confusion between a one-to-many comparison (identification) and a one-to-one verification (see the sketch below). See term 3.5.10 SD2 Version 12 – Harmonized Biometric Vocabulary. The term ‘positive identity claim’ is however not mentioned anymore in the adopted ISO Vocabulary for Biometrics 2012, while positive biometric identification is (see term 37.03.12, Note 2). This may in our view lead to confusion as to the functionality. See also Part I, Chap. 2, §§ 92–94.
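    A minimal sketch of the two functionalities whose conflation is at issue here; the bit-string similarity measure and the threshold are assumptions of this illustration only:

```python
# Verification is a one-to-one comparison against the template of a claimed
# identity; identification is a one-to-many search of the whole enrolled
# database. The similarity measure and threshold are illustrative only.

def similarity(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def verify(probe, claimed_id, enrolled, threshold):
    """One-to-one: does the probe match the claimed identity's template?"""
    return similarity(probe, enrolled[claimed_id]) >= threshold

def identify(probe, enrolled, threshold):
    """One-to-many: who in the whole database, if anyone, matches the probe?"""
    best = max(enrolled, key=lambda uid: similarity(probe, enrolled[uid]))
    return best if similarity(probe, enrolled[best]) >= threshold else None

enrolled = {"alice": [1, 0, 1, 1, 0, 1], "bob": [0, 0, 1, 0, 1, 1]}
probe = [1, 0, 1, 1, 0, 0]                    # noisy reading of alice's trait
print(verify(probe, "alice", enrolled, 0.8))  # True  (5 of 6 bits agree)
print(identify(probe, enrolled, 0.8))         # 'alice'
```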

  62.

    See Part I, Chap. 2, §§ 91–92. See also Kindt, Biometric applications and the data protection legislation, 2007, pp. 166–170.

  63.

    For example, the press release on the website of the Belgian DPA announcing to the general public the new opinion on biometrics, stated it as follows: ‘The privacy commission renders an opinion on how the processing of biometric data can be done in a privacy friendly way. Biometric systems are new technologies by which the identity control can be made with great certainty. (…) For this reason, the Commission stresses the importance of a careful check of the use, the desirability and the justification of these techniques by the controller. (…)’ (emphasis added). See Privacycommissie schetst kader voor verwerking van biometrische gegevens, 6 June 2008, previously available at http://www.privacycommission.be/nl/press_room/pers_bericht6.html See also the Dutch DPA, Part II, Chap. 5, § 495 and footnote 605 in particular.

  64.

    See and compare also with the EDPS, for example, stating in a recent opinion that in principle, he ‘favours the use of “one to one” search mode whereby the identification (sic) unit would compare the biometric data of the individual with a unique template (associated with the identity)’ (EDPS, Opinion on Turbine, 2011, p. 11, § 52). The reason why the EDPS prefers this functionality, however, is not so clear. One of the main reasons seems to lie in the accuracy. Ibid., p. 8.

  65.

    Wetenschappelijke Raad voor het Regeringsbeleid, iOverheid, Amsterdam, Amsterdam University Press, 2011, p. 100 (‘WRR, iOverheid, 2011’), stating (translated from the Dutch): ‘Thus parliament discussed the application of biometrics to the passport for many years, and: “over the years much conceptual confusion arose, which nobody could really pin down. The terms and ambitions became ever larger when it came to the goal that biometrics had to serve (…) (Snijder 2010:85)”’.

  66.

    Some understand by this concept the storage on a token or object under the control of the data subject, while local storage is understood by others as storage in, e.g., a sensor, a local database or an access terminal, hence implying centralized biometric data.

  67.

    See also recommendation 2 for high quality data in this respect in JRC Report Large-scale Biometrics Deployment, p. 103. For purposes of comparison with the regulation of genetic data, see also the Council of Europe, Convention on Human Rights and Biomedicine (ETS No. 164) and its Additional Protocol (ETS No. 203), which stress the importance of principles such as quality and utility.

  68.

    House of Lords, Schengen Information System II (SIS II). Report with evidence, London, House of Lords, HL Paper 49, 2 March 2007, p. 23 (testimony by professor Groenendijk). The testimony referred to scientific work by Ms. Evelyn Brouwer.

  69.

    See, e.g., the Article 29 Working Party in assessing the legitimacy and proportionality of the central storage of biometric data in VIS: ‘(…) attention should also be drawn to the possible expansion of the access scope to include entities other than those that had been envisaged initially’. WP 29 Opinion 2/2005 on VIS and exchange of data (WP110), p. 12.

  70.

    This term is used by Bromba. See Bioidentifikation. Fragen und Antworten, 44 p., last update 11.2.2011, available at www.bromba.com/faq/biofaqd.htm

  71.

    Several attacks on major central databases containing personal data have taken place in recent years and were widely reported in the press, such as the attack on and theft of personal data of gamers from the Sony PlayStation Network central game site in April 2011. See also Part II, Chap. 4, footnote 307.

  72.

    S. Kent and L. Millett (eds.), Who goes there? Authentication Through the Lens of Privacy, National Research Council, 2003, p. 123 (‘NRC, Authentication Report, 2003’).

  73.

    See above Part I.

  74.

    We explained in Part I that the verification functionality can also be used if the data are stored centrally (about the functionalities of a biometric system, see Part I, Chap. 2, §§ 85–90). However, in that case, the functionality may change overnight (see Council of Europe, Progress Report, p. 14, § 48). In our argument here, we therefore refer to the verification functionality whereby the data are locally stored on an object held by the data subject, unless indicated otherwise.

  75.

    See also Prins, Making our body identify for us, 1998, p. 163.

  76.

    See the EDPS in his opinion on the Turbine project, where the risks of biometric data are formulated in terms of risk of identification, as cited in footnote 64 above.

  77.

    E.g., the data could thereafter also be stored centrally and re-used. The CNIL is therefore very critical of the use of characteristics which ‘leave traces’, since the risks of re-use after central storage are higher for these characteristics.

  78.

    See, for an example, footnote 63 above.

  79.

    This is subject to following the other recommendations we make in Chap. 8, such as (local) storage under the control of the data subject and the use of pseudonyms. These recommendations may further ensure that, in particular cases and under conditions, the fundamental rights are respected.

  80.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 6. See and compare with the comments of the Article 29 Working Party about VIS (although not in the private sector) as well: ‘Use of biometric data for identification purposes should be limited selectively; inclusion of these data in the CS-VIS should be envisaged where it is absolutely necessary – for instance in connection with suspected procedural misuse, or else in respect of an applicant whose data are already stored in the system and whose request has been rejected for serious reasons’ (emphasis added) (WP 29 Opinion 2/2005 on VIS and exchange of data (WP110), p. 13). The reference by the Article 29 Working Party to ‘absolute necessity’ seems to be the criterion for proportionality, evaluated under Article 8 § 2 ECHR, although not expressly stated.

  81.

    See Part II, Chap. 4, §§ 32–35.

  82.

    See Part II, Chap. 4, §§ 11–24. For example, under Belgian law, the law provides that the chief of administrative police can instruct police officials to control the identity to maintain public safety (Article 34 §3 of the Act on the Police Function). See also the discussion and references to specific legislation relating to the use of camera images by police (for identification purposes) in Verslag Noreilde.

  83.

    See also and compare with the attempts of some citizens to organize the identification of ‘criminals’, e.g., shoplifters or drivers taking gasoline without paying, on the basis of surveillance camera images, by posting the images in the shop or on a website. DPAs have in the past not always reacted in a clear manner to the legality of this practice, but this is improving. The Belgian DPA has condemned this practice as illegal. One of the main reasons is that this would be done without respecting the existing legal framework. An additional argument is that criminal investigations should not be conducted by citizens. About the ongoing debate in this matter in the Netherlands, see also X., ‘Plaatsen foto’s criminelen niet altijd bestraffen’, 7.08.2011, available at http://www.binnenlandsbestuur.nl/openbare-orde-en-veiligheid/nieuws/plaatsen-foto-s-criminelen-niet-altijd-bestraffen.1617031.lynkx, and the references therein to the discussion between the government and the DPA.

  84.

    See Part II, Chap. 4, §§ 24–28. See and compare also with the regulation for accessing a national registry containing the identity details of citizens.

  85.

    This functionality could either be used for a positive biometric claim or a negative biometric claim, previously also named positive and negative identification (but the latter terms are deprecated) (about positive and negative biometric claims, see Part I, Chap. 2, footnotes 90 and 91). See also the criticism on the use of identification, e.g., at the Super Bowl event of 2001; for an example of legislation permitting identification (without however, by this reference, endorsing it), see the legislation adopted in the State of Illinois, also mentioned in Chap. 8, at footnote 311.

  86.

    But: in some legislation, it is prohibited to adopt a ‘false name’ which may also affect the use of pseudonyms. See below § 98.

  87.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 6.

  88.

    See Part I, Chap. 2, § 128 and in particular the (rather poor) results of the field test done by the Bundeskriminalamt in 2007 as reported.

  89.

    For the improvement of the error rates of face recognition, see Part I, Chap. 2, § 128.

  90.

    CNIL, 21e rapport d’activité, 2000, p. 109.

  91.

    See CNIL, 28ième rapport d’activité, 2007, p. 20.

  92.

    See CNIL, Biométrie: des dispositifs sensibles soumis à autorisation de la CNIL, 7.04.2011, available at http://www.cnil.fr/en-savoir-plus/fiches-pratiques/fiche/article/biometrie-des-dispositifs-sensibles-soumis-a-autorisation-de-la-cnil/?tx_ttnews%5BbackPid%5D=91&cHash=33c56bf40f

  93.

    See the Iris on the Move developments. See also Part I, Chap. 2, footnote 49.

  94.

    A. Jain and J. Feng, ‘Latent Palmprint Matching’, 31 IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 6, 2009, pp. 1032–1047, also available at http://www.computer.org/portal/web/csdl/doi?doc=doi/10.1109/TPAMI.2008.242

  95.

    External biometric characteristics stand in opposition to what we would call more internal biometric characteristics, such as the ear canal or the retina, and internal biological processes which cannot be observed from the outside. However, technology is becoming increasingly sophisticated and it is therefore not excluded that even some of those characteristics and processes may be measured covertly in the future.

  96.

    New technology would allow the contactless capture, in milliseconds, of images of veins while in motion. See Part I, Chap. 2, § 66. But: see the opinion of the CNIL as mentioned in § 50 above.

  97.

    About new methodologies for capturing from a distance, by developing a ‘biometric tunnel’ with cameras for the capture of non-contact biometric characteristics (in particular face and gait), see e.g., L. Middleton, D. Wagg, A. Bazin, J. Carter, M. Nixon, A smart environment for biometric capture, in IEEE Conference on Automation Science and Engineering, 2010, Shanghai, China, 6 p., available at http://eprints.ecs.soton.ac.uk/12914/

  98.

    The term also comprises references to multimodality, the use of new biometric traits (e.g., gait, …), ‘under the skin’ (or electrophysiological) biometrics and ‘soft biometrics’. I. Van der Ploeg, Identity, Biometrics and Behavior Predictability, presentation at the Rise/Hide Conference, 9–10.12.2010, Brussels, previously available at http://riseproject.webtrade.ie/_fileupload/RISE%20Conference/Presentations/Irma%20van%20der%20Ploeg.pdf; see and compare also with the term used in Council of Europe, The need for a global consideration of the human rights implications of biometrics, 2011, p. 5, but referring rather to soft biometrics.

  99.

    See also the Volkszählungsurteil of the German Federal Constitutional Court of 1983, pointing to such effect (see below).

  100.

    See Part I, Chap. 3, § 422.

  101.

    Ibid. Following some reports, in particular the Records, Computers and the Rights of Citizens of the Secretary’s Advisory Committee on Automated Personal Data Systems of 1973 (available at http://epic.org/privacy/hew1973report/), fair information practices and legislation were adopted in the United States, in particular for specific classes of record keeping. For a history, see R. Gellman, Fair Information Practices: A Basic History, 3.10.2011, 14 p., available at http://bobgellman.com/rg-docs/rg-FIPShistory.pdf

  102.

    See Rigaux, Protection vie privée, 1990, p. 739. About the evolution of the concept of privacy, see also De Hert, Artikel 8 EVRM, 2004.

  103.

    C. Prins, ‘Property and Privacy: European Perspectives and the Commodification of our identity’, in L. Guibault and P. Hugenholtz (eds.), The Future of the Public Domain, Information Law Series, Kluwer, 2006, pp. 223–257, also available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=929668

  104.

    See Part I, Chap. 3, § 185. See also OECD, Thirty Years after 2011, p. 12.

  105.

    See Part I, Chap. 3, § 423.

  106.

    An exception could be made for the transfer of personal data to countries not guaranteeing an adequate level of protection, for which the data subject could give consent. See Art. 26 (1) a Directive 95/46/EC.

  107.

    But: see Directive 2002/58/EC, discussed in § 61 below.

  108.

    The personal computer (PC) market started to boom after the launch by International Business Machines Corp. (IBM) in 1981 of its PC with a disk operating system which became a standard. The vision of PCs ‘on every desk and in every home’ steadily became a reality after Microsoft developed and launched the Windows operating system in 1985 and released Office in 1989. See also C. Beaumont, ‘Bill Gates’s dream: A computer in every home’, The Telegraph, 27.06.2008, available at http://www.telegraph.co.uk/technology/3357701/Bill-Gatess-dream-A-computer-in-every-home.html

  109.

    The World Wide Web (WWW) took off when HyperText Markup Language (HTML) was developed for access to documentation and was made known by Tim Berners-Lee and the Belgian Robert Cailliau at CERN in 1990. HTML allowed the Internet to expand into the WWW, whereby sites can be viewed (and fed) by data subjects, using browsers and search terms. The technology was released by CERN, at the urging of Cailliau, into the public domain in 1993. See also Timeline of Computer History, available at http://www.computerhistory.org/timeline/?category=net

  110.

    About this right of access and correction, see also P. De Hert and S. Gutwirth, ‘Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action’, in S. Gutwirth, Y. Poullet, P. De Hert, C. de Terwangne, S. Nouwt (eds.), Reinventing Data Protection?, Springer, 2009, p. 19; see also P. De Hert, ‘Identity management of e-ID, privacy and security in Europe. A human rights view’, Information Security Technical Report, 2008, pp. 71–75. De Hert stresses that this right to control conforms to the general criteria of Directive 95/46/EC but should be specified.

  111.

    Cavoukian, Privacy and biometrics, 1999, p. 10.

  112.

    European Group on Ethics in Science and New Technologies, Report on the Charter on Fundamental Rights related to technological innovation as requested by President Prodi on February 3, 2000, 23 May 2000, pp. 25–26.

  113.

    Alterman, A piece of yourself, 2003, p. 146. See also above.

  114.

    See, e.g., EDPS, Turbine Opinion, 2011, p. 11. This aspect of control is only indirectly addressed by the EDPS in his reaction to the Communication of the Commission on a comprehensive approach on personal data, by lamenting that ‘in practice, often users have limited control in relation to their data, particularly in technological environments’ (see EDPS, Opinion 14.01.2011 on a Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions – “A comprehensive approach on personal data protection in the European Union”, p. 28, § 79 (‘EDPS, Communication on Personal Data Protection, 2011’)).

  115.

    See also Part I, Chap. 3, § 181.

  116.

    More in particular, in the initial version of the Directive, the data subject was to be given the right, after being informed, to refuse such storage or access. This applies, for example, to cookies. This has however been modified and reinforced, as stated. About consent under the ePrivacy Directive, see also E. Kosta, Unravelling consent in European data protection legislation. A prospective study on consent in electronic communications, Leuven, Law faculty, 2011, unpublished, p. 217 et seq. (‘Kosta, Unravelling consent, 2011’).

  117.

    Article 5(3) Directive 2002/58/EC as amended.

  118.

    Article 14 (3) Directive 2002/58/EC also refers to ‘measures’ that may be adopted ‘to ensure that terminal equipment is constructed in a way that is compatible with the right of users to protect and control the use of their personal data’. This Article 14 (3) could not only be relied upon for defending a right to control but may by some also be regarded as providing a legal basis for regulation that imposes and enforces the use of specific privacy-enhancing technologies (see below), such as for permitting control. See also J.-M. Dinant, ‘Chap. 5. The Concepts of Identity and Identifiability: Legal and Technical Deadlocks for Protecting Human Beings in the Information Society’, Reinventing Data Protection?, S. Gutwirth, Y. Poullet, P. De Hert, C. de Terwangne, S. Nouwt (eds.), Springer, 2009, p. 122.

  119.

    Article 29 Data Protection Working Party, Opinion 1/2008 on data protection issues related to search engines, WP148, 4 April 2008, p. 12 (‘WP 29 Opinion search engines 2008 (WP148)’).

  120.

    See also Article 29 Data Protection Working Party, Opinion 2/2010 on online behavioural advertising, WP171, 22 June 2010, p. 9 (‘WP 29 Opinion on online behavioural advertising 2010 (WP171)’).

  121.

    See below.

  122.

    The Court has developed this new right in a case referred to as the case Online Durchsuchung (BVerfG, 27.02.2008, 1 BvR 370/07; 1 BvR 595/07 (‘Online Durchsuchung’)), §§ 203 and 204.

  123.

    For an analysis of this important decision, see e.g., P. De Hert, K. de Vries, S. Gutwirth, ‘Duitse rechtspraak over remote searches, datamining en afluisteren op afstand. Het arrest Bundesverfassungsgericht 27 februari 2008 (Online-Durchsuchung) in breder perspectief’, Computerrecht 2009, pp. 200–211; T. Hoeren, ‘Was ist das Grundrecht auf Integrität und Vertraulichkeit informationstechnischer Systeme?’, Multimedia und Recht 2008; G. Hornung, ‘Ein neues Grundrecht’, Computer und Recht 2008, pp. 299–306.

  124.

    Some examples given are based upon recent research, such as that conducted in the Fidis project and a subsequent field test consisting of a model implementation of user-controlled biometric authentication, and in the Turbine project.

  125.

    See our analysis in Part I, Chap. 3, §§ 234–263 and Part II, Chap. 4, §§ 72–91.

  126.

    See Art. 8 (2) (e) Directive 95/46/EC.

  127.

    BVerfG, 15.12.1983, BVerfGE 65, 1 (‘Volkszählungsurteil’). Several countries have amended their data protection legislation to include the right to informational self-determination. See V. Mayer-Schönberger, ‘Generational development of data protection in Europe’, in Ph. Agre and M. Rotenberg (eds.), Technology and privacy: the new landscape, Cambridge, Massachusetts, MIT Press, 1998, p. 219 et seq.

  128.

    WP 29 EHR, p. 17.

  129.

    Ibid., p. 17.

  130.

    See Eerste Kamer der Staten-Generaal, Elektronisch patiëntendossier, available at http://www.eerstekamer.nl/wetsvoorstel/31466_elektronisch

  131.

    See e.g., N. Koffeman, ‘The right to personal autonomy in the case law of the European Court of Human Rights’, Leiden, 2010, 71 p., available at https://openaccess.leidenuniv.nl/handle/1887/15890, stating it as follows: ‘The elements that are explicitly and repeatedly defined as rights by the ECtHR, are the right to personal identity, the right to personal development and the right to establish relationships with other human beings and the outside world. As discussed in section 1.1.2, one may carefully argue that the Court has furthermore recognized a right to personal autonomy. However, its case law is not consistent on this point’; see also De Hert and Gutwirth, referring to the right of informational self-determination and the references therein: De Hert and Gutwirth, Data Protection in the Case Law of Strasbourg and Luxemburg: Constitutionalisation in Action, in Gutwirth et al., Reinventing Data Protection, 2009, p. 19.

  132.

    E.g., publication on a public website.

  133.

    See, Part II, Chap. 5, § 424 et seq.; see also the more recent UA of 2011 N° AU-027 of the French DPA authorizing storage on professional laptops which shall remain under the control of the data subject, discussed in Part II.

  134.

    For example, the DPA of Belgium in its guidelines of 2008. See also other DPAs, including the DPA of the Hellenic Republic of Greece, some of whose opinions on biometric data processing (in English), such as Decision No. 9/2003 relating to a biometric access control system in the Athens metro and Decision No. 31/2010 on the Turbine pilot, can be found at http://www.dpa.gr/portal/page?_pageid=33,43590&_dad=portal&_schema=PORTAL In the latter opinion, the DPA stated it as follows: ‘As far as the storage of biometric identities is concerned, it is worth pointing out that, under real case scenarios, the best way to store them would be locally in smart cards (and not in a central database); this enables data subjects to have greater control over their personal data’ (p. 6).

  135.

    E.g., in the decisions of the CNIL. See also the DPA of the Hellenic Republic of Greece’s Decision No. 31/2010 on the Turbine pilot as cited in the footnote above.

  136.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 6; WP 29 Opinion on developments in biometric technologies 2012 (WP193), p. 31: ‘Especially for verification, the Working Party considers advisable that biometric systems are based on the reading of biometric data stored as encrypted templates on media that are held exclusively by the relevant data subjects (e.g. smart cards or similar devices)’.

  137.

    See JRC, Report Large-scale Biometrics Deployment, 2008, pp. 102–103. This report discusses several relevant elements of the different ways of storage for the evaluation of the proportionality; see also Pfitzmann, Biometrics, 2008, p. 4.

  138.

    See and compare with the recital 24 in the ePrivacy Directive (see above § 63).

  139.

    See also Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, p. 32, where the authors refer to this distinction in footnote 39.

  140.

    Persons in both the physical and the digital world are often represented by only some of their characteristics, also called attributes, for example, being an employee and having an employee number or being a customer and having a loyalty card. These attributes reflect a partial identity of a person. In a digital world, however, these partial identities are represented by data sets and can be managed by technical means. Identity management provides tools for managing these partial identities (see the sketch below). Another way to define ‘identity management’ is hence as ‘the managing of partial identities of entities, i.e. definition, designation and administration of identity attributes as well as choice of the partial identity to be (re)-used in a specific context’ (citation from Modinis, Study on Identity Management in eGovernment. Common Terminological Framework for Interoperable Electronic Identity Management, v.2.01, November 2005, p. 11, available at https://www.cosic.esat.kuleuven.be/modinis-idm/twiki/pub/Main/GlossaryDoc/modinis.terminology.paper.v2.01.2005-11-23.pdf).
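    A minimal sketch of partial identities as attribute subsets, using invented attribute names and contexts; this is our own illustration of the definition cited above, not the Modinis model itself:

```python
# Partial identities as subsets of a person's attributes: per context, only the
# attributes of the chosen partial identity are released. All names are invented.

full_identity = {
    "name": "J. Janssens",
    "employee_no": "E-4711",
    "loyalty_card": "LC-2390",
    "date_of_birth": "1970-01-01",
}

# Each context maps to the minimal attribute set it needs.
partial_identities = {
    "employer": {"name", "employee_no"},
    "supermarket": {"loyalty_card"},
}

def release(context):
    """Release only the attributes belonging to the context's partial identity."""
    return {k: v for k, v in full_identity.items()
            if k in partial_identities[context]}

print(release("supermarket"))  # {'loyalty_card': 'LC-2390'} - and nothing more
```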

  141.

    Federated identity management provides a framework for a multi-organization identity management system whereby service providers and identity providers operate in federations that have business relationships with each other, based on a particular architecture and operational agreements. An example is the Liberty Alliance Project. A centralised identity management system is usually dominated by one (identity) provider (e.g., see Windows Live ID and previously .NET Passport of Microsoft in the early 2000s; about .NET Passport, see also J. Dumortier, ‘Combining Personalised Communications Services with Privacy-Friendly Identity Management’, Proceedings of the 44th FITCE Congress Vienna, 1–3 September 2005, pp. 142–146, available at https://www.law.kuleuven.be/icri/publications/710FITCE2005_IDManagement.pdf?where=); generally, about identity management, see also E. Kindt, D.1.4 Legal Aspects of Identity Management, Paris, Turbine, 2009, pp. 8 and 11, available at http://www.turbine-project.eu/

  142.

    Prime, Prime White paper, v. 3.0, 2008, p. 2, available at https://www.prime-project.eu/prime_products/whitepaper/index_html (‘Prime White paper’). The text was cited from the Liberty Alliance Project Whitepaper: Personal Identity, 23 March 2006, available at http://projectliberty.org/liberty/content/download/395/2744/file/Personal_Identity.pdf; see also B. Priem, E. Kosta, A. Kuczerawy, J. Dumortier, R. Leenes, ‘User-centric privacy-enhancing Identity Management’, in J. Camenisch, R. Leenes, and D. Sommer (eds.), Digital Privacy – PRIME – Privacy and Identity Management for Europe, Berlin – Heidelberg, Springer, 2011, pp. 91–106. For other authors describing the storage of biometric data on personal devices, such as cell phones, organizers, palm pilots and others, before 2000, see, e.g., G. Bleumer, ‘Biometric Authentication and Multilateral Security’, in G. Müller and K. Rannenberg (eds.), Multilateral Security in Communications. Technology, Infrastructure, Economy, München, Addison-Wesley, 1999, pp. 157–171.

  143.

    Some consider this local storage on an object under the control of the data subject also as some kind of Privacy Enhancing Technology (‘PET’). See, e.g., At Face Value report, 1999, pp. 51–53. In this report, the authors refer rather to ‘decentralized template storage’ for referring to storage (and comparison) on an object held by the data subject or locally in the sensor.

  144.

    See, e.g., the bilateral agreements of Union member states with the United States for exchanging fingerprint and DNA data.

  145.

    See Part II, Chap. 4, §§ 180–184.

  146.

    See Art. 14 (b) Directive 95/46/EC. Compare with European Commission, Proposal for General Data Protection Regulation COM (2012) 11 final, art. 19.

  147.

    Art. 14 (a) Directive 95/46/EC.

  148.

    See, e.g., D. Korff, Comparative Study on Different Approaches to new privacy challenges, in particular in the light of technological developments, Working Paper N° 2: Data protection laws in the EU: the difficulties in meeting the challenges posed by global social and technical developments, 20 January 2010, Brussels, European Commission, pp. 78–80 (‘Korff, New Challenges to Data Protection. Working Paper N° 2, 2010’), available at http://ec.europa.eu/justice/policies/privacy/docs/studies/new_privacy_challenges/final_report_working_paper_2_en.pdf The Netherlands applies the right strictly to the minimum required, while other countries such as Finland, Spain and Sweden do not provide a general right to object at all. As long as there is no ownership right in biometric data recognized, the right to object remains important.

  149.

    But: see in the Netherlands, Hoge Raad, 9.09.2011 where a data subject successfully claimed the removal of particular data.

  150.

    About different control schemes and the concepts of divided and distributed control, see also E. Kindt, M. Meints, M. Hansen, and L. Müller, ‘3.3. Control schemes within biometric systems’ in E. Kindt and L. Müller (eds.), D.3.10. Biometrics in identity management, Frankfurt, FIDIS, 2007, pp. 55–67 (‘Kindt, Meints, Hansen and Müller, Control schemes, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007’); see also Grijpink who recognizes with a new concept of ‘chain computerization’ that for many interorganizational cooperations and policies, no single organizational actor keeps control or authority over the system: see J. Grijpink, ‘Two barriers to realizing the benefits of biometrics: a chain perspective on biometrics, and identity fraud as biometrics’ real challenge’, Computer Law and Security Report 2005, pp. 138–145 and pp. 249–256; new forms of biometric data processing, e.g., cloud computing, will further complicate the matter. See on cloud computing, Part II, Chap. 4, § 143.

  151.

    In a compromise reached on the Biometrics Bill mid-2009, Israel seems to have opted for a split biometric database (see the sketch below). See X., Israel passes bill on national biometric database, 9.12.2009, available at http://www.thirdfactor.com/2009/12/09/israel-passes-bill-on-national-biometric-database; a similar type of split biometric database is also used for Eurodac (see Part I, Chap. 3, § 221).
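    A minimal sketch of the underlying idea of a split database, here rendered as XOR secret sharing between two stores; this particular scheme is our own illustrative assumption, not the mechanism actually chosen by Israel or used for Eurodac:

```python
# Split-database idea: the template is divided into two shares held in separate
# databases, so that neither database alone reveals anything about the template.
# XOR secret sharing is one simple way to do this, used here only as a sketch.

import secrets

def split(template):
    share_db1 = secrets.token_bytes(len(template))                 # uniformly random
    share_db2 = bytes(a ^ b for a, b in zip(template, share_db1))  # template XOR share
    return share_db1, share_db2

def rejoin(share_db1, share_db2):
    return bytes(a ^ b for a, b in zip(share_db1, share_db2))

template = b"\x12\x34\x56\x78"
s1, s2 = split(template)
assert rejoin(s1, s2) == template  # only both databases together reconstruct it
```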

  152.

    It may, however, be unrealistic that police or law enforcement authorities would be interested in holding (partial) biometric information databases. Furthermore, such collection and use of biometric data for particular purposes by the police, in combination with, e.g., a membership list of a private controller, may also be an infringement of fundamental rights, as we would argue, and would therefore require a legal basis and compliance with other requirements, in addition to those applying to the private controller.

  153.

    This approach seems to be taken in Germany for the collection and storage of facial image data for the biometric ePassport, which can be preserved in local databases, while a nation-wide database is explicitly excluded in the German Passport Act. See S. Cehajic and A. Sprokkereef, ‘Germany’ in E. Kindt and L. Müller, D.13.4. The privacy legal framework for biometrics, Frankfurt, FIDIS, 2009, (70), p. 79 (‘Cehajic and Sprokkereef, Germany, in Kindt and Müller, Fidis, D.13.4, 2009’). About the German ePassport, see also G. Hornung, ‘The European Regulation on Biometric Passports: Legislative Procedures, Political Interactions, Legal Framework and Technical Safeguards’, SCRIPTed 2007, pp. 246–262, available at http://www.law.ed.ac.uk/ahrc/script-ed/vol4-3/hornung.asp (‘Hornung, European Regulation on Biometric Passports, 2007’). The term ‘local storage’ should be used with care. See also Chap. 9.

  154.

    Part II, Chap. 4, § 142. The distinction between a distributed (central) database and a central database remains however thin and it will always remain difficult and subject to discussion to draw a line between these two types of (central) storage.

  155.

    The authors mentioned in footnote 153 above comment that the decentralized storage in Germany is not a real safeguard, given the fact that the local databases of the municipalities can be electronically connected and accessed, e.g., by police in case of investigations. ‘Given [the] possibility of connecting decentralized databases (in Germany, the aim to electronically connect all local municipalities was accomplished by the end of 2006), the decentralized storing of the data does not form any real safeguard’ (Hornung, European Regulation on Biometric Passports, 2007, p. 256).

  156.

    H. Biermann, M. Bromba, C. Busch, G. Hornung, M. Meints, and G. Quiring-Kock, (eds.) White Paper zum Datenschutz in der Biometrie, 2008, p. 9, available at http://www.teletrust.de/uploads/media/White_Paper_Datenschutz-in-der-Biometrie-080321.pdf (‘White Paper zum Datenschutz, Teletrust, 2008’).

  157.

    For a comparison of these types of systems, see also R. Turn, N. Shapiro, M. Juncosa, Privacy and security in centralized vs. decentralized databases, Santa Monica, Calif., Rand Corp., 1975, 31 p. (‘Turn, Shapiro and Juncosa, Centralized vs. decentralized databases, 1975’). Turn et al. prefer a ‘properly designed centralized databank system’ above a decentralized system because such a centralized database ‘(..) would have a higher degree of public visibility and would have available more resources for providing privacy protection and data security than could the databanks in a decentralized system’ (p. 28).

  158.

    See and compare with WP 29 EHR, p. 17.

  159.

    See, e.g., At Face Value report, 1999, p. 52; see also White Paper zum Datenschutz, Teletrust, 2008, p. 18.

  160.

    WP 29 Opinion on developments in biometric technologies 2012 (WP193), p. 33. This approach and technology is also mentioned by the CNIL in relation to the proposed legislation for the protection of identity, rendering identification, in the view of the CNIL, not possible for law enforcement purposes: see CNIL, 32ième Rapport d’Activité 2011, pp. 48–49. But: see also the decision of the Constitutional Court in France of 2012 mentioned in Part II, Chap. 5, § 357.

  161.

    AXSionics, a Swiss company and partner in the Fidis project, developed a secure and privacy-friendly biometric authentication solution, named the AXSionics Internet Passport™.

  162.

    L. Müller, ‘User Side Identity Management System – encapsulated biometrics’, in E. Kindt and L. Müller (eds.), D.3.10. Biometrics in identity management, Frankfurt, FIDIS, 2007, pp. 110–113 (‘Müller, Encapsulated biometrics, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007’). One of the main advantages pointed to includes also the solving of the problem of ‘information leakage’ (i.e. a term commonly used to refer to the possibility that sensitive information is disclosed) from templates. About the concept, see also Kindt, Müller and Meints, 4.2.3. Biometrics, 2009, pp. 147–148.

  163.

    Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, 57 p.

  164.

    The model was also referred to as ‘encapsulated biometrics’. In this model, the control over the application as a whole is shared, while the data subject keeps control over his or her biometric data. The authentication organizations, however, who are in principle (co) controllers (because they define and control the means), design and determine the biometric comparison and evaluation process. See also Kindt, Müller and Meints, 4.2.3. Biometrics, 2009, p. 148 and the scheme therein set out.

  165.

    OpenID standards offer a framework, not relying on a central authentication authority, allowing authentication and access control to a practically unlimited number of websites and services (provided these sites and services accommodate the use of an OpenID identifier) with one and the same identifier. This is also referred to as a world wide single sign-on scheme. For further details about the functioning see Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, pp. 12–13; for further information and for obtaining an OpenID, see http://openid.net/

  166. 166.

    For example, for access to a SNS, a lower level of authentication is generally sufficient, while for a banking account, a user will easily accept a two or three factor authentication, including biometric data. See Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, p. 35.

  167. 167.

    For the questionnaire, see Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, pp. 39–44.

  168. 168.

    Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, p. 30.

  169. 169.

    The reply upon this question could in our view possibly be influenced by the local storage of the biometric data on the token and the information the users received about how the biometric data were processed.

  170. 170.

    The age mixture showed a ratio between younger person (under 40) of about 72 % and older persons (above 40) of about 28 %, which is presumably close to a typical value for Internet users.

  171. 171.

    Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, p. 30.

  172. 172.

    Only a few observations were made, including about the importance of the learning process of the users. It was also concluded that an individual device often used by the same data subject in different situations is better adapted to the convenience needs (which is a very important factor) than a centralized biometric authentication scheme which offers little possibilities for individualized customization (see p. 36). In the meantime, over 30,000 users deploy the encapsulated system as it was used in the field test.

  173. 173.

    TrUsted Revocable Biometric IdeNtitiEs project (TURBINE), EU project no. 216339 (2008–2011), www.turbine-project.eu (‘Turbine’). Turbine’s research concentrated on the transformation of fingerprints, whereby the individual can create different ‘pseudo-identities’ for different applications with the same fingerprint (see below). For the public deliverables of the project, including the evaluation of the demonstrators, see the project’s website mentioned above.

  174. 174.

    The characteristics are stored in combination with a service identifier which limits the use of the characteristics to a specific service context.

  175. 175.

    More particular, Turbine proposes a user-centric IdM system model, which allows the data subject to manage its identities and the personal information released. See also below.

  176. 176.

    See EDPS, Turbine Opinion, 2011, p. 13. It was the very first time that the EDPS issued an opinion on a European research project, hereby giving effect to the EDPS’s 2008 policy paper entitled “The EDPS and EU Research and Technological Development”, in which the possible roles of the EDPS for research and development (RTD) projects in the context of the 7th Framework Programme for Research and Technological development (FP7) are described. About the privacy by design principle, see below.

  177. 177.

    See O. Spyroglou, CryptoBiometrics for Enhanced Trusted Identity Management: Dreams and Reality. Increase security trust on secure areas, slides 14–17, presentation at CryptoBiometrics for Enhanced Trusted Identity Management: Dreams and Reality, 17–18.01.2011, Turbine final public workshop, Brussels, Turbine, available at http://www.turbine-project.eu/workshop_presentations.php

  178. 178.

    R. Halperin and J. Backhouse (eds.), D.4.12 A qualitative comparative analysis of citizensperception of eIDs and interoperability, Frankfurt, Fidis, June 2009, 50 p. (‘Fidis, D.4.12, 2009’); see also Part II, Chap. 4, § 80 and footnote 258.

  179. 179.

    We refer for this purpose to the guarantees that we suggest in Chap. 9.

  180. 180.

    Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, p. 46. See and compare with the other risks mentioned by the participants of the field test using the ‘encapsulated’ biometric token developed by AXSionics.

  181. 181.

    CBPL, Opinion N°17/2008 biometric data, §§ 45–51; see also other studies, e.g., White Paper zum Datenschutz, Teletrust, 2008, pp. 18–19.

  182. 182.

    The Art. 29 WP stated: ‘the hijacked identity would then be permanently associated with the digital fingerprints in questions’ (emphasis added) (WP 29 Opinion 7/2004 on VIS (WP96), p. 4).

  183. 183.

    About this risk, see also more in depth Part II, Chap. 4.

  184. 184.

    See also JRC Report Large-scale Biometrics Deployment, 2008, p. 102.

  185. 185.

    At the same time, it may be realistic to acknowledge that the use of such specific techniques (e.g., for renewability, irreversibility, ….) may be restricted or limited, especially in egovernment schemes. See, e.g., with the requirement of the use and storage of biometric samples (images) (of face and fingerprint) in the ePassport.

  186. 186.

    This question also frames in the discussion relating to the use of PETs (see below) which refers to technical facilities which offers a data subject the possibility to remain ‘anonymous’ while using online services. In this context, various arguments are made for the use of pseudonyms.

  187. 187.

    See Article 29 Data Protection Working Party, Recommendation 3/97: Anonymity on the Internet, 3.12.1997, p. 5 (‘WP 29 Recommendation 3/97: Anonymity on the Internet’): ‘Clearly one way of addressing privacy concerns would therefore be to seek to ensure that wherever feasible the data traces created by using the Internet do not permit the identification of the user. With anonymity guaranteed, individuals would be able to participate in the Internet revolution without fear that their every move was being recorded and information about them accumulated which might be used at a later date for purposes to which they object’; about the need for anonymity, see also Y. Poullet, Pour une troisième génération de réglementations de protection des données, 2005, pp. 9–10 (‘Poullet, Troisième génération de réglementations de protection des données, 2005’), available at http://www.privacyconference2005.org/fileadmin/PDF/poullet.pdf and later published in M. Pérez Asinari and P. Palazzi (eds.), Défis du droit à la protection de la vie privée. Perspectives du droit Européen et Nord-Américain, Brussel, Bruylant, 2008, pp. 25–70.

  188. 188.

    See OECD, Ministerial Declaration on the Protection of Privacy on Global Networks, 7–9.10.1998, Ottawa, DSTI/ICCP/REG(98)10/FINAL, p. 3, available at http://www.oecd.org/dataoecd/39/13/1840065.pdf (‘OECD, Privacy on Global Networks, 1998’).

  189. 189.

    WP 29 Working Document on Biometrics 2003 (WP80), p. 7 and footnote 17.

  190. 190.

    The opposite of anonymity and pseudonymity, in particular identification, has been described in Chap. 4 as a risk for the data subjects upon the processing of biometric data and we refer to our analysis of the concept of identification in Part II, Chap. 4, §§ 5–43.

  191. 191.

    See Part II, Chap. 4, § 30; see also on this issue, E. Lievens, Protecting Children in the Digital Era, Leiden-Boston, Martinus Nijhoff, 2010, p. 319 et seq. (‘Lievens, Protecting Children, 2010’).

  192. 192.

    C. Prins, ‘Biometrie: een instrument bij privacybescherming’, Beveiliging, 2001, pp. 50–55, also available at http://arno.uvt.nl/show.cgi?fid=6017 (‘Prins, Biometrie, 2001’). The author explains that anonymity is desired for an array of reasons, such as the interest to control the moment when particular information is given. Anonymity has also been mentioned by Westin as an aspect of privacy. See Westin, Privacy and Freedom, p. 7: ‘Viewed in terms of the relation of the individual to social participation, privacy is the voluntary and temporary withdrawal of a person from the general society through physical or psychological means, either in a state of solitude or small-group intimacy or, when among larger groups, in a condition of anonymity or reserve’ (emphasis added).

  193. 193.

    E.g., in the field of electronic communications, Directive 2002/58/EC, article 6.1 which requires that traffic data must be erased or made anonymous when no longer needed. Article 9 requires that location data shall be made anonymous if processed, unless with the consent of the users or subscribers to the extent and for the duration necessary for the value added service.

  194. 194.

    See, e.g., the Belgian Supreme Court: Cass., 6 December 2005 discussed in footnote 204 below.

  195. 195.

    E.g., C. Nicoll, J. Prins and M. Van Dellen (eds.), Digital Anonymity and the LawTensions and Dimensions, The Hague, Asser Press, ITeR, 2, 2003, 307 p. and the contributions therein, including C. Goemans and J. Dumortier, ‘Mandatory retention of Traffic Data in the EU: Possible Impact on Privacy and on-line Anonymity’, in C. Nicoll, J. Prins and M. Van Dellen (eds.), Digital Anonymity and the LawTensions and Dimensions, The Hague, Asser Press, ITeR, 2, 2003, p. 182. (‘Goemans and Dumortier, Privacy and on-line Anonymity, 2003’); in the United States, see e.g., J. Cohen, ‘The right to read anonymously: a closer look at ‘copyright management’ in cyberspace’, 28 Conn. L. Rev., 1996, (981), p. 1012.

  196. 196.

    See e.g., J. Dumortier, C. Goemans and M. Loncke, D.4, General report of the legal issues, 2003, Anonymity and Privacy in Electronic Services (APES), 158 p., (‘Dumortier, Goemans and Loncke, Apes, Legal Issues, 2003’). Projects of private parties also endeavour to set up anonymous (vis-à-vis the authorities) networks, whether or not to perform illegal acts (such as the distributions and/or downloading of copyrighted works (see, e.g., the Torproject, available at www.torproject.org)).

  197. 197.

    See, e.g., ECJ, Promusicae v. Telefonica, 2008. The Court has ruled in this case that it cannot be derived from European legislation that Member States are obliged to install a duty to provide personal data in the context of a civil procedure to ensure the effective protection of copyright (see p. 9.). For an exhaustive overview of case law in various countries searching for an appropriate legal basis for the communication or not of identifying details in an electronic communications environment, see A. Ekker, Anoniem communiceren: van drukpers to Weblog, Dissertation, Amsterdam, 2006, p. 192 et seq. (‘Ekker, Anoniem communiceren, 2006’); see also E. Kindt and S. van der Hof, ‘Identiteitsgegevens en – beheer in een digitale omgeving: een juridische benadering’, Computerrecht 2009, (44) pp. 44–46; see also F. Coudert, and E. Werkers, ‘In the Aftermath of the Promusicae Case: How to Strike the Balance?’, in International Journal of Law and Information Technology 2010, pp. 50–71.

  198. 198.

    Recital 9 of Directive 2002/58/EC stresses the need for Member States to take particular account of the objectives of minimizing the processing of personal data and of using anonymous or pseudonymous data where possible. Article 6 of the Directive imposes as a principle anonymity of traffic data when it is no longer needed for the purposes of the transmission of the communication and, as stated above, Article 9 imposes anonymity of location data, unless used with the consent for the provision of value added services. See also Dumortier, Goemans and Loncke, Apes, Legal Issues, 2003, p. 29.

  199. 199.

    For protection of the secrecy of (electronic) communications at the constitutional level, see, e.g., the Netherlands, Germany and Sweden. For protection in specific legislation, see, e.g., Belgium and France. See also Article 1 (1) of the Directive 2002/58/EC as modified, referring to the need for harmonization of in particular of the right to privacy and confidentiality and the processing of personal data in the electronic communications sector (and the free movement of such data).

  200. 200.

    See however Art. 6.1 of Directive 2002/58/EC which one could invoke for defending that anonymity of subscribers and users in public networks (for third parties) is however aimed at (electronic) communications services.

  201. 201.

    For example, the issue whether and under which conditions parties to a sales contract may remain anonymous. Since there are for the purchase of movable goods in general in many countries no formal requirements, parties can remain in general anonymous. For a detailed discussion of requirements and anonymity for transactions under Dutch legislation, see J. Grijpink and C. Prins, ‘New rules for anonymous electronic transactions? An exploration of the private law implications of digital anonymity’, in C. Nicoll, J. Prins and M. Van Dellen (eds.), Digital Anonymity and the LawTensions and Dimensions, The Hague, Asser Press, ITeR, 2, 2003, pp. 249–269 (‘Grijpink and Prins, New rules for anonymous electronic transactions, 2003’).

  202. 202.

    See e.g., for France, Article 326 of the Civil Code, which states that the mother can, during giving birth, request that the confidentiality of her admission and of her identity is kept. See also the Act N° 93–22 of 8 January 1993 modifying the Civil Code relating to the civil identity, the family and the rights of the child and installing a family judge (JO N° 7, 9 January 1993) which states that an application for disclosure of details identifying the natural mother is inadmissible if confidentiality was agreed at birth (See Art. 325 and 326 of the French Civil Code as modified, available at http://legifrance.gouv.fr). This right to anonymity or secrecy was upheld by the ECtHR in the case Odièvre v. France of 13 February 2003.

  203. 203.

    See, e.g., the use of anonymous data for biobanks (see advice 17/2009 of the CBPL).

  204. 204.

    For example, the right to file a complaint anonymously. According to Belgian criminal procedure law, the person who files a complaint needs to sign in principle the written statement drawn up by the judicial officers (Art. 31 of the Criminal Procedure Code). About the filing of a complaint online for particular crimes, see Federal police, Aangifte van misdrijf via internet kan vanaf nu in heel België, 7.06.2007, available at http://www.polfed-fedpol.be/presse/presse_detail_nl.php?recordID=1320 While it is not expressly stated, the use of an eID for identification purposes is required. Case law, however, accepts that the police is entitled to keep the identity in some cases secret. The Belgian Supreme Court however stated in a decision of 6 December 2005 that it needs to be checked that this right is not used for other purposes. See also and compare with the (Belgian) Act of 8 April 2002 relating to anonymity of witnesses, interrogated by an investigating judge and whose witness declaration will be used as evidence. About anonymity in criminal procedure in Belgium, see L. Smets and J. De Kinder, Proces-verbaal, aangifte en forensisch onderzoek, Antwerpen, Maklu, 2011, pp. 64–65.

  205. 205.

    See, e.g., the Royal Decree of 2001 in execution of the Belgian Data Protection Act (Article 3) and the German Federal Data Protection Act, section 40 (2) (‘The personal data shall be rendered anonymous as soon as the research purpose permits this’. (…)).

  206. 206.

    In Belgium, Article 1(5) Royal Decree of 2001; on anonymous data, see also Kuner, European Data Protection Law, 2007, no. 2.08 et seq.; however, anonymity is often (mis)understood. See also P. Ohm, ‘Broken Promises of Privacy: Responding to the surprising failure of anonymization’, UCLA Law Review 2010, pp. 1701–1777 (‘Ohm, Broken Promises, 2010’).

  207. 207.

    B. Holznagel and M. Sonntag, ‘A case study: the Janus project’, in C. Nicoll, J. Prins and M. Van Dellen (eds.), Digital Anonymity and the LawTensions and Dimensions, The Hague, Asser Press, ITeR, 2, 2003, (121), p. 127.

  208. 208.

    In national legislation, e.g., for Belgium, the Belgian Electronic Communication Act prohibits the supply and use of telecommunications services or equipment that render caller identification impossible, or that otherwise make it difficult to track, monitor, wiretap or record communications. In addition, technical and administrative measures can be adopted and imposed on operators or end users in order to be able to identify the calling line in cases of emergency calls as well as for the investigation of specific crimes (Article 127).

  209. 209.

    For example, of free speech. See on this topic, Ekker, Anoniem communiceren, 2006 and WP 29 Recommendation 3/97: Anonymity on the Internet, p. 5.

  210. 210.

    For example, RFID and biometric technologies, but also DNA sniffers (genomic DNA sequence automatic searcher) or the development of ubiquitous sensors in an ambient environment.

  211. 211.

    See also on this issue C. Prins, ‘Making our body identify for us: Legal implications of biometric technologies’, Computer Law & Security Report, 1998, (159), p. 163; see for a similar conclusion in relation with public electronic communications and the need to establish such right in the Netherlands, Ekker, Anoniem communiceren, 2006, p. 237; on the issue of RFID and anonymity, see also G. Verhenneman, ‘Radio Frequency Identification – Fictie wordt werkelijkheid en onze privacy staat weer onder druk’, in Jura Falconis 2007–2008, pp. 154–155 and pp. 158–159.

  212. 212.

    See and compare, e.g., with the common law view of anonymity: e.g., Ohm, Broken Promises, 2010.

  213. 213.

    See Part II, Chap. 4, § 30.

  214. 214.

    In some civil law countries, for example, a sales agreement is in principle concluded and will take effect as soon as parties agree upon the price and the object of the sale. For Belgium, see article 1583 of the Civil Code. This principle is effective if the purchase would concern goods or services for which no written contract is entered into, for example the sale of a good or service in a shop.

  215. 215.

    See also Prins, Biometrie, 2001, p. 5: ‘Ook bij de toepassing van biometrie bestaan variatie-mogelijkheden’.

  216. 216.

    If the object is however a real estate, the sales agreement will usually have to be passed before a notary public and the sales deed registered to render it opposable against third parties. The parties to the purchase agreement will for these purposes be fully identified (by the notary public, who has in many cases a legal obligation to do so) in the authentic deed which will be made public (by registration) (‘organized, personalized transactions’). For the proposed terms for the varying degrees of anonymity, see Grijpink and Prins, New rules for anonymous electronic transactions, 2003, p. 251.

  217. 217.

    A. Pfitzmann and M. Hansen, Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management – A Consolidated Proposal for Terminology (Version v0.31 Febr. 15, 2008), 83 p., available at http://dud.inf.tu-dresden.de/literatur/Anon_Terminology_v0.31.pdf (‘Pfitzmann and Hansen, Terminology, 2008’), p. 8; see also the ISO/IEC 15 408 standard (as mentioned below at footnote 227 and discussed in Chap. 8, § 338 and footnote 325) which determines that the first level of security is non-observation.

  218. 218.

    See C. Diaz, S. Seys, J. Claessens and B. Preneel, ‘Towards measuring anonymity’, in Designing Privacy Enhancing Technologies, H. Federath (ed.), vol. 2482, LNCS, 2002, available at https://www.cosic.esat.kuleuven.be/privacyGroup/person.php?persid=36, and the references therein to related research for measuring anonymity.

  219. 219.

    Council of Europe, Recommendation No. R(97) 5 of the Committee of Ministers to Member States on the Protection of Medical Data, 17 February 1997, Art. 1, available at https://wcd.coe.int/wcd/com.instranet.InstraServlet?command=com. instranet.CmdBlobGet&InstranetImage=564487&SecMode=1&DocId=560582&Usage=2 (‘CoE, Recommendation No. R(97) 5 Medical data’).

  220. 220.

    See also Part I, e.g., as stated in Chap. 3, footnote 533.

  221. 221.

    Article §3(6) and (6a) of the German Federal Data Protection Act of 20 December 1990, Federal Gazette I, pp. 2954–2955, as amended, including by the Act to Amend the Federal Data Protection Act of 18 May 2001. See and compare also with the concepts of personally identifiable information (PII) and non-PII used in e.g., the United States, but recently being questioned. See also Ohm, Broken Promises, 2010, explaining that researchers ‘have found data fingerprints (sic) in pools of non-PII data, with much greater ease than most would have predicted’.

  222. 222.

    See Goemans and Dumortier, Privacy and on-line Anonymity, 2003, pp. 182–183.

  223. 223.

    We would argue that anonymity is presently under the Directive 95/46/EC a more uniform concept while being subject to (failing) reasonable efforts to identify.

  224. 224.

    See, e.g., J. Hoepman, ‘Revocable privacy’, P&I 2008, pp. 114–118.

  225. 225.

    R. Clarke, ‘Identified, Anonymous and Pseudonymous Transactions: The Spectrum of Choice’, April 1999, p. 5, in S. Fischer-Hübner, G. Quirchmayr, L. and L. Yngström (eds.), User Identification & Privacy Protection: Applications in Public Administration & Electronic Commerce, Kista, Sweden, June 1999, IFIP WG 8.5 and WS 9.6.; about the protection of user identities on application level, see also Fischer-Hübner, IT-security and Privacy. 2001, pp. 137–157; in general, see also S. Clauss, A. Pfitzmann, M. Hansen and E. Van Herreweghen, Privacy-Enhancing Identity Management, IPTS report, September 2002, available at http://dl.acm.org/citation.cfm?id=1102501

  226. 226.

    See for the concept of ‘héteronymat’ in France, Y. Détraigne and A.-M. Escoffier, Rapport N° 441. Respect de la vie privée à lheure des mémoires numériques, Senate, 2008-09, p. 107: ‘Chaque individu pourrait se forger de véritables personnalités alternatives, distinctes de la personnalité civile qui les exploite. Afin d’éviter que ce droit ne serve à commettre des infractions, ces identités alternatives pourraient être déposées auprès d’un organisme chargé de les gérer. En cas d’infractions par exemple, la justice pourrait demander l’identité civile de la personne’ (‘Détraigne and Escoffier, Rapport N° 441, 2009’).

  227. 227.

    See the standard ISO/IEC 15408- 2:1999 Information Technology – Security techniques – Evaluation criteria for IT Security – Part 2: Security Functional Components, first edition, p. 287, in the meantime replaced by ISO/IEC 15408-1/2/3:2005. About this standard, see also Chap. 8, § 338 and footnote 325

  228. 228.

    See also the definition of the term ‘reversible pseudonymity’ in the proposals for the fore mentioned ISO 15408 – 2 standard (see above), term 13.2.6, p. 72.

  229. 229.

    See K. Borcea-Pfitzmann, E. Franz and A. Pfitzmann, ‘Usable Presentation of Secure Pseudonyms’, in DIM 2005, p. 70 et seq.

  230. 230.

    Article 8 (3) eSignature Directive 1999/93/EC of 13 December 1999 on a Community framework for electronic signatures (O.J. L. 13, 19.01.2000, pp. 12–20) states it as follows: ‘Without prejudice to the legal effect given to pseudonyms under national law, Member States shall not prevent certification service providers from indicating in the certificate a pseudonym instead of the signatory’s name’.

  231. 231.

    Article 3a German Federal Data Protection Act, as revised. See also the German Federal Data Protection Act establishing the General Conditions for Information and Communications Services which recognizes the use of pseudonyms by stating that ‘user profiles are permissible under the condition that pseudonyms are used. Profiles retrievable under pseudonyms shall not be combined with data relating to the bearer of the pseudonym’, mentioned in Dumortier, Goemans and Loncke, Apes, Legal Issues, 2003, p. 30. The authors refer to Article 2 § 4 (4) of the Federal Act Establishing the General Conditions for Information and Communication Services Information and Communication Services Act. For other provisions, see also, e.g., in Germany, Section 4(6) of the Teleservices Act and Section 18 (6) of the Interstate Treaty for Media Services which impose a duty on the provider of information society services to offer the possibility of anonymous or pseudonymous use of their services where this is technically feasible and reasonable.

  232. 232.

    Article §3 (6a) of the German Federal Data Protection Act of 20 December 1990, as amended.

  233. 233.

    Art. 231 Belgian Penal code penalizes ‘adopting in public a name which does not belong to oneself’ (‘valse naamdrachtoraanmatiging van naam’). The article was introduced with the adoption of the Penal Code by Act in 1867 and is part of Title III ‘Criminal offences against the Public Trust’ and in particular of a chapter which penalizes the unlawful adoption of functions, titles, or names. The purpose of the legislator was to abolish uncertainty with regard to someone’s identity. The article is related to public order (‘openbare orde’/‘ordre public’). Three elements have to be combined: (1) the adoption of a name, (2) in public, and (3) the name should not belong to oneself. In addition, one shall do this ‘knowingly’ (‘wetens en willens’); for a further discussion and evaluation, see Kindt, Belgium, van der Meulen and B.-J. Koops, Identity-related crime, Fidis D12.7, 2008, p. 19–20. For case law, applying art. 231 Penal Code, see, e.g., Corr. Gent, 21.09.2011, not yet published; other legislation which shall be reviewed in case one shall be entitled to use pseudonyms, is for Belgium a decree of 6 fructidor jaar II (23.08.1794) which is still in force and which forbids every citizen to use another name or first name than the one mentioned in the birth certificate.

  234. 234.

    See and compare with the French DPA which only mentions the use of a pseudonym in connection with a biometric identifier in Unique Authorization n°AU-027, discussed in Part II.

  235. 235.

    About this risk, see Part II. About pseudonyms, see also R. Clarke, Introduction to Dataveillance and Information Privacy, and Definitions of Terms, 1997, as revised, 12 p., available at http://www.rogerclarke.com/DV/Intro.html

  236. 236.

    About the need for multiple (biometric) identities, see also A. Pfitzmann, ‘Biometrics – How to put to use and How Not at All’, S. Furnell, S. Katsikas and A. Lioy (eds.), TrustBus 2008, LNCS, p. 4 (‘Pfitzmann, Biometrics, 2008’); about this new approach towards ‘reversible anonymity’ in general in data protection, see also Poullet, Troisième génération de réglementations de protection des données, 2005, pp. 9–10.

  237. 237.

    See and compare also with the recommendations to use so-called ‘pseudo-identities’, which the data subject can choose to deploy in different situations, in the At Face Value report published by the Dutch DPA. See Part II, Chap. 6, § 505. As to the ‘identification risks’, this shall be understood as how one defines such ‘identification risks’.

  238. 238.

    The data subject, however, could still be identified by comparing a sample against the pseudonymous identifier (protected template). See also the comment in the report for the Dutch DPA that (unless specific technologies are used) when using biometric systems, ‘changing pseudo-identities is not possible’ and that it is desirable ‘to limit the ability to link the different databases’ to protect the privacy (emphasis added). The issue that one will not be able to remain anonymous when for example voice recognition would be used for identifying customers when ordering electronic commerce services and the use of biometric systems for profiling purposes is also mentioned. Hes, Hooghiemstra and Borking, At Face Value, 1999, p. 45.

  239. 239.

    Such universal identifier could be e.g., a name or a biometric sample. The service provider hence can only identify the data subject for purposes of the application by the additional information about the person to whom the pseudonymous identifier relates. If in the application context no other identity details are available to the service provider, the latter could not ‘identify’ the data subject in another way than for purposes of the application (in a technical sense). In that case, only the (trusted) identity provider would be informed of the identity details of the data subject.

  240. 240.

    See also in this context the comment in the report At Face Value for the Dutch DPA, that there is an issue that one will not be able to remain anonymous when for example voice recognition would be used for identifying customers when ordering electronic commerce services. See Part II, Chap. 6, § 501.

  241. 241.

    We would refer to such use of biometric data as to ‘anonymous use’ of biometric data. At first sight, one would think that biometric data cannot be used anonymously, i.e. as understood under the data protection legislation (see above). Biometric data as understood and defined for purposes of this research is as such linked per se to the individual to whom the data belong. Therefore, using biometric data ‘anonymously’ would seem to be contradictory, but it is not as we explain.

  242. 242.

    In this case, only the claim of the data subject as to whether he or she belongs to the same group is in fact verified. A practical application could be, e.g., to the extent the need to process biometric data could be demonstrated (e.g., misuse of cards), an access card to a (university) library allowing access to persons of a particular group (e.g., students, …) and/or to lend books (e.g., after a due (anonymous) deposit equal to the books that can be lend out). However, in practice, anonymous biometric access control is currently almost never used. Generally, in most access control systems, the controller wants to know who (i.e. which badge) had access. In that case, the information about the user of the badges, if recorded, if necessary, shall be secured and protected and could for example only be used or revealed with a legal order.

  243. 243.

    This device should securely store the biometric reference and allow comparison on card or on token with a stored (protected) template.

  244. 244.

    See on this type of anonymous use, see J. Bringer, H. Chabanne, D. Pointcheval and S. Zimmer, ‘An Application of the Boneh and Shacham Group Signature Scheme to Biometric Authentication’, K. Matsuura and E. Fujisaki (eds.), IWSEC 2008, Berlin, Springer, 2008, pp. 219–230; see also and compare with Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, p. 33: if the biometric data is solely used for verification purposes under the control of the data subject, and this data subject is entitled to use pseudonyms, even if these pseudonyms need to be securely linked to an identity, and whereby these pseudonyms are not linked and do not permit to reveal the real identity of the user, biometric data could under these conditions also be fit to be used anonymously to a certain degree (as far as the biometric data are concerned, and in particular at the level of a service for enhanced authentication purposes).

  245. 245.

    See Part II, Chap. 5, §§ 389–390 and § 402; see also Part I, Chap. 3, § 202 and § 227 and Van Kralingen, Prins en Grijpink, Het lichaam als sleutel, 1997, p. 32, footnote 40, where the authors refer to the use of zero-knowledge protocols which do not reveal the secret. Use of these techniques (zero-knowledge protocols) are another example of how advanced cryptographic primitives, like group signatures, can provide a certain degree of anonymity.

  246. 246.

    In the Turbine project, this protocol was implemented in a mock up demonstrator permitting biometric access control for a group of pharmacists for particular applications, whereby the identity of the pharmacists was not required but only that the persons concerned belonged to a group of pharmacists and had such professional qualification. Some may require for ‘fully anonymous verification’ that additional requirements are fulfilled, in particular that the sensor is the only device (entity) that receives the sample and transforms it before feeding it into the system. In other words, it should be impossible to link different transactions from the same users (e.g., by the group signature techniques). See also and compare with the AXS passport system for which no biometric data and no civil identity data is disclosed to a service provider using a special protocol. About the AXS passport system, see above § 74 et seq.

  247. 247.

    See also Part I. We would like to add thereto that Korff in his comparative study of 2010 made in the discussion about anonymization, pseudonymization and re-identifiability a point, which is because of the increasing number of biometric data processing for our study more than relevant as well: ‘(…) it is effectively impossible to keep data truly unidentifiable once the basic information is released, even in encoded form, if there are other data sets against which the pseudonymised or anonymised data can be matched (…)’. In his view, the discussion should no longer be about the key to render data anonymous or pseudonymous, but rather about the relevant data set in our ‘new global-technical environment’ where ‘highly sophisticateddata matchingsoftware will be much more readily available, to law enforcement agencies and other public bodies, but also to companies and private individuals’. (Korff, New Challenges to Data Protection. Working Paper N° 2, 2010, pp. 50–51).

  248. 248.

    See and compare with A. Cavoukian and M. Snijder, A Discussion of Biometrics for Authentication Purposes: The Relevance of Untraceable Biometrics and Biometric Encryption, July 2009, 7 p. (‘Cavoukian and Snijder, Untraceable biometrics 2009’).

  249. 249.

    For example, for justifying the increasing police co-operation for the exchange of fingerprints and DNA profiles since the Prüm Treaty (see above), it is defended that since only ‘anonymous’ profiles are compared, where personal data is only exchanged after a ‘hit’, the hit/no hit system guarantees an adequate system of data protection (see recital 18 of the Council Decision on the stepping up of cross-border cooperation).

  250. 250.

    Term 37.03.01 ISO Vocabulary for Biometrics 2012. See and compare with SD2 Version 12 – Harmonized Biometric Vocabulary, term 3.2.2.2.2.1.

  251. 251.

    Metadata are data describing the content of data files.

  252. 252.

    See SD2 Version 12 – Harmonized Biometric Vocabulary, Annex A Other terms, term A.2.6. Reference is made to the Oxford dictionary.

  253. 253.

    A note with the term 37.03.01 does correct the definition to some extent, where the note states that ‘biometric data within the biometric data record ultimately remains attributable to an individual’, hereby stating indirectly that the ‘anonymized biometric data’ remains personal data. Nevertheless, this note does not eliminate the confusion that does exist on this point.

  254. 254.

    This balancing is especially relevant if an application risks to interfere with the fundamental rights of the data subjects and the proportionality of the interference, both under the Directive 95/46/EC, but also under the fundamental rights, as discussed in Part II, shall be reviewed.

  255. 255.

    See Part I, Chap. 2, §§ 109–123.

  256. 256.

    Interferences with fundamental rights of data subjects are only allowed if the interference is ‘relevant and sufficient’ ànd ‘efficient’. See Part II, Chap. 5, §§ 347–350. See and compare with the use of DNA. For example, in relation with the Prüm Treaty as discussed in Part I, it has been criticized that information about the number of ‘matches’ of DNA-data upon automated comparison is often published. However, successful comparisons should be distinguished from successful use because the number of ‘hits’ is hereby not equal to the number of solved crimes.

  257. 257.

    E.g., biometric vendors sometimes use data collected by their clients for further testing purposes. See also below § 169 and footnote 414.

  258. 258.

    About the performance issues of biometric systems, see Part I, Chap. 2, §§ 109–136.

  259. 259.

    NRC, Biometric Recognition, 2010, p. 5.

  260. 260.

    EDPS, Opinion on Turbine, 2011, p. 8, §§ 35–37. See also and compare with the Taiwanese Constitutional Court requiring in relation to the collection of fingerprint for eID cards, to demonstrate the necessity and relevance. About the decision, see also Part II, Chap. 4, footnote 80.

  261. 261.

    Article 5a Regulation No 444/2009. Emphasis on benchmarking and quality control is seemingly a priority in the Indian Unique Identity Project (see also Part I, Chap. 2, footnote 187) as mentioned in Snijder, Crash of zachte landing, 2010, pp. 76–78.

  262. 262.

    WP 29 Opinion 2/2005 on VIS and exchange of data (WP110), p. 12.

  263. 263.

    An example of such reports with technology assessments, includes FBI, State-of-the-Art Biometric Excellence Roadmap (SABER) Report, 24.11.2008, available at http://www.biometriccoe.gov/SABER/index.htm; about such public databases, see below §§ 171–173.

  264. 264.

    Some of these institutes were mentioned in Part I, Chap. 2, §§ 172–176.

  265. 265.

    See Newton, Large-scale evaluation, 2011, slide 8. These rates would be EER. See also E. Newton, Biometrics and Surveillance: Identification, De-Identification, and Strategies for Protection of Personal Data, thesis, 2009, p. 34, available at http://books.google.be/books?id=-B4JYm0-6bAC&printsec=frontcover&hl=nl&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

  266. 266.

    Grother, Quinn and Phillips, Multiple Biometric Evaluation (MBE), 2010, p. 34. About the performance results of face recognition, see also Part I, Chap. 2, § 128. ‘At a FRR of 0.001’ meaning 1 false reject for every 1,000 comparisons. In other words, the results improved from a FNMR of 20 % at a FAR of 0.1 % in 2002 to a FNMR of 0.3 % at a FAR of 0.1 %. About the results of face recognition at the FRVT 2002 competition, see also JRC, Biometrics at the Frontiers, 2005, p. 107. About a general improvement of the error rates for facial recognition, see als the performance results of facial comparison techniques in the 3D Face project (2006–2009).

  267. 267.

    Newton, Large-scale evaluation, 2011, slide 11. In other words, iris have an average of a FNMR of 1,46 % at a FMR of 0.1 %. The results are based on the parameters of the testing, on particular databases with certain number of data subjects, as further mentioned.

  268. 268.

    See also Part I, Chap. 2, § 128 and §§ 131–136. For example, in 2010, a success rate of 91 % was mentioned as a best case result in a proof of concept testing of a face recognition project with identification functionality and privacy protective technology involving up to 20,000 visitors per day. (See A. Cavoukian and T. Marinelli, Privacy-Protective Facial Recognition: Biometric Encryption.Proof of Concept, Information and Privacy Commissioner Ontario, Ontario Lottery and Gaming Corporation, November 2010, p. 13 (‘Cavoukian and Marinelli, Privacy-Protective Facial Recognition, 2010’), available at www.ipc.on.ca).

  269. 269.

    About the concept of protected templates, see below, § 315 et seq.

  270. 270.

    See on this aspect also NRC, Authentication Report, 2003, p. 156. In this report, the example is given of the Walt Disney Company (WDW), using a fingerprint technology system since 1996, testing several biometric (other) technologies over the years, in particular laboratory testing, technology testing, scenario testing and operational evaluation, whereby the controller is able, based on its substantial amount of experience to set a threshold for the performance.

  271. 271.

    See and compare with the discussion on the use of the identification functionality of SIS II in the House of Lords, Schengen Information System II (SIS II). Report with evidence, London, House of Lord, HL Paper 49, 2 March 2007, p. 44 (‘House of Lords, SIS II, 2007’).

  272. 272.

    The use of biometric systems by local or central public governmental authorities, and not controllers of the private sector are not analyzed in particular in this treatise. This is out of the scope of our research.

  273. 273.

    See also Part II, Chap. 6, §§ 614–617. For an example wherein interests are discussed and defined in relation with a (Dutch) ID card, see Hoge Raad, 9.09.2011, LJN BQ4105, mentioned also in Part II, Chap. 4, footnote 42. This impact is sometimes also referred to as an IA (see also above § 7).

  274. 274.

    Other interests (and rights) are the right not to be treated as suspect, although it can be disputed whether this is also a right outside the context of criminal law, and the right to a fair trial.

  275. 275.

    Although error rates cannot be excluded by deploying biometric systems, these error rates are significantly reduced, for example if used in one-to-one comparisons.

  276. 276.

    See also recent legal regulation, confirming to some extent the interest (also at the side of the data subjects) in avoiding ‘excessive waiting time’: Article 1 Regulation (EC) No 81/2009 of the European Parliament and of the Council of 14 January 2009 amending Regulation (EC) No 562/2006 as regards the use of the Visa Information System (VIS) under the Schengen Border Code, O.J. L 35, 4.02.2009, pp. 56–58 (‘Regulation (EC) No 81/2009 amending VIS’).

  277. 277.

    The risks related to the discriminatory use of biometric systems is one aspect that has been discussed in Chap. 4. as well. Strictly speaking, it may not be considered a privacy risk but a specific threat to another fundamental right, i.e. the right to non-discrimination.

  278. 278.

    The interests involved were however not always clearly discussed by the DPAs.

  279. 279.

    These common values of a society will be related to ethical issues.

  280. 280.

    The police or law enforcement authorities have separate interests in the (biometric) data collected by the data controller. Without the intention of being exhaustive, these interests could include the prevention and prosecution of crimes, public security, maintaining public order and/or safety.

  281. 281.

    See e.g., Vedder, van de Wees, Koops, de Hert, Van privacyparadijs tot een controlestaat?, 2007. For example, in case of the use of special methods invading fundamental rights, this shall be submitted for review by independent magistrates as determined by law. See, e.g., in Belgium, the legal framework for intelligence activities, set forth in the Act of 4 February 2010 (see also footnote 49 in Chap. 8 below).

  282. 282.

    We consider such public authority in particular in its relations with citizens or to protect public interests or interests of the State. E.g., the use of biometric systems for the verification of the identity of citizens, when applying for a passport, falls within – what we would call – the domain of public sector use of biometric data.

  283. 283.

    Private sector use by the government of biometric systems would include, for example, the use of a system for verifying the identity of personnel accessing specific places (e.g., office spaces) or for accessing particular databases or services for meeting its obligation of safeguarding the confidentiality of its information. By researching the proportionality principle and the use of biometric data, we narrowed our field to the use of biometric data by actors in the private sector. The results of our research however should in principle however also apply to public entities if they process biometric data in a role which is similar as a private actor, for example, the use of biometric data as employer.

  284. 284.

    A similar migration characterizes also the conception, the development and the use of nuclear energy, first in the military, later for general energy purposes.

  285. 285.

    We refer, by way of example, to the principle of availability, discussed above, §§ 15–19.

  286. 286.

    Other authors and organizations have made a more in depth study of the use of biometric applications in the private sector and some practical cases, to which we refer. See, for the Netherlands, P. De Hert and A. Sprokkereef, The Use of Privacy Enhancing Aspects of Biometrics: Biometrics as PET in the Dutch Private and Semi-Public Domain, Tilburg, TILT, January 2009, 50 p., available at http://arno.uvt.nl/show.cgi?fid=93109 (‘De Hert and Sprokkereef, The Use of Privacy Enhancing Aspects of Biometrics, 2009’). About Germany, see Cehajic and Sprokkereef, Germany, in Kindt and Müller, Fidis, D.13.4, 2009, pp. 74–75; for biometric applications in several Member States, see several of the studies of Unisys Corporation, a main IT system supplier, which opened in Brussels, Belgium, a Biometrics Centre of Excellence in 2006 to serve its clients established in EU countries, e.g., Unisys, Biometrics in Europe: Trend Report, European Biometrics Portal, 2007, 39 p. The overview, however, mainly describes the implementation of biometric data in passports. About the studies of the BEST network, and in particular the survey of existing biometric applications, see below footnote 363.

  287. 287.

    For a more complete overview of large-scale biometric deployments in Europe, we refer to the following study: J. Goldstein, R. Angeletti, M. Holzbach, D. Konrad, M. Snijder, Large-scale Biometrics Deployment in Europe: Identifying Challenges and Threats, P. Rotter (ed.), JRC Scientific and Technical Reports, European Commission JRC – IPTS, Seville, 2008, 135 p. (‘JRC Report Large-scale Biometrics Deployment 2008’). Pages 14–16 give an overview of the deployments (as of end of 2007) on EU and national level.

  288. 288.

    For example, for Eurodac, VIS and SIS II.

  289. 289.

    Art. 8 (4) Directive 95/46/EC. About this article and the Reform proposals, see also Part II, Chap. 6, footnote 199 above. Other exceptions mentioned in Art. 8 (2) are when the ‘processing relates to data which are manifestly made public by the data subject’ (Art. 8 (2) e). For an application of this exception, see Murray v. Express Newspapers & Big Pictures (UK) Ltd [2007] EWHC 1908 in which the High Court in the United Kingdom while accepting that the photograph constituted in its view sensitive personal data (see also Part I, Chap. 3, footnote 182) agreed that the processing was permissible because ‘exposure of that image to the public by appearing in a public place’.

  290. 290.

    For example, the Belgian DPA stated in its annual report for 2008 that it has received more than ever questions about especially biometric data processing in employment relations. See CBPL, Jaarverslag 2008, p. 6. Employers may also collect data from employees of contract partners and from individual independent contractors. We here focus, however, on biometric data collection by the employer from employees.

  291. 291.

    In principle, and as stated, it will not be of importance whether the employer is active in the private or public sector (see, e.g., and compare in the United States which planned to introduce a biometric access card for government officials of the Department of Defense (DOD)), unless in case of specific statutory provisions which apply and provide specific protection to employees in the public sector.

  292. 292.

    Art. 3, Council of Europe, Recommendation No. R(89) of the Committee of Ministers to Member States on the protection of personal data for employment purposes, 18 January 1989, 5 p., available at http://www.coe.int/t/dg3/healthbioethic/texts_and_ documents/Rec(89)2E.pdf; About work councils and some examples in Member States, see, e.g., Kuner, European Data Protection Law, 2007, pp. 277–282, nos 5.101–5.118.

  293. 293.

    See also Part II, Chap. 5, § 478.

  294. 294.

    E.g., in Belgium, where social partners have an important role with regard to regulating working conditions, such as confirmed, for example also in the Collective agreement No. 39 on the information and consultation with regard to the social consequences of the introduction of new technologies of 13 December 1983 (B.S. 8.2.1984); see also the court cases referred to in Part II, Chap. 5, § 380, footnote 418; see also for Germany, where the Association for the Improvement of Trust for Information and Communications technologies (‘Verein zur Fördering der Vertrauenswürdigkeit von Informations- und Kommunikationstechnik’) (Teletrust) provided guidelines in 2005 for the negotiation and agreement with representative organizations upon implementation of biometric systems in a company: A. Albrecht, M. Bromba, G. Hornung, G. Laβmann, and G. Quiring-Kock, “Orientierungshilfe für eine Betriebsvereinbarung beim Einsatz biometrischer Systeme. Arbeitsgruppe 6, Biometrische IdentifikationsverfahrenAKRechtliche Aspekte der Biometrie”, Teletrust, 21 September 2005, 17 p., available at http://www.teletrust.de/uploads/media/TTT-AG_Biometrie-Orientierungshilfe-Betriebsvereinbarung.pdf

  295. 295.

    We advise to review legislation in relation to medical examination of workers and employees as well, under which the collection of health information may be restricted. See, e.g., for France, Part I, Chap. 3, § 367.

  296. 296.

    See, e.g., in Belgium, Collectieve arbeidsovereenkomst No. 68 van 16 juni 1998 gesloten in de Nationale arbeidsraad, betreffende de bescherming van de persoonlijke levenssfeer van werknemers ten opzichte van de camerabewaking op de arbeidsplaats, rendered generally applicable by Royal Decree of 20.9.1998 (B.S. 2.2.1998) – for further reading on this collective labor agreement No. 68, see Van Eecke P. and B. Ooms, ‘De nieuwe wet op de camerabewaking: een analyse van een langverwachte wet na een rechtsonzekere periode’, in Computerrecht 2008, pp. 147–153.

  297. 297.

    See, e.g., in Belgium, Collectieve arbeidsovereenkomst No. 89 betreffende de diefstalpreventie en de uitgangscontroles van werknemers bij het verlaten van de onderneming of de werkplaats, 30.01.2007, 9 p., available on http://www.cnt-nar.be/CAO-COORD/cao-089.pdf The representative organization (‘ondernemingsraad’/‘conseil dentreprise’) shall be informed if such measure of control at the exit is taken. About this Collective Agreement no 89 for prevention of theft and exit-control of employees, see also Blanpain, Labour Law in Belgium, 2010, pp. 171–172. Blanpain states however that systematic controls would be allowed when using ‘electronic detection systems’.

  298. 298.

    These additional rules shall be distinguished from specific data protection legislation for the processing of data of employees in the employment context. For an idea of different data protection obligations for the controllers, see also the Analysis and impact study on the implementation of the Directive 95/46/EC, pp. 12–13.

  299. 299.

    See also CBPL, Advice N° 17/2008 biometric data, § 6. See and compare also with the Type II Security and Access control (authorization) applications suggested and discussed, as well as previous attempts of classification of biometric systems in Kindt, Meints, Hansen and Müller, Control schemes, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007, pp. 55–67.

  300. 300.

    In particular, when the de visu control by colleagues, by receptionists or security personnel prove to be no longer sufficient. Such systems may be no longer effective, e.g., in case of a high number of personnel.

  301. 301.

    See, however the French DPA. Four of the five UAs relate to the use of biometric data in the employment context. The UAs of the CNIL, however, require strict compliance with very specific requirements, including the type of characteristic used. In case of non-compliance, prior authorization remains required. The Belgian DPA does not elaborate in its Opinion N°17/2008 on the specific situation of employees and the use of biometric systems (see CBPL, Advice N° 17/2008 biometric data, § 6, § 71 and §§ 75–76 and Part II, Chap. 5, § 381 et seq.). The Dutch DPA also does not discuss the use of biometric systems in particular for this domain of application.

  302. 302.

    Our reference to safeguards is a reference to the application of the criteria which we described in Sect. 7.2 (use of templates and verification only with local storage), as well as application of the additional criteria which we further explain in Sect. 8.3 of Chap. 8.

  303. 303.

    We have explained this proportionality requirement in detail in Part II, Chap. 5, §§ 242–255.

  304. 304.

    Examples could include access to places with storage of expensive goods, access to critical (financial) information about the company, access to health related data of other employees, access to (images of) (private) surveillance cameras, … See and compare also with advances in technology for behavioral monitoring, e.g., of truck drivers, preventing their ‘falling asleep’ and therefore arguably for aims of public security (see also above Part I). However, such application risks also to be used for surveillance purposes (see below).

  305. 305.

    This includes their reasonable expectations to privacy.

  306. 306.

    See also Part II, Chap. 6, § 554 and footnote 46. On consent in the context of employment, see also WP 29 Opinion Consent (WP187), p. 14.

  307. 307.

    E.g., in case of use of a biometric access control application for enhanced security, same shall not be used for time and attendance control.

  308. 308.

    Several distinct national data protection legislations may apply to one system, if such system and the data processed are used for the activities of several establishments (e.g., offices) (e.g., one in Brussels and one in Paris) of the employer, as determined by the provisions of the applicable national data protection legislation(s) determining the scope ‘ratione loci’ of the legislation.

  309. 309.

    See, e.g., in the Netherlands, where the data protection legislation contains an explicit exception to the prohibition of the processing of racial information for identification purposes (art. 18 (a) Wbp). E.g., the processing of pictures for access control by an employer could fall under this exception. It can be debated what form such law shall take, for example, by a clear opinion of the competent DPA, by a collective labor agreement in some countries, or by a law.

  310. 310.

    See and compare with the reasoning of the French DPA in its 2007 Communication for allowing central fingerprint storage, analyzed in Part II, Chap. 5, §§ 473–478. The CNIL therein refers to restricted access to an area representing or containing a major stake which surpasses the strict interest of the organization (‘un enjeu majeur dépassant linterêt strict de lorganisme’) and, in other documents, to ‘an important security need’ (‘un fort imperative de sécurité’) (see CNIL, Guide pour les employeurs et les salaries, p. 36.). Examples given by the French DPA in this context include access control for protection of the physical integrity of persons to places with explosion risks or with dangerous goods, or with risks of theft of such goods (e.g., access to specific zones of nuclear installations) or for the protection of goods and installations, biometric access control to specific area’s which could cause irreversible and important damages going beyond the strict interest of the organization (e.g., a company engaged in national defense) and access control for information in need for protection in particular because of the risks of divulgation, other use or of destruction.

  311. 311.

    Especially the last legitimate aim for interference will be important for employers. See also Hendrickx, Privacy en Arbeidsrecht, 1999. A general comparison in this regard can also be made with modifications in the French Code of Criminal Procedure facilitating preventive identity control of any other persons whatever their behavior, and in particular for the prevention of disturbing public order, in particular to prevent an attack to the security of persons or goods. The Code of Criminal Procedure however provides that the identity control should be effectuated by the police (See Part II, Chap. 4, §§ 18–19).

  312. 312.

    See and compare, e.g., with legislation imposing upon airport and port infrastructure companies an obligation to protect particular ‘critical’ infrastructure. See also the above Part II.

  313. 313.

    See also Part II, Chap. 5, §§ 347–350. This could, e.g., be demonstrated with the accuracy level and reliability of the system.

  314. 314.

    Under existing regulations, such as the need for prior checking and authorization, this could also be a decision upon the request for authorization.

  315. 315.

    An explicit legal basis remains in our view required for such case. The provision in the data protection legislation that the processing of ‘sensitive data’ is exempted if ‘necessary for the purposes of carrying out the obligations and specific rights of the controller in the field of employment law in so far as it is authorized by national law providing for adequate safeguards’ (emphasis added) (Article 8 (2) (b) Directive 95/46/EC) in fact confirms this. In some countries, where the guidelines of the DPA are sufficiently precise, accessible and foreseeable (see ECtHR jurisprudence on the quality requirements of a legal basis (see Part II, Chap. 5, § 311 et seq.)), these guidelines could arguably by some be defended as to qualify as legal basis (see and compare with the CNIL, Communication central storage fingerprint 2007). However, ‘the law must be sufficiently clear in its terms to give citizens an adequate indication as to the circumstances in which and the conditions’ on which interference with the right to respect for private life and correspondence is allowed (Malone 1984, § 67).

  316. 316.

    In some cases, monitoring employees is also involved (see below). Some could also argue that security is involved in this type of administrative control (in particular the security of, e.g., badges, which, once a biometric system is implemented, can no longer be passed on to others), but this aspect is not decisive and is in our view subordinate to the administrative (and monitoring) aspects. Again, a clear determination of the interests remains crucial.

  317. 317.

    See the French DPA, which in 2006 authorized, by the Unique Authorization N° AU-007 and under strict conditions, the use of hand geometry for access control to the work place, time and attendance and the use of the canteen, but which modified this N° AU-007 in 2012 by excluding the use of such systems for time and attendance after consultation with various labour organizations: ‘[u]n consensus s’est clairement exprimé considérant l’utilisation de la biométrie aux fins de contrôle des horaires comme un moyen disproportionné d’atteindre cette finalité’ (a consensus was clearly expressed considering the use of biometrics for the purpose of time and attendance control a disproportionate means of achieving this purpose). See also Part II, Chap. 5, §§ 439–443 and our critical observations.

  318. 318.

    This shall include, e.g., the prevention of misuse of the data, including impersonation, and of function creep, such as the use for time and attendance or to monitor employees’ positions or activities.

  319. 319.

    See and compare also with the Type V Tracking and tracing (surveillance) applications in the classification suggested and discussed in Kindt, Meints, Hansen and Müller, Control schemes, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007, pp. 55–67. In this group, a public or private authority takes the initiative to collect and process the biometric data for surveillance purposes.

  320. 320.

    See also the decisions of the ECtHR in this regard.

  321. 321.

    As stated, specialized publications for each country shall be consulted for the regulation in place, case law and doctrine on this subject. For such treatise, see, e.g., S. Nouwt, B. de Vries and C. Prins (eds.), Reasonable Expectations of Privacy? Eleven Country Reports on Camera Surveillance and Workplace Privacy, Den Haag, Asser Press, 2005, 363 p.; for Belgium, see, e.g., Hendrickx, Privacy en Arbeidsrecht, 1999, pp. 160–179; see also P. de Hert and M. Loncke, ‘Camera surveillance and workplace privacy in Belgium’, in S. Nouwt, B. de Vries and C. Prins (eds.), Reasonable Expectations of Privacy? Eleven Country Reports on Camera Surveillance and Workplace Privacy, Den Haag, Asser Press, 2005, pp. 167–209 (‘de Hert and Loncke, Camera surveillance, 2005’).

  322. 322.

    E.g., the use of biometric sensors in the driver’s cabin of trucks to monitor the activity of the driver, who shall not fall asleep. The biometric application may in this case combine identity recognition with behavioral analysis (for profiling purposes). See also above. See also and compare with research in, e.g., HUMABIO mentioned in Part I, Chap. 2, footnote 64. The surveillance, however, shall not be secret. See also de Hert and Loncke, Camera surveillance, 2005, p. 174, footnote 20.

  323. 323.

    It includes that there is no alternative system available for reaching the same goal of the employer and that the system is reliable and efficient.

  324. 324.

    Different distinctions of ‘places’ (such as places which are ‘public’, or which are ‘private’, or which are ‘closed’ but ‘accessible to the public’, or ‘closed’ and not accessible to the public but only to customers) are made, sometimes also in legislation, in particular about the use of camera surveillance. The terms used here do not necessarily coincide with the same distinctions in this context (see, e.g., the definitions of article 2 of the Belgian surveillance camera legislation of 21 March 2007 (as modified)); these distinctions, however, still give rise to many questions and difficulties in their application, especially with regard to access to the images, including by law enforcement: see, e.g., an interpretation letter modifying an earlier interpretation letter relating to the aforementioned Act: Ministeriële Omzendbrief van 13 mei 2011 tot wijziging van de omzendbrief van 10 december 2009 betreffende de wet van 21 maart 2007 tot regeling van de plaatsing en het gebruik van bewakingscamera’s, B.S. 20.05.2011, p. 29017.

  325. 325.

    See also the presentation of a pilot, in private-public partnership, of a biometric access control system for fast food places in the Netherlands, using facial images in combination with a black list maintained by the police, mentioned below.

  326. 326.

    For some references to examples in different countries, such as the use for access control to swimming pools, see, e.g., Kindt and Müller, Privacy legal framework for biometrics, D13.4, Fidis, 2009. For field research, see also W. Van Laethem, T. Decorte and R. Bas, Private politiezorg en grondrechten, Leuven, Universitaire Pers Leuven, 1995, p. 87 et seq. and in particular pp. 110–112, where they discuss the use of smart card systems in dancing halls based on their field research.

  327. 327.

    See also the presentation on the use of a biometric access control system for soccer stadia at the Mid-Winter meeting of WICT (Werkgemeenschap Informatie- en Communicatietheorie Benelux) on the topic of public safety, 31 January 2008, TU Eindhoven, Eindhoven, the Netherlands: K. van Woerden, ‘The Happy Crowd Control Concept’, slides previously available at http://www.sas.el.utwente.nl/wic2008mwm/; for the MasterClub systems, deployed by Crazy Daisy, part of a chain of 55 nightclubs in Denmark, based on fingerprint identification in combination with a list of banned people, see also our brief discussion in Chap. 8, at § 262 and the references thereto.

  328. 328.

    See the MasterClub systems described in London Economics, Study on the economic benefits of privacy-enhancing technologies (PETs). Final Report to the European Commission DG Justice, Freedom and Security, July 2010, pp. 117–120 (‘London Economics, PETs, 2010’). See also Part III, Chap. 8, § 262. This view, however, does not address the issue as to how and under which conditions individuals become listed in the database of ‘potential trouble-makers’, which seems subjective if not clearly defined in regulation.

  329. 329.

    See our suggestions in this regard in Chap. 9.

  330. 330.

    Such controllers could also include entities organizing child care. They have an interest in securing access to the day care centers against intruders or unauthorized persons in order to protect the children.

  331. 331.

    See also above, § 107 and Chap. 8, § 323.

  332. 332.

    For the reasons, see Part II, §§ 553–556. Consent would also imply that an alternative system is provided as well in order to assure a free choice. See also our recommendation in this regard in Chap. 9.

  333. 333.

    We discussed the legal aspects of the use of black lists in Part II, Chap. 5, §§ 272–273.

  334. 334.

    The implications of these requirements for biometric systems are explained in Part II, Chap. 5, §§ 347–353.

  335. 335.

    Other safeguards include inter alia additional information and a right to appeal for the data subjects.

  336. 336.

    One could argue that if the list only contains the names (and biometric data) of persons who ‘merely’ breached internal house rules of the place, not amounting to any disorder or crime, such a list could be maintained by the (private) controller, e.g., a list of persons not having paid the membership fees or of people who ‘misbehaved’ (see the Discopas Opinion; in such case, the risks of discriminatory practices are high, and the need for biometric data for such purposes could moreover be questioned) (see and compare also with the Datassur case, where an organization of insurance companies kept a negative list, as discussed in Part II, Chap. 5, § 271). This would, however, not imply that no law is required. We plead that, for this reason, a clear distinction be made between the uses and purposes of the lists in order to be able to determine who shall control such lists. In practice, however, the house rules will in many cases refer to incidents such as disorder or crime. Persons committing such acts shall only be prosecuted by police and law enforcement, and not by controllers in the private sector.

  337. 337.

    See also the clear position of the Belgian DPA in this regard in several opinions, discussed in Part II, Chap. 5, § 272.

  338. 338.

    See also Part II, Chap. 4, §§ 180–184.

  339. 339.

    See Part II, Chap. 6, §§ 624–626. We have briefly mentioned the opinion of the Dutch DPA with regard to a particular system, named ‘VIS 2000’, whereby biometric data would not only be stored on a smart card, used for membership verification, but also in a central way, to check whether the person requesting access is not on a black list of persons who ‘misbehaved’ (see Part II, Chap. 5, §§ 508–509). Images taken by surveillance cameras of ‘troublemakers’ would also be used and compared with biometric data collected from members for identification (see and compare also with the use of a biometric system at the Super Bowl, briefly described in § 163 below). The position of the DPA was that the use of biometric data for marketing purposes was (clearly) not proportionate. The system was submitted for opinion in early 2000, and it should be taken into account that the operation of such systems, as described in the opinion, has since further evolved.

    The Belgian DPA reviewed at that time a similar system, named VIS, and was of the opinion that the use of biometric data was not proportionate. About these diverging opinions, see also Kindt, Biometric applications and the data protection legislation, 2007.

  340. 340.

    For example, in Belgium, specific legislation for guaranteeing public safety during soccer games was adopted, but the use of biometric data was not mentioned therein.

  341. 341.

    On the proportionality test sensu stricto, see Part II, Chap. 5, §§ 340–359; the use of central databases, also for negative identification, however, needs to be carefully considered in such practical cases. The system discussed in the Discopas opinion of the Dutch DPA, for example, deploying black lists but also camera surveillance (see footnote 308), in our view de facto allowed persons to be identified not only (i) if on the black list (negative identification), but also (ii) at the entrance as member, (iii) during presence at the place (by surveillance camera images and comparison with the members list if intended), and (iv) post factum, in case of incidents.

  342. 342.

    The latter is required for such black lists because of the interference caused by the central storage of biometric data, as defended above. When adopting such legislation, the requirements of Article 8 (4) of the Directive 95/46/EC shall also be taken into account.

  343. 343.

    See also below §§ 163–168.

  344. 344.

    In the Netherlands, for example, the first Dutch schools started in March 2008, using biometrics as a key for personnel and parents. Schools use biometric systems in Belgium and France, and since quite some time in the United Kingdom as well. See in this context a parliamentary question of 6 February 2007 of a member to the Education Minister of the Walloon community in Belgium, Maria Arena, about the use of biometric access control in schools in Belgium. See also Kindt and Müller, Privacy legal framework for biometrics, D13.4, Fidis, 2009. About recent plans for the collection of biometric data from very young children, see Part II, Chap. 4, footnote 522. But see China: some mentioned that fingerprinting in schools has been banned in China (see Baroness Walmsley, as mentioned during the parliamentary debate about schools and biometric data in the United Kingdom in 2007, Lords Hansard text, 19.03.2007, Column 1008, available at http://www.publications.parliament.uk/pa/ld200607/ldhansrd/text/70319-0002.htm#0703193000008)

  345. 345.

    See, for a description of a (fictitious) case of the use of fingerprint and voice for both meal administration and payment and banking account administration, the Roadmap of 2003 of the BioVision project (M. Rejman-Greene (ed.), Roadmap for Biometrics in Europe to 2010, BioVision, 15 October 2003, pp. 50–51, previously available at http://www.eubiometricsforum.com/dmdocuments/BIOVISION_Roadmap.pdf (‘Rejman-Greene, Roadmap for Biometrics, 2003’)); compare with the UA of the CNIL for hand geometry and meal administration discussed above. The adoption of legislation on health promotion and nutrition in schools has also been given as a reason for the increased use of biometric systems in schools (see, e.g., in Scotland, the Schools (Health Promotion and Nutrition) (Scotland) Act adopted in 2007).

  346. 346.

    See De Hert and Sprokkereef, The Netherlands, in Kindt and Müller, Fidis, D.13.4, 2009, p. 84. The authors refer to their study P. De Hert and A. Sprokkereef, The Use of Privacy Enhancing Aspects of Biometrics: Biometrics as PET in the Dutch Private and Semi-Public Domain, Tilburg, TILT, January 2009, 50 p., available at http://arno.uvt.nl/show.cgi?fid=93109 (‘De Hert and Sprokkereef, The Use of Privacy Enhancing Aspects of Biometrics, 2009’); see also A. Sprokkereef, ‘Chap. 13. The Introduction of Biometrics in the Netherlands: An Evaluation Under Data Protection and Administrative Law’, in S. van der Hof and M. Groothuis (eds.), Innovating Government. Normative, Policy and Technological Dimensions of Modern Government, The Hague, Asser, 2011, p. 220 (‘Sprokkereef, Chap. 13. Biometrics in the Netherlands, in van der Hof and Groothuis, Innovating Government, 2011’): ‘Originally, these schools never intended acquiring a biometric entry system, and indeed never paid for it: they were offered free trials to use the system’. We fully concur with this finding, based upon personal experience during discussions with heads of schools in the region of Brussels, Belgium.

  347. 347.

    The French DPA has recognized this interest in relation to the use of biometric data for the organization of the GMAT test. See Part II, Chap. 5, footnote 507.

  348. 348.

    See Part II, Chap. 5, § 427 and footnote 506. The use of biometric data by schools, even in a central database, but of personnel, was however positively advised in early decisions of the CNIL (see Part II, Chap. 5, § 428).

  349. 349.

    See Part II, Chap. 5, §§ 451–458.

  350. 350.

    Information Commissioner’s Office, Fingerprinting in schools, available at http://www.ico.gov.uk/for_the_public/topic_specific_guides/schools/fingerprinting.aspx; about the protection of children’s personal data in general, see also the guidelines of the Article 29 Data Protection Working Party: Article 29 Data Protection Working Party, Opinion 2/2009 on the protection of children’s personal data (General Guidelines and the case of schools), WP 160, 11 February 2009, 20 p. (‘Art. 29 Working Party, Opinion 2/2009 children’s personal data, WP160’).

  351. 351.

    Special rules on obtaining the consent of minors (e.g., regarding minimum age) may apply and shall be checked. See also footnote 358 below. One could even defend that consent below a particular age should be excluded by law. See also the Protection of Freedoms Act 2012 in the U.K. which requires the consent of at least one parent, whose consent can in addition be withdrawn at any time (Chap. 2, § 26(3) and § 27(3)).

  352. 352.

    It means that in fact a double legitimate basis would be required as well. In case schools (or the organizing organization(s) behind the schools) deploy biometric applications, consent should be relied upon extremely carefully in order to assure free, specific and informed consent, because minors (and their parents or other custodians) are in a dependent position.

  353. 353.

    See also Art. 29 Working Party, Opinion 2/2009 children’s personal data (WP160), p. 15.

  354. 354.

    For each particular situation, the controller shall be determined. This will often be the school, but could also be an organization of (or behind the) schools, or even a private or public entity organizing the school.

  355. 355.

    However, other legitimate aims, in particular the protection of the person or goods, as rights or interests of others, could be invoked. See and compare with the legitimate aims for the use of camera surveillance (in non-public places).

  356. 356.

    As explained, the Member States selected for our research as to the position of DPAs and their application of the proportionality principle are Belgium, France and the Netherlands. But: see the legislation adopted in the U.K., mentioned in footnote 354 above.

  357. 357.

    See Part II.

  358. 358.

    It is noteworthy that, as long as the child is a minor, the use of the biometric information does according to this Act in principle not depend on the minor’s consent (which could be required in addition to the consent of the legal custodian(s)). See and compare with an opinion of the Belgian DPA, advising to obtain the consent of pupils as well (at least if the latter are able to act in this matter) for the use and publication of images taken in a school environment: CBPL, Advies no 33/2007 inzake de verspreiding van beeldmateriaal, 28.11.2007, pp. 2–3.

  359. 359.

    Act 095-0232 concerning education whereby the School Code was amended, adopted in 2007, available at http://www.ilga.gov/legislation/fulltext.asp?DocName=&SessionId=51&GA=95&DocTypeId=SB&DocNum=1702&GAID=9&LegID=29842&SpecSess=&Session=

  360. 360.

    See and compare also with the Type II Security and Access control (authorization) applications and Type IV Convenience and Personalization Applications of the classification of biometric systems, suggested and discussed in Kindt, Meints, Hansen and Müller, Control schemes, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007, pp. 55–67.

  361. 361.

    The French DPA, e.g., authorized in 2010, by way of experiment, the use of centrally stored fingerprints of patients. The interests invoked are the public health interest in the secure identification of patients and the limitation of human errors. See also CNIL, La biométrie entre à l’hôpital pour identifier des patients traités par radiothérapie, 15.04.2010, available at http://www.cnil.fr/linstitution/actualite/article/article/la-biometrie-entre-a-lhopital-pour-identifier-des-patients-traites-par-radiotherapie-1/

  362. 362.

    For a brief discussion of some practical use cases in this sector, see E. Kindt, D.1.4.3 Practical Guidelines for a privacy friendly implementation of biometric data for identity verification, Paris, Turbine, pp. 35–38, available at http://www.turbine-project.eu/dowloads/TURBINE_KUL_ICRI-D1_4_3_BEST_PRACTICES_R2_3.pdf (‘Kindt, Turbine, D.1.4.3 Best Practices, 2011’). See also Kindt, Best Practices, 2013 [forthcoming], mentioned in footnote 72 in Chap. 8 below.

  363. 363.

    ATM stands for Automated Teller Machine, referring to an automated device allowing customers of a financial institution who identify themselves (usually by inserting their (credit) card) to withdraw (or deposit) cash after a check (online, through an internal and/or external online network) of their personal (banking) details, as well as providing access to other financial transactions. In Poland, the first biometric ATM was installed in 2010. See Best Network, D2.1. Survey of existing (non-governmental applications) and emerging biometric applications, Best Network, 2010, p. 13 (‘Best, Survey, 2010’).

  364. 364.

    Bank organizations may also have an interest as employer in imposing the use of biometric access control applications upon their employees for access to particular (confidential) information. In this case, we refer to §§ 132–135 above.

  365. 365.

    See, e.g., ISO 19092:2008 Financial services – Biometrics – Security Framework.

  366. 366.

    Article 8 (4) of the Directive 95/46/EC.

  367. 367.

    See also above §§ 66–70.

  368. 368.

    E.g., because of the need for central storage.

  369. 369.

    Banking organizations may also invoke customer verification procedures requiring identification in compliance with anti-money laundering legislation (see Part II), but the controller will in that case have to show that the use of biometric data is the only means to reach these aims.

  370. 370.

    Such rights could consist of property rights of others (i.e., banking customers). Even though a banking organization could invoke such a legitimate aim, it should avoid an interference with fundamental rights. The local storage of biometric data, permitting verification only, could in such cases still be defended and will be preferred, unless it is proven under the proportionality check to be insufficient and no other means exist, hence justifying an interference, for example by central storage of biometric data.

  371. 371.

    At the same time, it also serves the commercial interest of the bank by offering an innovative service to (potential) new clients.

  372. 372.

    See, for examples of such use in Germany, in Best, Survey, 2010, pp. 14–15.

  373. 373.

    See also the Belgian DPA which states that the convenience of central storage (one does not need to carry a badge or chip card) does not justify these risks (CBPL, Advice N° 17/2008 biometric data).

  374. 374.

    This protection is needed, since, although all adequate safeguards may be taken as suggested, storage on a device under the control of the data subject would not be possible.

  375. 375.

    The Stockholm programme also mentioned the need to examine the issue of automated border controls and other issues for rendering border management more efficient. See, e.g., European Council, The Stockholm Programme – An open and secure Europe serving and protecting citizens, O.J. C 115, 4.05.2010, p. 27 (about the Stockholm Programme, see above § 18). RT programs should however not be confused with the exchange of Passenger Name Records, presently under discussion again, which in principle does not include biometric data.

  376. 376.

    One of the first was installed in Portugal (Rapid). About Rapid, see also Frontex, BIOPASS II. Automated biometric border crossing systems based on electronic passports and facial recognition: RAPID and Smartgate, Warsaw, Frontex, 2010, 50 p. Currently, several are piloted in many more countries in the EU. For an overview, see Best Network, D3.1. European RT, Inventory of Best Practices, Best Network, 2010, 16 p. (‘Best, European RT, 2010’). ABC systems or ABC egates are to be distinguished from the envisaged entry-exit system (EES) for third country nationals.

  377. 377.

    Best, European RT, 2010, pp. 5–6.

  378. 378.

    See and compare also with the Type III Public/private partnership applications in the classification suggested and discussed in Kindt, Meints, Hansen and Müller, Control schemes, in Kindt and Müller, Biometrics in identity management, Fidis, D.3.10, 2007, pp. 55–67.

  379. 379.

    See also Commission, Smart borders COM(2011) 680 final; for (older) studies of the use of biometric data of travelers, see also Organization for Economic Co-operation and Development, Background material on biometrics and enhanced network systems for the security of international travel, Paris, OECD, DSTI/ICCP/REG(2003)3/FINAL, 23 December 2004, 53 p. (‘OECD, Background material biometrics and international travel, 2004’).

  380. 380.

    See also Hays and Vermeulen, Borderline, 2012. For the various aspects of the privacy risks, such as no use for other purposes, transparency, etc. reference is made to Chap. 4.

  381. 381.

    For an overview of some of these error rates in several RTPs, see Best, European RT, 2010, pp. 9–10.

  382. 382.

    Art. 8(4) Directive 95/46/EC. About this article and the Reform proposals, see also Part II, Chap. 6, footnote 199.

  383. 383.

    See Part I, Chap. 3, §§ 294–297. See A. Acquisti, R. Gross and F. Stutzman, Faces of Facebook: Privacy in the Age of Augmented Reality, Carnegie Mellon University, Black Hat Conference 2011 (Las Vegas), 48 slides, of which the draft presentation is available at http://www.heinz.cmu.edu/~acquisti/face-recognition-study-FAQ/acquisti-faces-BLACKHAT-draft.pdf. The study ran three experiments, including the identification of students based on their Facebook profile photos and the prediction of someone’s preferences and sometimes even one’s social security number (the equivalent of one’s civil identity in civil law countries) based on SNS photos. See also Welinder, A face tells more, 2012. About the findings of this study, see also FTC, Facing Facts, 2012.

  384. 384.

    For example, for hiring activities. See also ECJ, Lindqvist, 2003.

  385. 385.

    Article 29 Data Protection Working Party, Opinion 5/2009 on online social networking, WP163, 12 June 2009, 13 p. (‘WP 29 Opinion on online social networking 2009 (WP163)’).

  386. 386.

    This existing exemption under Directive 95/46/EC (which is maintained in a slightly different wording in the Reform Proposals – see Part I) remains in our view useful (a comparison could be made with exemptions for private use under copyright law in some legal systems). It can therefore be argued that exemptions for unstructured materials or unstructured processing of personal data, such as under Swedish data protection legislation, are not essential for the processing of personal data in electronic communications in this context. Furthermore, one could also question whether the same level of protection remains guaranteed, since searches will increasingly be possible even if the processing or the data are not structured. About this exemption under Swedish law, see also Kosta, Unravelling consent, 2011, p. 318.

  387. 387.

    We would esteem that the same conclusions apply to the upload and use (including the use of ‘tagging’ tools) of images on sites for the creation of digital photo albums with limited accessibility, for personal use or household members only (for example, by reserving access by the use of a password). Whether or not the use for ‘purely personal or household activities’ can be defended, however, will depend much on whether access is granted to others than household members (e.g., friends) (although it could be defended that making the images available to selected friends is also covered, this remains less certain), on the factual operation and also on the terms and conditions of such sites. If users cede rights in the images to the owner of the digital album software and/or website and have no control over their images (for example, because these are not deleted once used for the (digital) albums), the household exemption may no longer apply, as it needs to be further ascertained and reviewed in that case whether the owners of such software and sites are likely to become controllers, to whom all obligations of the Directive 95/46/EC would apply.

  388. 388.

    WP 29 Opinion on online social networking 2009 (WP163), p. 6; for some critical comments, see Van Eecke and Truyens, Privacy en sociale netwerken, 2010, pp. 120–122; see also B. Van Alsenoy, J. Ballet, A. Kuczerawy, J. Dumortier, ‘Social networks and web 2.0: are users also bound by data protection regulations?’, Identity in the Information Society, 2009, pp. 65–79.

  389. 389.

    See, e.g., Data Protection Commissioner, Facebook Ireland Ltd. Report of Re-Audit, 21.9.2012, p. 50, referring to the issue that employers were requiring employees to administer Business Pages on Facebook. See also WP 29 Opinion on online social networking 2009 (WP163), p. 6: ‘A growing trend of SNS is the “shift from ‘Web 2.0 for fun’ to ‘Web 2.0 for productivity and services’” where the activities of some SNS users may extend beyond a purely personal or household activity, for example when the SNS is used as a collaboration platform for an association or a company’.

  390. 390.

    See and compare also with the concept of ‘closed circle’ or ‘closed group’ in discussions about exemptions from copyright protection. On this issue, see, for a comparative overview, e.g., G. Mom, ‘Uitvoering in (strikt) besloten kring’, AMI 2010, pp. 81–91.

  391. 391.

    See, about such an invitation by an Israeli company in 2009, Part I, Chap. 3, § 297. Such a company may, depending on the particular circumstances and conditions, in such case become the entity determining the purposes and the means, and hence become controller. Insofar as the processing is not transparent, the processing will not be fair, not legitimate and even illicit.

  392. 392.

    See and compare with ECJ, Lindqvist, 2003. The upload of the images on the platform would, in accordance with the interpretation by the Court of Justice in Lindqvist, not constitute a transfer of data.

  393. 393.

    See also and compare with a decision of 12.03.2003 of the Italian DPA, concerning the taking of pictures using mobile devices with embedded camera for multimedia messaging services, available at http://www.garanteprivacy.it/garante/doc.jsp?ID=1672134

  394. 394.

    See and compare also with the solutions suggested for the use of body scanners at airports, as discussed below. In other applications, such alternative measures could consist of the possibility of omitting or blurring faces in pictures uploaded on the SNS, of removing one’s own images, etc. In the same sense, see WP 29 Opinion on facial recognition 2012 (WP192), p. 6. It shall be noted that the Article 29 Working Party further makes a distinction for the legal basis between the consent of the image uploader, combined with safeguards, and a legitimate interest completed with more safeguards for the processing of images of other individuals who appear in the image (p. 6). See also and compare with FTC, Best Practices 2012. As already mentioned, in the Proposal for General Data Protection Regulation 2012, more strict conditions for consent are proposed, including the right to withdraw consent and the rule that consent is an insufficient legal basis in case of a ‘significant imbalance’.

  395. 395.

    This is also stressed by the Article 29 Working Party in its Opinion 02/2012 on facial recognition in online and mobile services. See also Article 29 Data Protection Working Party, Opinion 1/2010 on the concept of “controller” and “processor”, WP169, 16.02.2010, p. 21 and example 12. See also EPIC et al., Complaint In re Facebook, 2011; see the DPA of the state of Hamburg, which sent in August 2011 a letter to Facebook requiring it to disable face recognition software for photo-tagging and to delete previously stored data. The DPA estimates that 75 billion images have been uploaded to the SNS, that 450 million individuals have been tagged, and that Facebook is hence creating the world’s largest biometric database. The FTC also reported that in a single month in 2010, 2.5 billion photos were uploaded to Facebook (see FTC, Best Practices 2012, p. 4). See also the press release of 19.08.2011 of the German DPA for Schleswig-Holstein (ULD) on Facebook (available (also in English) at https://www.datenschutzzentrum.de/presse/index.htm), the reply of Facebook of 16.09.2011, available at https://www.datenschutzzentrum.de/facebook/kommunikation/20110916_Facebook_english.pdf, and the reply of ULD of 5.09.2011, available at https://www.datenschutzzentrum.de/facebook/kommunikation/20110905_ULD_english.pdf. Similar investigations against Facebook were started by the DPA of Ireland. The Article 29 Working Party also announced in June 2011 the start of an investigation. See Ch. Albanesius, ‘Regulators Eyeing Facebook Facial Recognition’, 8.06.2011, PCMag.com, available at http://www.pcmag.com/article2/0,2817,2386621,00.asp. Compare also with the automatic upload, announced in June 2011 for Apple’s iCloud service, of images from consumer devices to Apple services, in EPIC et al., Complaint In re Facebook, 2011, § 114. In October 2012, Facebook reached the number of 1 billion users.

  396. 396.

    Some could argue that the recommendations not to use the identification functionality as well as the prohibition to store samples would in that case require exemptions. Because of the risks, even involving SNS, this should be carefully reviewed. See also the EDPS, warning of this risk in EDPS, Annual Report 2010, p. 67: ‘The combination of the brute force of millions of social network users “armed” with smart mobile devices uploading photos on which they tag faces of individuals dramatically expands the scope of face recognition technology and even contributes to its improvement. This new emerging trend might also allow the creation of unprecedented large biometric databases from social network platforms’. Approving the need for a gradual evolution of the concept of privacy, due to the ‘increasing massification of our society’, see also Lemmens, Het recht op eerbiediging van het privé-leven, Liga voor Mensenrechten, 1989, p. 20, as cited in Part I, Chap. 3, footnote 631.

  397. 397.

    See V. Chachere, ‘Biometrics Used to Detect Criminals at Super Bowl’, 13.02.2001, available at http://abcnews.go.com/Technology/story?id=98871; Th. Greene, ‘Feds use biometrics against Super Bowl fans’, The Register, 7.02.2001, available at http://www.theregister.co.uk/2001/02/07/feds_use_biometrics_against_super/

  398. 398.

    See K. van Woerden, The Happy Crowd Control Concept, 31.01.2008, Werkgemeenschap voor Informatie- en Communicatietheorie, conference ‘Public Safety’, Eindhoven, slides previously available at http://www.sas.el.utwente.nl/wic2008mwm/PresentatieVanWoerden.pdf

  399. 399.

    For other examples of such private-public sharing of biometric information, see also Sprokkereef, Chap. 13. Biometrics in the Netherlands, in van der Hof and Groothuis, Innovating Government, 2011, p. 220; governments are also considering deploying, or already deploy, biometric data, in particular facial images, for surveillance and law enforcement purposes in public places. This type of use is to be distinguished from our subject as such.

  400. 400.

    For example, the mayor of the city where the event takes place, but also police and law enforcement authorities.

  401. 401.

    See also and compare with proposals and some initiatives to use the eID for access to specific events, e.g., by youth. About the use of black lists, see also NRC, Authentication Report, 2003, pp. 180–181. Another example of a legitimate interest is to ‘blacklist’ data subjects at their own request.

  402. 402.

    This practical case is in fact also closely related to the issue of the deployment of black lists, as briefly discussed in Part II, and which, as we demonstrated, requires a legal basis, also for biometric applications.

  403. 403.

    See footnote 407 below. See also and compare with legislation imposing a prohibition to enter a soccer stadium (e.g., in Belgium, the Act of 21 December 1998 relating to the security at soccer games (B.S., 3.2.1999) modified later by the Acts of 10 March 2003, 27 December 2004 and 25 April 2007, and in which all conditions are carefully outlined).

  404. 404.

    Art. 8(4) Directive 95/46/EC. A ‘decision of the supervisory authority’ is also mentioned in the Directive 95/46/EC for this case. However, for the reasons set out before, a law would, among other advantages, provide more legal certainty. About this article and the Reform proposals, see also Part II, Chap. 6, footnote 199.

  405. 405.

    This view, however, does not seem to be always maintained. See, e.g., in the Netherlands, where the Dutch DPA proposed to take a positive decision on the maintenance of a black list of guests and visitors having disturbed order or committed criminal offences in the past by a commercial organization for the registration of unwanted guests (‘Bureau Ongewenste Gasten Registratie’ or ‘BOGR’), established by members of hotel and restaurant facilities and security bureaus. See CBP, Bekendmaking ontwerpbesluit voorafgaand onderzoek Protocol Bureau Ongewenste Gasten Registratie, 27.05.2011, available at http://www.cbpweb.nl/Pages/med_20110527_ontwerpbesluit_bogr.aspx. The information processed and maintained by the organization includes information ‘which may allow to deduce’ data relating to offences and criminal convictions (‘strafrechtelijke gegevens’) (see CBP, Explanation to the Protocol, p. 9) and data relating to illicit or disturbing behavior (‘gegevens over onrechtmatig of hinderlijk gedrag’). See and compare with Article 8 (5) of the Directive 95/46/EC, which requires that the processing of data relating to offences and criminal convictions shall only be carried out ‘under the control of official authority’ or as determined by ‘national provisions providing suitable safeguards’, subject to notification to the EU Commission. It can be disputed whether an authorization of the DPA fits this requirement. See also and compare with European Commission, Proposal for General Data Protection Regulation (COM(2012)11 final), Article 9(j).

  406. 406.

    Article 34 §1 para. 2 Act on the Police Function.

  407. 407.

    See Article 34 § 3 of the Act on the Police Function. Art. 34 § 2 of the Act on the Police Function also allows police officials to perform identity controls of individuals wanting to access a place subject to a threat as specified, in particular public gatherings which constitute a realistic threat to public order or where the public order is threatened. See also Part II, Chap. 4, § 13.

  408. 408.

    See Part II, Chap. 4, § 18.

  409. 409.

    See Part II, Chap. 4, §§ 16 and 19. See also and compare with the U.K. Protection of Freedoms Act 2012, art. 63D(5) allowing ‘speculative search’ with fingerprints or a DNA profile.

  410. 410.

    College van Procureurs-generaal, Aanwijzing bestrijding van voetbalvandalisme en -geweld, 2010A023, 11 October 2010, see 2.2.2, available at http://www.om.nl/organisatie/beleidsregels/overzicht/openbare_orde/@152577/aanwijzing_0/

  411. 411.

    See and compare with this explicit requirement according to the proportionality principle in Union law, as set out in Article 5 of the Protocol on the application of the principles of subsidiarity and proportionality (see also Part II). Another aspect is the uncertainty to what extent processing operations for public security are within the scope of the Directive 95/46/EC (see Art. 3.2). See on this issue also Part I, Chap. 3, §§ 228–229.

  412. 412.

    See also the case Scheichelbauer of 16.12.1970, cited in J. Velu and R. Ergec, La convention européenne des droits de l’homme, Brussels, Bruylant, 1990, p. 425. In Germany, e.g., field tests were conducted in 2007 in order to determine the effectiveness and results of face recognition. The tests showed that the performance of face recognition in a real environment was rather poor (see Part I, Chap. 2, § 128).

  413. 413.

    WP 29 Opinion 2/2005 on VIS and exchange of data (WP110), p. 12.

  414. 414.

    These data are collected from data subjects by companies using or buying biometric systems from developers or suppliers of biometric systems. In some cases, it seems that biometric pilots in schools are set up at very advantageous rates or for free for these schools (see also footnote 346 above), whereby data of the pupils are collected. It is not clear whether the data collected from the pupils in such cases are further used for research by the vendors selling and setting up these pilots, for example for the fine-tuning of algorithms.

  415. 415.

    In case suppliers of biometric systems use the biometric databases of clients, this consent is less certain.

  416. 416.

    Only in particular circumstances, for example, if a data subject would express the wish to be removed from the list, such functionality may be used if needed. However, other means may be available, such as determining the data to be removed by a code number referring to the data subjects.

  417. 417.

    Protection against copying, re-use, identity theft, …

  418. 418.

    For example, as an exception for research databases to a general prohibition to store biometric data in central databases. See also below, § 175 and § 399. See and compare also with European Commission, Proposal for General Data Protection Regulation, COM(2012)11 final, Article 9(i) and Article 83.

  419. 419.

    For FVC 2002, see http://bias.csr.unibo.it/fvc2002, where the database is available in the DVD included in the publicly available handbook on fingerprint recognition mentioned below.

  420. 420.

    See the official website from the Biolab, University of Bologna, available at http://bias.csr.unibo.it/fvc2000/

  421. 421.

    The databases are also available with the purchase of treatises on biometrics in any regular bookstore. See Maltoni, Maio, Jain and Prabhakar, Handbook Fingerprint, 2009. In this Handbook, the finality of the distribution and the use of the fingerprint images is described as ‘to allow interested readers to evaluate various modules of their own fingerprint recognition systems and to compare their developments with the state-of-the-art algorithms’ (see Preface).

  422. 422.

    Generally, scientific publications report the biometric performance in terms of False-Accept-Rates (FAR) and False-Reject-Rates (FRR) according to ISO standards, including, e.g., ISO 19795-1:2006 Information technology – Biometric performance testing and reporting – Part 1. In order to make publications and their algorithms comparable, testing is conducted on public databases as mentioned; see also, about one of such (first) ‘performance competitions’, e.g., D. Maio, D. Maltoni, R. Cappelli, J. Wayman, A. Jain, FVC2000: Fingerprint Verification Competition, 43 p.

  423. 423.

    See e.g., NIST Special Database 4, commercially available at http://www.nist.gov/srd/nistsd4.cfm

  424. 424.

    See, e.g., NIST Special Database 29, commercially available at http://www.nist.gov/ts/msd/srd/nistsd29.cfm. Before the FVC databases, the NIST databases were the only large public-domain fingerprint datasets which could be used for benchmarking. More recently, the purchase of various NIST databases can also be made online at the official website of the aforementioned governmental agency. See also NIST, Multiple Biometric Grand Challenge, available at http://www.nist.gov/itl/iad/ig/mbgc.cfm

  425. 425.

    For example, Article 13(2) of the Directive provides for an exemption, by legislative measure, to the right to access, if data is solely processed for purposes of scientific research or for creating statistics (Article 13 (2) Directive 95/46/EC). For Belgium, see, e.g., Chapter II of the Royal Decree of 13 February 2001.

  426. 426.

    See Part I, Chap. 3, §§ 220–225. See, as discussed, about the importance of the purpose(s) of the use of the data, WP 29 Opinion personal data (WP136), p. 16.

  427. 427.

    See also EDPS, Opinion on Turbine, 2011, p. 10, § 46.

  428. 428.

    See, e.g., for Belgium, the Royal Decree of 13 February 2001, Art. 3, imposing an obligation to render personal data that are further used for scientific purposes anonymous. Other safeguards often recommended are restricted access to the data and functional separation, whereby the data, such as biometric data, would be stripped of direct identifiers, such as names, and whereby the data and the other identifiers are stored in separate places.

  429. 429.

    Article 29 Data Protection Working Party, Working Document on Genetic Data, WP91, 17 March 2004, p. 11 (‘WP 29 Working document genetic data (WP91), 2004’).

  430. 430.

    This should also be kept in mind in case some would defend the use of biometric data collected by governments for other purposes, such as research. A rule that it is acceptable to re-use such data if anonymized can in our view in principle not be applied to biometric data. Such re-use, if any, should for this reason, if proportionate, be subject to specific legislation as well.

  431. 431.

    This presumably also applies for private and proprietary research databases.

  432. 432.

    Data protection legislation of the countries such as Belgium and the Netherlands do provide an exemption to the information obligation in such case. For Belgium, see exemption a) (use for scientific research) of information obligation for personal data not obtained directly from the data subject as set forth in Article 9 §2 of the Data Protection Act of 1992 (as modified). For the Netherlands, see Article 34.4 of Data Protection Act 2000, and the Code of Conduct for the use of personal data for scientific research, approved by the Dutch DPA (Article 3.7.2).

  433. 433.

    For an example of an authorization by the French DPA, see CNIL, Délibération N° 2010-336 du 22 juillet 2010 autorisant la mise en oeuvre par la société Morpho d’un traitement de données à caractère personnel ayant pour finalité principale la recherche sur les algorithmes biométriques, available at http://legimobile.fr/fr/cnil/del/aut/2010/2010-336

  434. 434.

    E.g., in the Netherlands. See Article 30 of the Exemption Decree of 7 May 2001 combined with Article 29.1 of the Dutch Data Protection Act and the Code of Conduct for the use of personal data for scientific research, approved by the Dutch DPA (Article 3.8).

  435. 435.

    This is required since biometric data, especially when samples are used, are sensitive data. See also and compare with European Commission, Proposal for General Data Protection Regulation (COM(2012)11 final), Article 10, which may be of relevance for public biometric databases for research purposes.

  436. 436.

    Many safeguards that we suggest in our last Chap. 9, such as local storage, the storage of templates only or the use of privacy-enhancing technologies, cannot be applied in this case. Interference, however, could possibly be assumed for data subjects, even if they consent to participate, and hence there is a need to protect their rights and freedoms.

  437. 437.

    See also above § 170. See also the Opinion of the EDPS on Turbine, in which it was stated that, because of the specific risks to the rights and freedoms of the data subjects, it is necessary for the controller obtaining publicly available databases to verify whether the biometric data of such publicly available databases have been ‘collected in compliance with the national regulatory framework’ and whether the DPA ‘has issued an opinion/authorization on the legality of the database’ (see EDPS, Turbine Opinion, 2011, p. 10, § 47). In the paragraphs thereafter, however, the EDPS seems to say that the controllers should obtain guarantees that such databases are ‘legally compliant’ with the national laws where the initial controller is established (see § 49). We do not agree with this view and (far-reaching) recommendation, however. If controllers receive (biometric) data, they should comply with the national data protection legislation applicable to their (own) research activities, without being obliged to actively investigate compliance by previous controllers established in another Member State from whom the data are received. This would be different only if subsequent controllers would err in assuming that the previous controller complied with the national data protection legislation applicable to that previous controller (e.g., in case of indications of fraud, crime, etc.).

  438. 438.

    As we have explained in Part I, this example of processing of biometric data may not fall within the scope of the Directive 95/46/EC.

  439. 439.

    Art. 3.2 Directive 95/46/EC.

  440. 440.

    However, it should be noted that family members (e.g., children) also have fundamental rights vis-à-vis other members (e.g., parents).

  441. 441.

    See also the N°AU-0027 of the French DPA, discussed in Part II.

  442. 442.

    A decision whether a processing is for a purely personal or household activity should in principle be taken by the individual – natural person. This also implies that the individual has ‘full control’ over the system and the data. See also Part I, Chap. 3, § 230 et seq.

  443. 443.

    A recent study showed that organizations are planning to allow staff to bring their own ‘new technologies’, such as smart phones and tablets, to work, and even to provide IT support for such employee-owned computing devices. See C. Saran, ‘Bring your own devices to work is the future of desktop computing’, in Computerweekly.com, 20.09.2011, available at http://www.computerweekly.com/Articles/2011/09/20/247938/Bring-your-own-devices-to-work-is-the-future-of-desktop.htm. The major driver is not fully clear to us. If these devices are used for mixed (personal and professional) purposes, the data protection obligations will apply if biometric data are processed for securing such devices.

  444. 444.

    About use in the Netherlands, see R. Kaspersen, ‘Lange vingers’, Computerrecht 2008, p. 184. In many cases, however, it seems that the projects are still pilots. In the Netherlands, it seems that some have been interrupted (see Sprokkereef, Chap. 13. Biometrics in the Netherlands, in van der Hof and Groothuis, Innovating Government, 2011, p. 220).

  445. 445.

    The disadvantages, however, are the exclusion of the possibility to pay anonymously and the risks of impersonation (see also Part II on the risks and on this issue).

  446. 446.

    About this use case and our further analysis, see also above § 152.

  447. 447.

    See, e.g., the United States, and in particular the USA PATRIOT Act and the Enhanced Border Security and Visa Entry Reform Act 2002, adopted shortly after the events of 9/11. See Part I, Chap. 2, § 165; about the US-VISIT program, see also M. Hoernlein, ‘United States Visitor and Immigrant Status Indicator Technology Program’, in W. Coats, A. Bagdasarian, T. Helou and T. Lam (eds.), The Practitioner’s Guide to Biometrics, Chicago, American Bar Association, 2007, pp. 37–47; see also Part I, Chap. 2, § 142 et seq.

  448. 448.

    See also, about The Hague Programme, Part I, Chap. 2, § 145. See also the Stockholm Programme, emphasizing inter alia the need for automated border control management.

  449. 449.

    For the full reference, see Part I, Chap. 2, footnote 201. This Regulation has furthermore been extensively discussed and analyzed in several publications, including, for example, A. Juels, D. Molnar and D. Wagner, ‘Security and Privacy Issues in E-passports’, Proc. 1st Intl. Conf. on Security and Privacy for Emerging Areas in Communications Networks, Los Alamitos, IEEE Computer Society, 2005, 14 p., also available at http://eprint.iacr.org/2005/095.pdf (‘Juels, Molnar and Wagner, E-Passports, 2005’); G. Hornung, ‘The European Regulation on Biometric Passports: Legislative Procedures, Political Interactions, Legal Framework and Technical Safeguards’, SCRIPTed 2007, pp. 246–262, available at http://www2.law.ed.ac.uk/ahrc/script-ed/vol4-3/hornung.asp; Meints and Hansen, Study on ID Documents, Fidis, D.3.6, 2006, p. 49 et seq.

  450. 450.

    For the technical specifications, see Commission Decision of 28 June 2006 laying down the technical specifications on the standards for security features and biometrics in passports and travel documents issued by Member States (C(2006)2909 final, not published in the Official Journal). It addresses the primary biometric (face), the secondary biometric (fingerprints), storage media, electronic passport chip layout, data security and integrity issues, and conformity assessment. E.g., a public key infrastructure (PKI) is used for authentication purposes of the data stored in the chip of the ePassport.

  451. 451.

    See also WP 29 Opinion on Implementing Regulation No 2252/2004 (WP112).

  452. 452.

    G. Avoine, K. Kalach and J.-J. Quisquater, ‘Belgian Biometric Passport does not get a pass… Your personal data are in danger’, available at http://www.uclouvain.be/crypto/passport/index.html; see also E. Kindt, ‘Belgisch biometrisch paspoort onveilig’, Computerrecht 2007, pp. 221–223.

  453. 453.

    The weakness of Belgian biometric passports, however, is considered worse, because the information needed to read the chip (the two coded lines at the bottom of the first page, containing the birth date, the expiry date and the passport number) can be guessed in about one hour with a search of all possible combinations if the date of birth and the date of expiry are known. The reason is that the passport numbers are issued in increasing order and are linked to the language, and that the passports are only valid for 5 years, thus limiting the possible combinations to be ‘guessed’.

  454. 454.

    JRC Report Large-scale Biometrics Deployment 2008, p. 83.

  455. 455.

    BAC is a recommendation of the International Civil Aviation Organization (ICAO). It has been imposed upon the EU Member States for the issuance of passports.

  456. 456.

    See also the assessment by Fidis researchers of the (failing) security architecture of ePassports, resulting in Fidis, Budapest Declaration on Machine Readable Travel Documents, 2006, available at http://www.fidis.net/press-events/press-releases/budapest-declaration/#c1307. The major shortcomings identified by the group included the lack of a revocation mechanism, improper keys (and key management) for accessing the personal data stored on the chip, and risks of eavesdropping when data are read out, putting ‘the security and privacy of European citizens at significant risk’. The protocol, however, is effective against simple skimming attacks if the attacker does not know much about the victim. See W. Fumy, ‘Machine Readable Travel Documents’, in W. Fumy and M. Paeschke (eds.), Handbook of eID Security, Erlangen, Publicis, 2011, (94), p. 101 (‘Fumy, MRTD, 2011’).

  457. 457.

    Such readers use PKI certificates to be authorized by the chip to access the stored data.

  458. 458.

    JRC Report Large-scale Biometrics Deployment 2008, pp. 81–82; see also Fumy, MRTD, 2011, pp. 102–106.

  459. 459.

    While terminal authentication is meant to be mandatory for accessing data groups which are optional, such as fingerprint, ICAO specifications require that mandatory data groups, such as the facial image, must remain readable without EAC. See Fumy, MRTD, 2011, p. 103; about the security measures in the ePassports, and several related aspects, see also Meints and Hansen, Study on ID Documents, Fidis, D.3.6, 2006; see also Frontex, Operational and Technical security of Electronic Passports, Warsaw, Frontex, 2011, 189 p.

  460. 460.

    By comparison, this would reportedly have been said by a representative of the United States’ Department of Homeland Security referring to the U.S.-VISIT system. See JRC, Report Large-scale Biometrics Deployment 2008, p. 64.

  461. 461.

    See, for such questions asked by the Committee on Citizens’ Freedoms and Rights, Justice and Home Affairs (LIBE committee) on the implications of biometrics for the future everyday life of citizens, and the answers provided by the study, JRC, Biometrics at the Frontiers, 2005, pp. 131–133. In 2012, however, the Parliament started to ask several questions and debated the biometric passport. See, e.g., Parliamentary questions, 6.3.2012, Subject: Biometric passports, O-000052/2012, available at http://www.europarl.europa.eu/sides/getDoc.do?type=OQ&reference=O-2012-000052&language=EN

  462. 462.

    See, for example, P. De Hert and W. Schreurs, ‘Legal Grounds for ID Documents in Europe’ (sections 4.1.1–4.1.5), in M. Meints and M. Hansen (eds.), D.3.6 Study on ID Documents, FIDIS, 2006, (40), pp. 60–62.

  463. 463.

    European Commission, Proposal for a Regulation of the European Parliament and of the Council amending Council Regulation (EC) No 2252/2004, COM(2007) 619 final, 18.10.2007, available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2007:0619:FIN:EN:PDF

  464. 464.

    Ibid., p. 2.

  465. 465.

    The proposal also introduced the principle of ‘one passport-one person’ as an additional security measure, as recommended by the International Civil Aviation Organization (ICAO). This would ensure that the passport and the biometric features are only linked to the person holding the passport and could help combat child trafficking by requiring children to have their own passport with their own biometric identifiers.

  466. 466.

    EDPS, Opinion of 26 March 2008 on the proposal for a Regulation of the European Parliament and of the Council amending Council Regulation No 2252/2004, O.J. C 200, 6.08.2008, p. 1, available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2008:200:0001:0005:EN:PDF

  467. 467.

    See, e.g., the legislative resolution of the European Parliament on the amended proposal (including raising the age from six to twelve) of 14 January 2009; for the final Regulation: Regulation (EC) No 444/2009 of the European Parliament and of the Council of 28 May 2009 amending Council Regulation (EC) No 2252/2004 on standards for security features and biometrics in passports and travel documents issued by Member States, O.J. L 142, 06.06.2009, pp. 1–4 (‘Regulation (EC) No 444/2009’).

  468.

    See also Kindt and Müller, Privacy legal framework for biometrics, Fidis, D.13.4, 2009.

  469.

    The Decree 2008-426 modified the decree N° 2005-1726 of 30 December 2005 relating to passports; for the advice of the CNIL about the proposed modifications, see CNIL, Délibération n° 2007-368 du 11 décembre 2007 portant avis sur un projet de décret en Conseil d’Etat modifiant le décret n° 2005-1726 du 30 décembre 2005 relatif aux passeports électroniques, available at http://www.cnil.fr/documentation/deliberations/deliberation/delib/130/; about France and the biometric ePassport and the legislation adopted, see also E. Kindt and F. Coudert, ‘France’ in E. Kindt and L. Müller (eds.), D13.4. The privacy legal framework for biometrics, Frankfurt, FIDIS, May 2009, pp. 52–67.

  470.

    Article 6-1 of the Decree N° 2005-1726 provides for the taking of eight fingerprints upon application for an electronic passport and Article 18 allows an automated processing called ‘TES’ for the issuance, delivery, renewal and revocation of passports and for the prevention and detection of false passports. Article 19 of the decree N° 2005-1726 further provides for the storage of the digital image of the face and the fingerprints. Article 21 provides for access to the information stored on the chip of the passport for identity control and control of the authenticity of the passport by the police and Article 21-1 provides for access to the central database, excluding the facial images and the fingerprints, for police and intelligence services for specific cases in the sphere of the fight against terrorism after due authorization by the head of the police or the intelligence service. Article 23 provides for interconnection with the information systems of Schengen and Interpol, but only based on alphanumerical data, such as the numbers of stolen or lost passports.

  471.

    Conseil d’Etat, N° 317827, 317952, 318013, 318051, 26 October 2011. The central database, called TES, however, was upheld.

  472.

    See inter alia Article 3 para. 3 of the Act of 26 September 1991, which now states that a travel document is provided with a facial image (‘gezichtsopname’), two fingerprints and the written signature of the holder, according to further rules to be stipulated by the Minister. See the Act of 26 September 1991 containing the rules for the issuance of travel documents (Passport Act 1991), as modified by the Act of 11 June 2009 modifying the Passport Act in relation to the modification of the travel document administration, the latter published in Stb. 2009, 252, also available at https://zoek.officielebekendmakingen.nl/stb-2009-252.html. The consolidated version of the Passport Act 1991 (version as of July 2009) is available at http://wetten.overheid.nl/BWBR0005212/geldigheidsdatum_23-07-2009/afdrukken

  473.

    The central storage of two fingerprints to be determined by further regulation (and other than the prints which are provided in the travel document itself) in the ‘travel document administration’ (‘reisdocumentenadministratie’) is provided for in Article 4 a para. 2 b of the Act of 11 June 2009. The purpose of the ‘travel document administration’ is described in the same Act as ‘providing these data as mentioned in Article 4 a para. 1 to the authorities competent on the basis of the same Act for the execution of the Act and insofar as necessary for such execution’ (Article 4 b para. 1 of the Act of 11 June 2009).

  474.

    For example, if and to what extent the public prosecutor will have access to the data for other purposes. E.g., in the Netherlands, the State minister responsible for the new ePassports mentioned in the public hearing with the Senate that the Public Prosecutor shall only use the biometric data of the database to verify whether a suspect and the holder of the passport are the same person. It was stated that the Public Prosecutor will not have access to the central database and that the biometric database shall not be used for data mining; but: for legal provisions allowing the use of fingerprint data for the investigation of crimes, see § 188 below.

  475.

    Snijder, Crash of zachte landing, 2010, p. 131.

  476.

    X., ‘Vingerafdrukken op chip van paspoort in 2012’, De Standaard, 12.10.2011, available at http://www.standaard.be/artikel/detail.aspx?artikelid=DMF20111012_011. The introduction has been postponed several times. See, e.g., D. Reijerman, ‘Belgen krijgen medio 2010 eerste biometrische paspoorten’, 14.09.2009, available at http://tweakers.net/nieuws/62439/belgen-krijgen-medio-2010-eerste-biometrische-paspoorten.html. Modalities remain unclear. In this press article of Reijerman, it was stated, e.g., that for the Belgian ePassports the signature, written on a digital pad, may be collected from the applicants as well, besides the fingerprints and facial image. Facial images of identity cards, however, are already centrally stored. See, for the amendment to proposed legislation introducing this central storage without debate, footnote 197 in Chap. 8 below.

  477.

    See, e.g., Raad van State, 28.09.2012, 201205423/1/A3, available at http://www.raadvanstate.nl/uitspraken/zoeken_in_uitspraken/zoekresultaat/?verdict_id=Q%2BwiycihpIM%3D; several courts in Germany have also referred questions to the European Court of Justice about the validity of Regulation (EC) No 2252/2004.

  478.

    See, e.g., Vzr. Rb. Utrecht, 15.07.2011, LJN BR2009, available at http://zoeken.rechtspraak.nl/detailpage.aspx?ljn=br2009. In this decision in preliminary proceedings, the court did not consider the collection of fingerprints for a Dutch eID card unlawful. Several elements relied upon by the judge, however, such as the ‘(very) short period of storage’ of the fingerprints, seem incorrect in the light of other elements mentioned in the judgment (two fingerprints would also be centrally stored for a longer period); for a similar conclusion that the collection and storage are not unlawful, see Rechtbank ’s-Gravenhage, 23.3.2011, LJN BP8841, available at http://zoeken.rechtspraak.nl/resultpage.aspx?snelzoeken=true&searchtype=ljn&ljn=BP8841&u_ljn=BP8841. The court based its decision inter alia on several technical elements, e.g., the fact that some provisions of the Act allowing for biometric ePassports with central storage of fingerprints had not yet taken effect; see also Rb. Utrecht, 25.5.2012, LJN BW6545. But: see and compare with Decision 603 of 28.9.2005 of the Taiwanese Constitutional Court, referenced in Part II, Chap. 4, footnote 80.

  479.

    See T. Bourlai, A. Ross and A. Jain, ‘On Matching Digital Face Images Against Scanned Passport Photos’, in Proc. of First IEEE Intern. Conf. on Biometrics, Identity and Security, September 2009, p. 9, also available at http://www.cse.msu.edu/rgroups/biometrics/Publications/Face/BourlaiRossJain_BIDS2009.pdf (‘Bourlai, Ross and Jain, Matching Digital Face Images, 2009’).

  480.

    See Snijder, Crash of zachte landing, 2010. Several findings were confirmed in a later report commissioned by the Dutch Minister Donner: R. Bekker, Betreft: Onderzoek naar besluitvorming biometrie op reisdocumenten, 21.2.2012, Kamerstukken II 2011/12, 25 764, nr. 54, Annex, available at https://zoek.officielebekendmakingen.nl/blg-155412.html

  481.

    Article 4 b para. 2 and para. 3 Passport Act.

  482.

    Article 4b, 4 para. 2 and para. 4 Passport Act. From the text of the legislative provision, it is in our view not clear whether these two conditions are cumulative. This is, however, important. In case both conditions have to be fulfilled, the fingerprints can only be used for double-checking the identity of the suspect or criminal whose prints have already been taken, on the basis of the fingerprint data stored in the travel document administration. If the conditions are not cumulative, the fingerprints of potential suspects could be requested in order to verify whether they are involved in a particular crime. This could be done by a 1:n or a 1:1 check. The legal provision, however, does not clarify how the checks would be made, although this makes an important difference. In the first case, the biometric data are only used to verify the identity of persons arrested. In the second case, the database is used to identify suspects in the interest of criminal investigations. See also, in this context, the decision of the Supreme Court of the Netherlands, which earlier found no breach of Article 8 ECHR for a similar communication of a facial image (see Part II, Chap. 4, § 17).
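
    The difference between the two readings can be made concrete with a minimal sketch; the matcher, threshold and function names below are purely illustrative and not drawn from the Act or from any deployed system:

        from typing import Dict, Optional

        THRESHOLD = 0.8  # illustrative decision threshold

        def match_score(probe: bytes, reference: bytes) -> float:
            # Illustrative stand-in for a real fingerprint matcher: the fraction
            # of equal bytes. Real systems compare minutiae-based templates.
            if not probe or len(probe) != len(reference):
                return 0.0
            return sum(p == r for p, r in zip(probe, reference)) / len(probe)

        def verify(probe: bytes, enrolled: bytes) -> bool:
            # 1:1 check: is the arrested person the holder of this passport?
            return match_score(probe, enrolled) >= THRESHOLD

        def identify(probe: bytes, database: Dict[str, bytes]) -> Optional[str]:
            # 1:n check: search the whole travel document administration for a hit.
            best_id, best_score = None, 0.0
            for subject_id, reference in database.items():
                score = match_score(probe, reference)
                if score >= THRESHOLD and score > best_score:
                    best_id, best_score = subject_id, score
            return best_id

    Only the second function turns the travel document administration into a search tool for criminal investigations, which is precisely the question the wording of the provision leaves open.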

  483.

    See also Juels, Molnar and Wagner, E-Passports, 2005.

  484.

    See Snijder, Crash of zachte landing, 2010. For example, according to Snijder, 3 % of the documents would lead to a false rejection (FRR), which may, even according to Snijder, be a very optimistic estimate (pp. 119–120). For illustration, at that rate, roughly 30,000 out of every million checks would be false rejections.

  485.

    The ePassport was introduced by legislation, although several aspects still needed to be determined later.

  486.

    About the Identity Card Bill, see also LSE, Identity Project, 2005.

  487.

    T. Parker, ‘Are we protected? The Adequacy of Existing Legal Frameworks for Protecting Privacy in the Biometric Age’, Ethics and Policy of Biometrics, Lecture Notes in Computer Science, 2010, p. 45.

  488.

    See ‘Identity cards scheme will be axed “within 100 days”’, 27.05.2010, BBC News, available at http://news.bbc.co.uk/2/hi/8707355.stm; see also the Identity Documents Act 2010, available at http://www.legislation.gov.uk/ukpga/2010/40/contents/enacted (see Art. 1 in which the Identity Cards Act 2006 is repealed). About the Identity Cards Act 2006, see, e.g., C. Sullivan, ‘The United Kingdom Identity Cards Act 2006 – Civil or Criminal?’, International Journal of Law and Information Technology 2007, pp. 320–361.

  489.

    See, e.g., M. Snijder and J. Grijpink, ‘Twee jaar Paspoortwet: terug naar af?’, P&I 2011, pp. 142–144; X., Een biometrische database: een stap te ver?, Roundtable Commission internal affairs, Rathenau Instituut, Notes, 20.4.2011, 1 p., available at http://www.rathenau.nl/uploads/tx_tferathenau/Gespreksnotitie_biometrie_Tweede_Kamer_april_2011_-_Rathenau_Instituut.pdf; see also Snijder, Crash of zachte landing, 2010, 145 p., one of the two reports that brought the issues, in particular the insufficient quality, back to the attention of the public and policy makers, resulting in changed views among politicians. The other report was: V. Böhre, Happy Landings? Het Biometrische Paspoort als zwarte doos, Wetenschappelijke Raad voor het Regeringsbeleid, Webpublicatie nr. 46, 2010, 155 p.

  490.

    Letter to the Parliament of 26 April 2011 (2011/U51459), Kamerstukken II 2010/2011, 25 764, nr. 46, p. 5. In October 2012, a Bill was introduced to modify the Passport Act. See Kamerstukken II 2011/12, 33 440, nr. 2, Wijziging van de Paspoortwet (…).

  491.

    See also EDPS, EDPS comments on the Communication COM(2010) 311 final from the Commission to the European Parliament and the Council on the Use of Security Scanners at EU airports, 2 p., available at http://www.aedh.eu/plugins/fckeditor/userfiles/file/Protection%20des%20donn%C3%A9es%20personnelles/Commentaire%20EDPS.pdf

  492.

    See also the hearing of the candidate commissioner for Justice, Viviane Reding, on 12 January 2010, where she stated that fundamental rights and data protection are major concerns. See EU Parliament, Summary of hearing of Viviane Reding – Justice, fundamental rights and citizenship, Press release, 3 p., available at http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML+IM-PRESS+20100111IPR67125+0+DOC+PDF+V0//EN

  493.

    These criteria seem to have been inherent to a particular brand of scanners.

  494.

    We have no conclusive information as to what extent the images could be considered anonymous (i.e., not capable of being linked to (the identity of) a person). That the images could and should be anonymous might, however, have been an additional requirement, which would very much improve the privacy and data protection rights (if still applicable) of the persons concerned.

  495.

    See M. Vandersmissen, ‘Het maatschappelijk draagvlak voor bodyscanners groeit snel. Liever bloot dan dood’, De Standaard, 8–9 January 2010.

  496.

    These specifications seem to have been designed and further implemented by the manufacturer of a particular brand of scanners.

  497.

    This is relevant as the Commission plans to add body scanners to the list of EU-authorized methods for screening passengers. See EU Parliament, Transport, ‘Strict safeguards needed for airport body scanners, say MEPs’, 25.5.2011, available at http://www.europarl.europa.eu/en/pressroom/content/20110523IPR19946/html/Strict-safeguards-needed-for-airport-body-scanners-say-MEPs

  498.

    Quote after Benjamin Franklin (1706–1790), a politician and scientist who excelled in a significant number of disciplines and was one of the founders of the United States.

  499.

    See and compare also with the protection of identity in relation to misuse of someone’s identity, for example on SNS. Case law is emerging, including in Belgium.

  500.

    See also the lengthy discussions in the Netherlands in relation to the modification of the legislation on the identification obligation in relation to police and law enforcement authorities. See, e.g., the letter of the Ministry of Justice to the Queen, legislative services, of 17.09.2003, p. 2, also available at http://www.identificatieplicht.nl/5245796

  501.

    Müller and Kindt, Model implementation, Fidis, D.3.14, 2009, p. 30.

  502.

    Ibid., p. 35.
