Autonomy, Liberty and Privacy

Face Recognition Technology

Part of the book series: Law, Governance and Technology Series ((LGTS,volume 41))

Abstract

General ignorance of the range, kinds and implications of FRT use raises ethical and legal questions. Some general points may be widely known: for instance, that FRT identifies individuals by converting their facial features into digital data and comparing that real-time or recorded data with images stored in databases. The stored images have usually been harvested from individuals who supplied an identity photograph, such as for a passport, driving licence or travel pass. So far, so good. But how far does the average citizen understand that, whilst this use of identity photographs can facilitate the application process, the photograph also becomes an integral part of a document containing other information, and that the resulting database may be ethically problematic? This is especially true when the database is used or accessed covertly, without explicit consent, by agencies not directly associated with the primary purpose of the photograph. This chapter therefore examines how autonomy, liberty and privacy are affected by FRT, and presents some elements of an ethical framework from which FRT's impact on autonomy, liberty and privacy can be assessed.


Notes

  1. EFF.

  2. Federal Bureau of Investigation: Next Generation Identification (NGI).

  3. Integrated Automated Fingerprint Identification System.

  4. EFF op cit.

  5. Edgar (2017).

  6. Chesterman (2011), p. 145, citing ICO and Parliamentary reports.

  7. Ibid, p. 150.

  8. Solove (2011), p. 180, citing Gill and Spriggs (2005).

  9. Ibid.

  10. Dworkin (1988, reprinted 1997).

  11. Kant (1781).

  12. Mill (1859).

  13. Dworkin (1988), p. 17, op cit.

  14. Frankfurt (1971).

  15. Dworkin (1988), pp. 17, 20–21, op cit.

  16. See Waldron (2005).

  17. Heteronomy: the opposite of autonomy.

  18. Waldron (2005), p. 307 (my italics).

  19. Lawrence (2005), p. 136.

  20. Nussbaum (1995), p. 257.

  21. Garfinkel (2001), p. 65.

  22. Pink (2011), pp. 541–563.

  23. Christman (2018).

  24. Berlin (1958).

  25. Lane (2007).

  26. Ibid.

  27. For instance, the GDPR. But whether the GDPR alleviates these uncertainties remains to be seen, especially after, for example, the debacle over the Facebook app that Cambridge Analytica (the UK-based data analytics consultancy) devised to unlawfully harvest data from Facebook account holders.

  28. Kafka (1925).

  29. Ibid.

  30. Ibid, pp. 1–3.

  31. Hague (2011).

  32. Foucault (1977), p. 213. The King was Louis XIV, who ruled France from 1643 to 1715.

  33. Ibid, p. 214.

  34. Ibid, p. 214.

  35. In the above ways the Panopticon can act as a laboratory, a machine that "carries out experiments, to alter behaviour, to train or correct individuals" (Foucault 1977, p. 203). Thus the Panopticon serves not only as a vehicle for carrying out tasks but also as a disciplinary apparatus.

  36. Dixon (2010). See Sect. 6.7.

  37. See Bloustein (1964), Fried (1970), Inness (1992) and Rachels (1975).

  38. See MacKinnon (1989).

  39. Gavison (1980), pp. 421, 471.

  40. Ibid, p. 423.

  41. Ibid, p. 424.

  42. Solove (2011).

  43. Assuming there is no return-to-sender address on the back of the envelope.

  44. Nissenbaum (2010).

  45. Berle (2011), pp. 43–44.

  46. Gavison (1980), p. 436.

  47. Electronic Privacy Information Center (EPIC).

  48. Wacks (1989, revised 1993), p. 20.

  49. Warren and Brandeis (1890).

  50. Berlin (1958).

  51. Wacks, R., op cit, p. 14.

  52. Campbell v MGN Ltd.

  53. Marshall (2009).

  54. Ibid, p. 52.

  55. Ibid, cited by Marshall (2009).

  56. Ibid, p. 52.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Berle, I. (2020). Autonomy, Liberty and Privacy. In: Face Recognition Technology. Law, Governance and Technology Series, vol 41. Springer, Cham. https://doi.org/10.1007/978-3-030-36887-6_5

  • DOI: https://doi.org/10.1007/978-3-030-36887-6_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36886-9

  • Online ISBN: 978-3-030-36887-6

  • eBook Packages: Law and Criminology (R0)
