The Challenges of Risk Profiling Used by Law Enforcement: Examining the Cases of COMPAS and SyRI

Regulating New Technologies in Uncertain Times

Part of the book series: Information Technology and Law Series ((ITLS,volume 32))

Abstract

The use of Big Data in the law enforcement sector turns the traditional practice of profiling, whether to search for suspects or to determine the threat level a suspect poses, into a data-driven process. Risk profiling is frequently used in the USA and is becoming more prominent in national law enforcement practices in Member States of the European Union. While risk profiling creates challenges that differ per jurisdiction and vary with the purpose for which the profiling is deployed, this technological development brings fundamental changes that are largely universal. Risk profiling of suspects, or of large parts of the population in order to detect suspects, raises challenges of transparency and discrimination and puts pressure on procedural safeguards. After exploring the concept of risk profiling, this chapter discusses these fundamental challenges, using two main examples of risk profiling to illustrate them: COMPAS and SyRI.
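
To make the abstract's notion of data-driven risk profiling concrete, the sketch below shows a deliberately simplified, hypothetical risk scorer in Python. It is not COMPAS, SyRI, or any tool discussed in the chapter; the feature names, weights, and thresholds are invented for illustration only. The point is to show why keeping such weights proprietary frustrates transparency, and how a seemingly neutral feature can act as a proxy for protected attributes.

    # Hypothetical illustration only: a toy logistic scorer mapping case
    # features to a risk band. All names and weights are invented; real
    # tools such as COMPAS keep their inputs and weights proprietary.
    import math

    WEIGHTS = {
        "prior_arrests": 0.45,
        "age_at_first_contact": -0.03,
        "unstable_housing": 0.60,
        "neighbourhood_crime_rate": 0.25,  # may proxy for protected attributes
    }
    BIAS = -2.0

    def risk_band(features):
        """Return 'low', 'medium' or 'high' from a logistic score in [0, 1]."""
        z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
        p = 1.0 / (1.0 + math.exp(-z))
        return "low" if p < 0.3 else "medium" if p < 0.7 else "high"

    # Example: a defendant with 3 prior arrests scores 'medium' here, but
    # without access to WEIGHTS there is no way to contest that outcome.
    print(risk_band({"prior_arrests": 3, "age_at_first_contact": 19,
                     "unstable_housing": 1, "neighbourhood_crime_rate": 4.2}))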


Notes

  1. Broeders et al. 2017; Joh 2016, p. 16.
  2. Zouave and Marquenie 2017.
  3. Fundamental Rights Agency 2018.
  4. Fundamental Rights Agency 2018.
  5. Mittelstadt et al. 2016.
  6. Ferguson 2018; Pasquale 2015.
  7. Marks et al. 2017.
  8. For example, predictive policing software was first introduced in the USA before it was used in European countries.
  9. Brayne et al. 2015; Christin et al. 2015; AI Now Institute 2018.
  10. Hildebrandt 2008, p. 23.
  11. Hildebrandt 2008, p. 19.
  12. Zarsky 2014; Keats Citron and Pasquale 2014.
  13. Zarsky 2014; Keats Citron and Pasquale 2014.
  14. Swedloff 2014.
  15. O’Neil 2016.
  16. Van Brakel 2016.
  17. Mittelstadt et al. 2016.
  18. Clavell 2016.
  19. Clavell 2016.
  20. Van Brakel 2016.
  21. With the use of PredPol software.
  22. Besluit SUWI, Staatsblad 2014, 320. Available only in Dutch at: https://zoek.officielebekendmakingen.nl/stb-2014-320.html. Last accessed 30 September 2018.
  23. Besluit SUWI, Staatsblad 2014, 320. Available only in Dutch at: https://zoek.officielebekendmakingen.nl/stb-2014-320.html. Last accessed 30 September 2018.
  24. Besluit SUWI, Staatsblad 2014, 320. Available only in Dutch at: https://zoek.officielebekendmakingen.nl/stb-2014-320.html. Last accessed 30 September 2018.
  25. Information about the pending court case in English is available at: https://pilpnjcm.nl/en/dossiers/profiling-and-syri/. Last accessed 30 September 2018.
  26. Information about the pending court case in English is available at: https://pilpnjcm.nl/en/dossiers/profiling-and-syri/. Last accessed 30 September 2018.
  27. Robinson 2017.
  28. Brkan 2017.
  29. Ferguson 2016.
  30. Angwin et al. 2016.
  31. The issues raised in the petition are: (1) whether it is a violation of a defendant’s constitutional right to due process for a trial court to rely on the risk assessment results provided by a proprietary risk assessment instrument such as the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) at sentencing, because the proprietary nature of COMPAS prevents a defendant from challenging the accuracy and scientific validity of the risk assessment; and (2) whether it is a violation of a defendant’s constitutional right to due process for a trial court to rely on such risk assessment results at sentencing because COMPAS assessments take gender and race into account in formulating the risk assessment.
  32. Loomis v. Wisconsin, docket no. 16-6387, available at: http://www.scotusblog.com/case-files/cases/loomis-v-wisconsin/. Last accessed 30 September 2018.
  33. Dressel and Farid 2018.
  34. Angwin et al. 2016. Together with their report, the ProPublica researchers made several files publicly available, such as a list of the factors that COMPAS uses in scoring.
  35. Hildebrandt 2008, pp. 21–22.
  36. Hildebrandt 2008, pp. 21–22.
  37. Hildebrandt and Koops 2010.
  38. Rauhofer 2008.
  39. Leese 2014.
  40. Mittelstadt et al. 2016.
  41. Ferguson 2018; Brinkhoff 2017, p. 68.
  42. Koops 2009.
  43. Koops 2009.
  44. Brinkhoff 2017, p. 68.
  45. Ferguson 2015; Broeders et al. 2017.
  46. Ferguson 2015.
  47. Ferguson 2015; Broeders et al. 2017.
  48. Simmons 2016.
  49. Hildebrandt and Koops 2010.
  50. Taylor et al. 2017; Mantelero 2016.
  51. Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119/89.
  52. Brkan 2017.
  53. Rauhofer 2008, p. 192.
  54. Koops 2009.
  55. Ferguson 2018; Leese 2014.
  56. Leese 2014.
  57. Angwin et al. 2016.
  58. Angwin et al. 2016.
  59. Dressel and Farid 2018.
  60. Data & Society 2015.
  61. Leese 2014.
  62. Leese 2014.
  63. Kosta 2017.
  64. Such as the court case pertaining to SyRI in the Netherlands.

References

  • AI Now Institute (2018) Litigating algorithms: Challenging government use of algorithmic decision systems. https://ainowinstitute.org/litigatingalgorithms.pdf. Last accessed 30 September 2018

  • Angwin J, Larson J, Mattu S, Kirchner L (2016) Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. Last accessed 30 September 2018

  • Brayne S, Rosenblat A, Boyd D (2015) Predictive Policing. https://datacivilrights.org/2015/. Last accessed 30 September 2018

  • Brinkhoff S (2017) Big Data Data Mining by the Dutch Police: Criteria for a Future Method of Investigation. European Journal for Security Research 2:57–69

  • Brkan M (2017) Do algorithms rule the world? Algorithmic decision-making in the framework of the GDPR and beyond. https://ssrn.com/abstract=3124901. Last accessed 30 September 2018

  • Broeders D, Schrijvers E, Hirsch Ballin E (2017) Big Data and Security Policies: Serving Security, Protecting Freedom. WRR-Policy Brief. https://english.wrr.nl/publications/policy-briefs/2017/01/31/big-data-and-security-policies-serving-security-protecting-freedom. Last accessed 30 September 2018

  • Christin A, Rosenblat A, Boyd D (2015) Courts and Predictive Algorithms. https://datacivilrights.org/2015/. Last accessed 30 September 2018

  • Clavell GG (2016) Policing, Big Data and the Commodification of Security. In: Van der Sloot B et al. (eds) Exploring the Boundaries of Big Data. Amsterdam University Press, Amsterdam, pp 89–116

  • Data & Society (2015) Data & Civil Rights: A New Era of Policing and Justice. http://www.datacivilrights.org/pubs/2015-1027/executive_summary.pdf. Last accessed 30 September 2018

  • Dressel J, Farid H (2018) The accuracy, fairness, and limits of predicting recidivism. Science Advances 4:eaao5580

  • Ferguson A (2015) Big Data and predictive reasonable suspicion. University of Pennsylvania Law Review 163:327–410

  • Ferguson A (2016) Predictive Prosecution. Wake Forest Law Review 51:705–744

  • Ferguson A (2018) Illuminating Black Data Policing. Ohio State Journal of Criminal Law 15:503–525

  • Fundamental Rights Agency (2018) Big Data: Discrimination in data-supported decision making. http://fra.europa.eu/en/publication/2018/big-data-discrimination. Last accessed 30 September 2018

  • Hildebrandt M (2008) Defining Profiling: A New Type of Knowledge? In: Hildebrandt M, Gutwirth S (eds) Profiling the European Citizen. Springer, Dordrecht, pp 17–45

  • Hildebrandt M, Koops EJ (2010) The Challenges of Ambient Law and Legal Protection in the Profiling Era. Modern Law Review 73:428–460

  • Joh EE (2016) The New Surveillance Discretion: Automated Suspicion, Big Data, and Policing. Harvard Law & Policy Review 10:15–42

  • Keats Citron D, Pasquale F (2014) The Scored Society: Due Process for Automated Predictions. Washington Law Review 89:1–33

  • Koops EJ (2009) Technology and the Crime Society: Rethinking Legal Protection. Law Innovation and Technology 1:93–124

  • Kosta E (2017) Surveilling Masses and Unveiling Human Rights - Uneasy Choices for the Strasbourg Court. Tilburg Law School Research Paper No. 2018-10. https://ssrn.com/abstract=3167723. Last accessed 30 September 2018

  • Leese M (2014) The new profiling: Algorithms, black boxes, and the failure of anti-discriminatory safeguards in the European Union. Security Dialogue 45:494–511

  • Mantelero A (2016) Personal data for decisional purposes in the age of analytics: From an individual to a collective dimension of data protection. Computer Law & Security Review 32:238–255

  • Marks A, Bowling B, Keenan C (2017) Automatic justice? Technology, Crime and Social Control. In: Brownsword R, Scotford E, Yeung K (eds) The Oxford Handbook of the Law and Regulation of Technology. Oxford University Press, Oxford, pp 705–730

  • Mittelstadt BD et al (2016) The ethics of algorithms: Mapping the debate. Big Data & Society 3:1–21

  • O’Neil C (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Publishers, New York

  • Pasquale F (2015) The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, Cambridge

  • Rauhofer J (2008) Privacy is dead, get over it! Information privacy and the dream of a risk-free society. Information & Communications Technology Law 17:185–197

  • Robinson D (2017) The Challenges of Prediction: Lessons from Criminal Justice. I/S: A Journal of Law and Policy for the Information Society. https://ssrn.com/abstract=3054115. Last accessed 30 September 2018

  • Simmons R (2016) Quantifying Criminal Procedure: How to Unlock the Potential of Big Data in our Criminal Justice System. Michigan State Law Review 2016:947–1017

  • Swedloff R (2014) Risk Classification’s Big Data (R)evolution. Connecticut Insurance Law Journal 21:339–373

  • Taylor L, Floridi L, Van der Sloot B (eds) (2017) Group Privacy: New Challenges of Data Technologies. Springer, Dordrecht

  • Van Brakel R (2016) Pre-Emptive Big Data Surveillance and its (Dis)Empowering Consequences: The Case of Predictive Policing. In: Van der Sloot B et al. (eds) Exploring the Boundaries of Big Data. Amsterdam University Press, Amsterdam, pp 117–141

  • Zarsky T (2014) Understanding Discrimination in the Scored Society. Washington Law Review 89:1375–1412

  • Zouave ET, Marquenie T (2017) An Inconvenient Truth: Algorithmic Transparency & Accountability in Criminal Intelligence Profiling. European Intelligence and Security Informatics Conference. https://ieeexplore.ieee.org/document/8240764. Last accessed 30 September 2018

Author information
Corresponding author

Correspondence to Sascha van Schendel.


Copyright information

© 2019 T.M.C. Asser Press and the authors

About this chapter

Cite this chapter

van Schendel, S. (2019). The Challenges of Risk Profiling Used by Law Enforcement: Examining the Cases of COMPAS and SyRI. In: Reins, L. (eds) Regulating New Technologies in Uncertain Times. Information Technology and Law Series, vol 32. T.M.C. Asser Press, The Hague. https://doi.org/10.1007/978-94-6265-279-8_12

  • DOI: https://doi.org/10.1007/978-94-6265-279-8_12

  • Publisher Name: T.M.C. Asser Press, The Hague

  • Print ISBN: 978-94-6265-278-1

  • Online ISBN: 978-94-6265-279-8

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
