Technocratic Automation and Contemplative Overlays in Artificially Intelligent Criminal Sentencing

Chapter in Co-Designing Economies in Transition

Abstract

In this chapter, Butler explores potential realities of technocratic automation at the intersection of criminal sentencing, artificial intelligence (AI), and race. The chapter begins with a synopsis of the role automation plays in technocratic electronic governance. It then moves to demonstrate how the implementation of automation has adversely affected Black communities. Butler then illustrates how AI is currently outpacing human performance, implying that soon in the realm of criminal sentencing, artificially intelligent judges will emerge, outperforming and eventually replacing human judges. Next, he applies the lens of race to outline how current concepts of artificial cognitive architectures merely reiterate oppressive racial biases. The chapter concludes by imagining how contemplative overlays might be applied to artificial cognitive architectures to allow for more mindful and just sentencing.

Notes

  1.

    Blackness is a term I choose over African American, mainly because the fact that American carries a disclaimer for Black bodies is problematic for me. Some Black folks prefer the term African American. I do not. If I have to place a caveat in front of my American identity, then it somehow demonstrates that I am not fully American. You do not see the term White American. There is not even the term European American. The assumption normally is that if you are American then you are white, and vice versa.

  2.

    Certain laws already have minimum punishments or automatic punitive actions.

  3.

    Path dependence is essentially the use of long-term implementation to test and determine the accuracy of a newly implemented governmental policy. It justifies keeping policies and procedures in place because of the need to determine longitudinal efficiency, often overlooking initial setbacks while applauding early-stage success. Fountain, On the Effects of e-Government, 473.

  4.

    A theoretical sketch of the added layers associated with laws becoming digital ontologies might assume this structure (a brief illustrative sketch of this layering follows these notes):

      • Environment: where everything happens.
      • People: those who are governed and who live in the environment.
      • Data: raw information from real-world interactions between people, other people, and the environment.
      • Technical specialists: those who process the data.
      • Democratic process: if this is the structure of the government, it includes the legislation process (legislators, voters, etc.).
      • Programmers: writers of code.
      • Hardware: components, run on previously written software, that allow for the creation of new code to write new software geared toward legislation.
      • Storage: multiple hardware units that, together, maintain the relationship between data, hardware, and software for the continual running of the system.
      • Code: in the specific case of digital ontologies, the logic used to run contingency models (based on the processes created by technical specialists, though programmers can also serve as technical specialists) through the structure of a particular programming language, creating a functioning software program used to determine outcomes and the implementation of laws.
      • Media: websites, phones, digital applications, etc.
      • People, again, at the receiving end.

    It could be argued that these structures already mimic previous governmental modes of layering (legislation process, paper, storage, people), but the added layering of technology, technical experts, and technologically mediated storage units (which can be backed up in a cloud, another layer altogether) makes the system incredibly more buttressed. This is admittedly a very linear approach. It does not begin to include the added variables of the inverse or parallel processing of the order I have created or, most importantly, the invariable ways these layers can be bypassed. For example, the communality of personhood needs to be accounted for: someone who is governed can influence another person in the order (a legislator, technical specialist, or programmer), or may or may not have the ability to take on one of those roles, depending on socioeconomic status or other marginalizations.

  5.

    Some might argue that religion or reason-based morality were the bases for law creation and principles of castigation. But the emotions fostered by the acceptance of either religious dogma or reason as normative helped determine the severity of punishment. They also helped determine priorities based on perceptions of vulnerability, privilege, in-group, and out-group.

  6.

    Neutral emotion falls into two categories. The first is non-reactive emotionality, which refers to a state of calm (often experienced through a spiritual practice such as mindfulness, the Jesus prayer, or compassion practice). The second manifests as individual emotional homeostasis, in which an individual is neither particularly aroused nor calm. Although emotions are involuntary electro-biochemical responses and reactions, a neutral emotion is not neutral because it is uninfluenced by outside stimuli; it is neutral because the individual perceives a state of emotional equilibrium.

  7.

    This includes the trying of Black child defendants as adults, a 28-year operation that sold Black teen defendants into prison, and the utter disregard for the personhood of Black defendants in judging, in policing, and in the projection of certain images in society by social elites (for example, "super predators" by Hillary Clinton, "animals" by scientific racists).
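
The layering described in note 4 can be made easier to see with a small data-structure model. The sketch below is a minimal illustration only, written in Python under stated assumptions: the class and method names (Layer, DigitalLawOntology, trace) and the strictly linear ordering are hypothetical additions, not the author's, and, as the note itself stresses, real layers can run in parallel or be bypassed entirely.

    # Illustrative sketch only: a hypothetical, simplified model of the layered
    # structure described in note 4, in which a law rendered as a digital ontology
    # passes through successive mediating layers before reaching the people it governs.
    # All names and the linear ordering are assumptions made for illustration.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Layer:
        name: str         # e.g. "Programmers", "Code", "Media"
        description: str  # the mediating role the layer plays

    @dataclass
    class DigitalLawOntology:
        statute: str                                       # the legislated rule being encoded
        layers: List[Layer] = field(default_factory=list)  # ordered mediating layers

        def trace(self) -> List[str]:
            """Return the linear path the rule takes through each mediating layer."""
            return [f"{self.statute} -> {layer.name}" for layer in self.layers]

    # The deliberately linear ordering given in note 4.
    ontology = DigitalLawOntology(
        statute="hypothetical sentencing statute",
        layers=[
            Layer("Environment", "where everything happens"),
            Layer("People", "those governed, living in the environment"),
            Layer("Data", "raw records of interactions between people and environment"),
            Layer("Technical specialists", "process the data"),
            Layer("Democratic process", "legislators, voters, the legislation process"),
            Layer("Programmers", "writers of code"),
            Layer("Hardware", "runs existing software used to build new legislative software"),
            Layer("Storage", "maintains the data/hardware/software relationship"),
            Layer("Code", "contingency logic that determines outcomes and implementation"),
            Layer("Media", "websites, phones, digital applications"),
            Layer("People", "those governed, again, at the receiving end"),
        ],
    )

    for step in ontology.trace():
        print(step)

Running the sketch simply prints each layer the hypothetical statute passes through; the point is only to make the note's "buttressed" layering visible, not to model any real system.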

References

  • Abrams, A. I., & Siegel, L. M. (1978). The transcendental meditation program and rehabilitation at Folsom State Prison: A cross-validation study. Correctional Psychologist, 5(1), 3–20.

  • Alexander, M. (2012). The new Jim Crow: Mass incarceration in the age of colorblindness. New York, NY: The New Press.

  • Astin, J. A. (1997). Stress reduction through mindfulness meditation. Psychotherapy and Psychosomatics, 66(2), 97–106.

  • Bonnet, J., Subsoontorn, P., & Endy, D. (2012). Rewritable digital data storage in live cells via engineered control of recombination directionality. Proceedings of the National Academy of Sciences, 109(23), 8884–8889.

  • Chen, M., Mao, S., & Liu, Y. (2014). Big data: A survey. Mobile Networks and Applications, 19(2), 171–209.

  • Crowder, J. A., Carbone, J. N., & Friess, S. A. (2014). Artificial cognition architectures. New York, NY: Springer.

  • Deane, G. (2013). Technological unemployment: Panacea or poison? Institute for Ethics and Emerging Technologies. Retrieved from https://ieet.org/index.php/IEET2/more/7303

  • Fountain, J. E. (2014). On the effects of e-government on political institutions. In D. L. Kleinman & K. Moore (Eds.), Routledge handbook of science, technology, and society (pp. 462–478). Florence, KY: Routledge.

  • Haimerl, C. J., & Valentine, E. R. (2001). The effect of contemplative practice on intrapersonal, interpersonal, and transpersonal dimensions of the self-concept. Journal of Transpersonal Psychology, 33(1), 37–52.

  • Hanley, C. P., & Spates, J. L. (1978). Transcendental meditation and social psychological attitudes. The Journal of Psychology, 99(2), 121–127.

  • Hjelle, L. A. (1974). Transcendental meditation and psychological health. Perceptual and Motor Skills, 39(1), 623–628.

  • Leibo, J. Z., Zambaldi, V., Lanctot, M., Marecki, J., & Graepel, T. (2017). Multi-agent reinforcement learning in sequential social dilemmas. arXiv preprint arXiv:1702.03037.

  • Lesh, T. V. (1970). Zen meditation and the development of empathy in counselors. Journal of Humanistic Psychology, 10, 37–74.

  • Marchant, G. E., Stevens, Y. A., & Hennessy, J. M. (2014). Technology, unemployment & policy options: Navigating the transition to a better world. Journal of Evolution and Technology, 24(1), 26–44.

  • Mettler, S., & Soss, J. (2004). The consequences of public policy for democratic citizenship: Bridging policy studies and mass politics. Perspectives on Politics, 2(1), 55–73.

  • Pelletier, K. R. (1974). Influence of transcendental meditation upon autokinetic perception. Perceptual and Motor Skills, 39, 1031–1034.

  • Penner, W. J., Zingle, H. W., Dyck, R., & Truch, S. (1974). Does an in-depth transcendental meditation course effect change in the personalities of the participants. Western Psychologist, 4, 104–111.

  • Smith, A. (2016). Public predictions of the future of workforce automation. Washington, DC: Pew Research Center. Retrieved from http://goo.gl/1L2QSk

  • Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational politics. First Monday, 19(7). Retrieved from http://goo.gl/TXRr5D

  • Walshe, S. (2013). How US prison labour pads corporate profits at taxpayers’ expense. The Guardian. Retrieved from http://goo.gl/Ie3YIk

  • Weller, C. (2016). The world’s first artificially intelligent lawyer was just hired at a law firm. Business Insider. Retrieved from http://goo.gl/GhZHFr

  • Wiseman, O. (2017). The AI with an intuition: “AlphaGo” has defeated the human champion of the world’s most complex game by speculating about the future. Prime Mind. Retrieved from http://goo.gl/ugGw5l

Copyright information

© 2018 The Author(s)

About this chapter

Cite this chapter

Butler, P. (2018). Technocratic Automation and Contemplative Overlays in Artificially Intelligent Criminal Sentencing. In: Giorgino, V., Walsh, Z. (eds) Co-Designing Economies in Transition. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-66592-4_14

  • DOI: https://doi.org/10.1007/978-3-319-66592-4_14

  • Published:

  • Publisher Name: Palgrave Macmillan, Cham

  • Print ISBN: 978-3-319-66591-7

  • Online ISBN: 978-3-319-66592-4
