A growing literature takes an institutionalist and governance perspective on how algorithms shape society through their unprecedented capacity for managing social complexity. Algorithmic governance thereby emerges as a novel and distinctive kind of societal steering. It appears to transcend established categories and modes of governance, and thus seems to call for new ways of thinking about how social relations can be regulated and ordered. This paper argues, however, that despite its novel way of realizing outcomes of collective steering and coordination, algorithmic governance can nevertheless be grasped with an old and fundamental figure of political philosophy: Thomas Hobbes’ Leviathan. Comparing algorithmic governance with this figure highlights their similarities as socio-political arrangements and, specifically, clarifies how algorithmic governance parallels the apolitical traits of the Leviathan: it eliminates the political because it demands compliance and the renunciation of contestation in order to best fulfill its role and produce satisfying outcomes.
Algorithmic governance can take various forms. Yeung (2017b) has provided a taxonomy of designs that vary along three dimensions: standard-setting, monitoring, and sanctioning. The most potent designs combine flexible, dynamic standard-setting with pre-emptive monitoring and operation.
It should be noted that this does not mean that all outputs—e.g., information, suggestions, decisions—are adapted to the particularities of every individual, but rather to features or group traits that individuals share with others.
Rahwan (2017) has touched upon this figure in thinking about what a social contract governing algorithmic systems could look like. In the following, a different perspective is chosen, one that examines how algorithmic governance itself amounts to a sort of social contract.
The Chinese example of the Social Credit System, however, also shows that this can be turned around: behavior and decisions can be shaped through the fear of losing social status and access to various public or commercial offers and services.
Moreover, algorithmic systems process inputs in the form of data that cannot speak for itself—it must be processed based on selection rules and decisions about what counts as relevant; and the specific ways of selecting, categorizing, and making distinctions based on available information always impose a certain way of seeing (Ananny and Crawford 2016; Mittelstadt and Floridi 2016; Floridi 2012).
Ananny, M., & Crawford, K. (2016). Seeing without knowing: limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, online first.
Arendt, H. (1998). The human condition (2nd ed.). Chicago: University of Chicago Press.
Arthur, B. W. (2011). The second economy. McKinsey Quarterly, 2011, 3, 1–3, 9.
Barber, B. (2003). Strong democracy: participatory politics for a new age. Berkeley: University of California Press.
Baruh, L., & Popescu, M. (2017). Big data analytics and the limits of privacy self-management. New Media & Society, 19(4), 579–596.
Bauman, Z. (2017). Retrotopia. Cambridge: Polity.
Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 985–1002.
Bennett, W. L., & Iyengar, S. (2008). A new era of minimal effects? The changing foundations of political communication. Journal of Communication, 58(4), 707–731.
Berk, R., Heidari, H., Jabbari, S., Kearns, M., & Roth, A. (2018). Fairness in criminal justice risk assessments: the state of the art. Sociological Methods & Research, online first.
Bimber, B. (2014). Digital Media in the Obama Campaigns of 2008 and 2012: adaptation to the personalized political communication environment. Journal of Information Technology & Politics, 11(2), 130–150.
Brandimarte, L., & Acquisti, A. (2012). The economics of privacy. In M. Peitz & J. Waldfogel (Eds.), The Oxford handbook of the digital economy (pp. 547–571). New York: Oxford University Press.
Brauneis, R., & Goodman, E. P. (2017). Algorithmic transparency for the smart city. SSRN Electronic Journal. https://www.ssrn.com/abstract=3012499 (Accessed May 16, 2018).
Bucher, T. (2012). Want to be on top? Algorithmic power and the threat of invisibility on Facebook. Culture Machine, 13, 1–13.
Chen, Y.-C., & Hsieh, T.-C. (2014). Big data for digital government: opportunities, challenges, and strategies. International Journal of Public Administration in the Digital Age, 1(1), 1–14.
Clarke, A., & Margetts, H. (2014). Governments and citizens getting to know each other? Open, closed, and big data in public management reform. Policy & Internet, 6(4), 393–417.
Coletta, C., & Kitchin, R. (2017). Algorhythmic governance: regulating the “heartbeat” of a city using the Internet of things. Big Data & Society, 4(2).
Curry, E. (2016). The big data value chain: definitions, concepts, and theoretical approaches. In J. Cavanillas, E. Curry, & W. Wahlster (Eds.), New horizons for a data-driven economy (pp. 29–37). Cham: Springer International Publishing. http://link.springer.com/10.1007/978-3-319-21569-3_3 (Accessed January 31, 2017).
Dahlberg, L. (2007). Rethinking the fragmentation of the cyberpublic: from consensus to contestation. New Media & Society, 9(5), 827–847.
Danaher, J. (2016). The threat of algocracy: reality, resistance and accommodation. Philosophy & Technology, 29(3), 245–268.
Dee, M. (2013). Welfare surveillance, income management and new paternalism in Australia. Surveillance & Society, 11(3), 272–286.
van Dijck, J. (2013). Facebook and the engineering of connectivity: a multi-layered approach to social media platforms. Convergence: The International Journal of Research into New Media Technologies, 19(2), 141–155.
Dunleavy, P. (2016). “Big data” and policy learning. In G. Stoker & M. Evans (Eds.), Evidence-based policy making in the social sciences: methods that matter. Bristol Chicago, IL: Policy Press.
Dylko, I. B., Beam, M. A., Landreville, K. D., & Geidner, N. (2012). Filtering 2008 US presidential election news on YouTube by elites and nonelites: an examination of the democratizing potential of the internet. New Media & Society, 14(5), 832–849.
Floridi, L. (2012). Big data and their epistemological challenge. Philosophy & Technology, 25(4), 435–437.
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: essays on communication, materiality, and society (pp. 167–194). Cambridge, Massachusetts: The MIT Press.
Helbing, D. (2015). Thinking ahead: essays on big data, digital revolution, and participatory market society. Cham: Springer International Publishing. http://link.springer.com/10.1007/978-3-319-15078-9 (Accessed July 27, 2015).
Hersh, E. (2015). Hacking the electorate: how campaigns perceive voters. New York, NY: Cambridge University Press.
Hildebrandt, M. (2008). Defining profiling: a new type of knowledge? In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European citizen (pp. 17–45). Dordrecht: Springer Netherlands. http://link.springer.com/10.1007/978-1-4020-6914-7_2 (Accessed January 31, 2017).
Hildebrandt, M. (2016). Law as information in the era of data-driven agency. The Modern Law Review, 79(1), 1–30.
Hobbes, T. (1909). Hobbes’s leviathan: reprinted from the edition of 1651. Oxford: Clarendon Press. https://archive.org/details/hobbessleviathan00hobbuoft.
Hofmann, J., Katzenbach, C., & Gollatz, K. (2017). Between coordination and regulation: finding the governance in Internet governance. New Media & Society, 19(9), 1406–1423.
Hood, C., & Margetts, H. (2007). The tools of government in the digital age. Basingstoke: Palgrave Macmillan.
van den Hoven, J. (2005). E-democracy, E-contestation and the monitorial citizen*. Ethics and Information Technology, 7(2), 51–59.
John, P. (2016). Behavioral approaches: how nudges lead to more intelligent policy design. In B. Guy Peters & P. Zittoun (Eds.), Contemporary approaches to public policy: theories, controversies and perspectives, vol., International series on public policy (pp. 113–131). London: Palgrave Macmillan.
Just, N., & Latzer, M. (2017). Governance by algorithms: reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258.
Kitchin, R. (2014a). Big data, new epistemologies and paradigm shifts. Big Data & Society, 1(1). http://bds.sagepub.com/lookup/doi/10.1177/2053951714528481 (Accessed May 25, 2016).
Kitchin, R. (2014b). The real-time city? Big data and smart urbanism. GeoJournal, 79(1), 1–14.
Kratochwil, F. (2013). Communication, Niklas Luhmann, and the Fragmentation Debate in International Law. In R. J. Beck (Ed.), Law and disciplinarity: thinking beyond borders, vol., International law, crime and politics (pp. 257–288). New York, NY: Palgrave Macmillan.
de Laat, P. B. (2017). Algorithmic decision-making based on machine learning from big data: can transparency restore accountability? Philosophy & Technology. http://link.springer.com/10.1007/s13347-017-0293-z (Accessed June 1, 2018).
Lambin, J.-J. (2014). A digital and networking economy. In Rethinking the market economy (pp. 147–163). London: Palgrave Macmillan UK. http://link.springer.com/10.1057/9781137392916_8 (Accessed October 7, 2016).
Leese, M. (2014). The new profiling: Algorithms, black boxes, and the failure of anti-discriminatory safeguards in the European Union. Security Dialogue, 45(5), 494–511.
Lepri, B., Oliver, N., Letouzé, E., Pentland, A., & Vinck, P. (2018). Fair, transparent, and accountable algorithmic decision-making processes: the premise, the proposed solutions, and the open challenges. Philosophy & Technology, 31(4), 611–627.
Lessig, L. (2002). Code: and other laws of cyberspace (reprint ed.). New York: The Perseus Books Group.
Leszczynski, A. (2016). Speculative futures: cities, data, and governance beyond smart urbanism. Environment and Planning A: Economy and Space, 48(9), 1691–1708.
Linders, D. (2012). From e-government to we-government: defining a typology for citizen coproduction in the age of social media. Government Information Quarterly, 29(4), 446–454.
Lyon, D. (2003). Surveillance as social sorting: computer codes and mobile bodies. In D. Lyon (Ed.), Surveillance as social sorting: privacy, risk, and digital discrimination (pp. 13–30). London; New York: Routledge.
Mackenzie, A. (2013). Programming subjects in the regime of anticipation: Software studies and subjectivity. Subjectivity, 6(4), 391–405.
Margetts, H., & Dunleavy, P. (2013). The second wave of digital-era governance: a quasi-paradigm for government on the Web. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 371(1987), 20120382–20120382.
Marx, K., & Engels, F. (1962). Marx / Engels: Werke: Band 20: Anti-Dühring - Dialektik der Natur. Berlin: Dietz.
Meijer, A., & Bolívar, M. P. R. (2016). Governing the smart city: a review of the literature on smart urban governance. International Review of Administrative Sciences, 82(2), 392–408.
Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: mapping the debate. Big Data & Society, 3(2).
Mittelstadt, B. D., & Floridi, L. (2016). The ethics of big data: current and foreseeable issues in biomedical contexts. Science and Engineering Ethics, 22(2), 303–341.
Morozov, E. (2014). To save everything, click here: technology, solutionism and the urge to fix problems that don’t exist. London: Penguin Books.
Nam, T. (2012). Suggesting frameworks of citizen-sourcing via Government 2.0. Government Information Quarterly, 29(1), 12–20.
Napoli, P. M. (2014). Automated media: an institutional theory perspective on algorithmic media production and consumption. Communication Theory, 24(3), 340–360.
Newell, S., & Marabelli, M. (2015). Strategic opportunities (and challenges) of algorithmic decision-making: a call for action on the long-term societal effects of “datification”. The Journal of Strategic Information Systems, 24(1), 3–14.
Oliver, A. (2015). Nudging, shoving, and budging: behavioral economic-informed policy. Public Administration, 93(3), 700–714.
O’Reilly, T. (2011). Government as a platform. Innovations: Technology, Governance, Globalization, 6(1), 13–40.
Pagallo, U. (2017). Algo-rhythms and the beat of the legal drum. Philosophy & Technology. http://link.springer.com/10.1007/s13347-017-0277-z (Accessed June 2, 2018).
Pentland, A. (2013). The data-driven society. Scientific American, 309(4), 78–83.
Rahwan, I. (2017). Society-in-the-loop: programming the algorithmic social contract. Ethics and Information Technology, (online first), 1–10.
Rancière, J. (1999). Disagreement: politics and philosophy. Minneapolis: Univ. of Minnesota Press.
Schmitt, C. (1996). The leviathan in the state theory of Thomas Hobbes: meaning and failure of a political symbol. Westport, Conn: Greenwood Press.
Schroeder, R., & Ling, R. (2014). Durkheim and Weber on the social implications of new information and communication technologies. New Media & Society, 16(5), 789–805.
Treib, O., Bähr, H., & Falkner, G. (2007). Modes of governance: towards a conceptual clarification. Journal of European Public Policy, 14(1), 1–20.
Tully, J. (1999). The agonic freedom of citizens. Economy and Society, 28(2), 161–182.
Urbinati, N. (2014). Democracy disfigured: opinion, truth, and the people. Cambridge, Massachusetts: Harvard University Press.
Veale, M., Van Kleek, M., & Binns, R. (2018). Fairness and accountability design needs for algorithmic support in high-stakes public sector decision-making. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18) (pp. 1–14). Montreal, QC: ACM Press. http://dl.acm.org/citation.cfm?doid=3173574.3174014 (Accessed May 16, 2019).
Wachter, S., Mittelstadt, B., & Floridi, L. (2017). Why a right to explanation of automated decision-making does not exist in the general data protection regulation. International Data Privacy Law, 7(2), 76–99.
Williamson, B. (2014). Knowing public services: cross-sector intermediaries and algorithmic governance in public sector reform. Public Policy and Administration, 29(4), 292–312.
Wohlers, T. E., & Bernier, L. L. (2016). Transformation of local government in the digital age. In Setting sail into the age of digital local government (pp. 29–36). Boston, MA: Springer US. http://link.springer.com/10.1007/978-1-4899-7665-9_3 (Accessed November 7, 2016).
Yeung, K. (2017a). “Hypernudge”: big data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136.
Yeung, K. (2017b). Algorithmic regulation: a critical interrogation. Regulation & Governance, (online first), 1–19.
Ziewitz, M. (2016). Governing algorithms: myth, mess, and methods. Science, Technology, & Human Values, 41(1), 3–16.
Zuboff, S. (2019). The age of surveillance capitalism: the fight for the future at the new frontier of power. London: Profile Books.
Zweig, K. A., Wenzelburger, G., & Krafft, T. D. (2018). On chances and risks of security related algorithmic decision making systems. European Journal for Security Research, 3(2), 181–203.
I would like to thank the reviewers for their valuable comments and suggestions. Thanks also go to Joschka Frech for assisting with the preparation of an earlier version of the manuscript.
Cite this article
König, P.D. Dissecting the Algorithmic Leviathan: On the Socio-Political Anatomy of Algorithmic Governance. Philos. Technol. 33, 467–485 (2020). https://doi.org/10.1007/s13347-019-00363-w
- Collective action
- Thomas Hobbes