While organizations today make extensive use of complex algorithms, the notion of algorithmic accountability remains an elusive ideal due to the opacity and fluidity of algorithms. In this article, we develop a framework for managing algorithmic accountability that highlights three interrelated dimensions: reputational concerns, engagement strategies, and discourse principles. The framework clarifies (a) that accountability processes for algorithms are driven by reputational concerns about the epistemic setup, opacity, and outcomes of algorithms; (b) that the way in which organizations practically engage with emergent expectations about algorithms may be manipulative, adaptive, or moral; and (c) that when accountability relationships are heavily burdened by the opacity and fluidity of complex algorithmic systems, the emphasis of engagement should shift to a rational communication process through which a continuous and tentative assessment of the development, workings, and consequences of algorithms can be achieved over time. The degree to which such engagement is, in fact, rational can be assessed based on four discourse-ethical principles: participation, comprehension, multivocality, and responsiveness. We conclude that the framework may help organizations and their environments to jointly work toward greater accountability for complex algorithms. It may further help organizations in reputational positioning surrounding accountability issues. The discourse-ethical principles introduced in this article are meant to elevate these positioning contests beyond mere adaptation or compliance and to help guide organizations toward moral and forward-looking solutions to accountability issues.
This work was financially supported by the Norwegian Research Council as part of their Fair Labor in the Digitized Economy project (Grant Number 247725/O70).
Buhmann, A., Paßmann, J. & Fieseler, C. Managing Algorithmic Accountability: Balancing Reputational Concerns, Engagement Strategies, and the Potential of Rational Discourse. J Bus Ethics 163, 265–280 (2020). https://doi.org/10.1007/s10551-019-04226-4
Keywords: Discourse ethics