Abstract
This paper applies social-relational models of the moral standing of robots to cases where encounters between the robot and humans are relatively brief. Our analysis spans the spectrum from non-social to fully social robots. We consider cases where the encounter is between the robot and a stranger, without the robot's owner or operator present. We conclude that developers of robots that might be encountered by other people when the owner is not present cannot wash their hands of responsibility. They must take care with how they develop the robot's interface with people and take into account how that interface influences the social relationship between the robot and people, and thus the moral standing of the robot with each person it encounters. Furthermore, we claim that developers bear responsibility for the impact social robots have on the quality of human social relationships.
Notes
1. The hitchBOT handlers refer to their robot using "who," a pronoun typically reserved for humans. Further, the act of developing friendships is typically not ascribed to robots. These issues are taken up later in the paper.
References
Coeckelbergh, Mark. 2010. Robot rights? Towards a social-relational justification of moral consideration. Ethics and Information Technology 12 (3): 209–221.
———. 2014. The moral standing of machines: Towards a relational and non-Cartesian moral hermeneutics. Philosophy & Technology 27 (1): 61–77.
Darling, Kate. 2015. ‘Who’s Johnny?’ Anthropomorphic framing in human-robot interaction, integration, and policy. Proceedings of WE Robot Conference on Robotics, Law & Policy 2015.
de Laat, Paul B. 2016. Trusting the (ro)botic other: By assumption. ACM SIGCAS Computers and Society 45 (3): 255–260.
Grodzinsky, Frances S., Keith W. Miller, and Marty J. Wolf. 2011. Developing artificial agents worthy of trust: “Would you buy a used car from this artificial agent?”. Ethics and Information Technology 13 (1): 17–27.
———. 2015. Developing automated deceptions and the impact on trust. Ethics and Information Technology 28 (1): 91–105.
hitchBOT: A robot exploring the world. 2015. www.hitchbot.me/about. Accessed 28 Dec 2015.
Naughton, K. 2015. Humans are slamming into driverless cars and exposing a key flaw. Bloomberg News. www.bloomberg.com/news/articles/2015-12-18/humans-are-slamming-into-driverless-cars-and-exposing-a-key-flaw. Accessed 8 Jan 2016.
Turkle, Sherry. 2011. Alone together. New York: Basic Books.
VanderMaas, Johanna. 2015. hitchBOT USA tour comes to an early end in Philadelphia. cdn1.hitchbot.me/wp-content/uploads/2015/08/hitchBOT-USA-Trip-End-Press-Release-FINAL.pdf. Accessed 28 Dec 2015.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this chapter
Grodzinsky, F., Wolf, M.J., Miller, K. (2019). Applying a Social-Relational Model to Explore the Curious Case of hitchBOT. In: Berkich, D., d'Alfonso, M. (eds) On the Cognitive, Ethical, and Scientific Dimensions of Artificial Intelligence. Philosophical Studies Series, vol 134. Springer, Cham. https://doi.org/10.1007/978-3-030-01800-9_17
DOI: https://doi.org/10.1007/978-3-030-01800-9_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-01799-6
Online ISBN: 978-3-030-01800-9
eBook Packages: Computer Science (R0)