
Applying a Social-Relational Model to Explore the Curious Case of hitchBOT

  • Frances Grodzinsky
  • Marty J. Wolf
  • Keith Miller
Chapter
Part of the Philosophical Studies Series book series (PSSP, volume 134)

Abstract

This paper applies social-relational models of the moral standing of robots to cases where encounters between the robot and humans are relatively brief. Our analysis spans the spectrum from non-social robots to fully social robots. We consider cases where the encounter is between the robot and a stranger, with the robot’s owner or operator absent. We conclude that the developers of robots that might be encountered by other people when the owner is not present cannot wash their hands of responsibility. They must take care with how they develop the robot’s interface with people and take into account how that interface shapes the social relationship between the robot and each person it encounters, and thus the robot’s moral standing with that person. Furthermore, we claim that developers bear responsibility for the impact social robots have on the quality of human social relationships.

Keywords

hitchBOT · Robot-human interaction · Robotic interfaces · Social robotics · Social-relational model · Anthropomorphic framing · Robotic design


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Frances Grodzinsky (1)
  • Marty J. Wolf (2)
  • Keith Miller (3)
  1. Hersher Institute, Sacred Heart University, Fairfield, USA
  2. Bemidji State University, Bemidji, USA
  3. University of Missouri, St. Louis, USA
