Institutional Facts and AMAs in Society

  • Conference paper
  • First Online:
Philosophy and Theory of Artificial Intelligence 2017 (PT-AI 2017)

Part of the book series: Studies in Applied Philosophy, Epistemology and Rational Ethics (SAPERE, volume 44)

Abstract

Which moral principles should artificial moral agents (AMAs) act upon? This is not an easy problem. Even harder, however, is the problem of identifying and differentiating the elements of any moral event, and then determining how those elements relate to one's preferred moral principle, if any. This difficulty arises from the very nature of morally relevant phenomena: social facts. As Searle points out, unlike brute facts about the physical world, facts about social reality, which he calls institutional facts, have a subjective ontology and exist only within a social environment. The appropriate way to learn these facts is through interaction. But what should this interaction be like, and with whom, especially in the case of artificial agents before they become 'mature'? This suggests that we face a problem very similar to that of raising a child.


References

  1. Metzinger, T.: M-autonomy. J. Conscious. Stud. 22, 270–302 (2015)

  2. Searle, J.: Making the Social World. Oxford University Press, New York (2010)

  3. Seibt, J.: Towards an ontology of simulated social interaction: varieties of the “As If” for robots and humans. In: Hakli, R., Seibt, J. (eds.) Sociality and Normativity for Robots. Studies in the Philosophy of Sociality (9), pp. 11–40. Springer, Cham (2017)

  4. Tonkens, R.: A challenge for machine ethics. Mind. Mach. 19, 421–438 (2009)

  5. Wallach, W., Allen, C.: Moral Machines. Oxford University Press, New York (2009)

  6. White, J.B.: Autonomous reboot: the challenges of artificial moral agency and the ends of machine ethics. https://philpapers.org/rec/WHIART-6. Accessed 31 Mar 2018


Author information


Correspondence to Arzu Gokmen.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Gokmen, A. (2018). Institutional Facts and AMAs in Society. In: Müller, V. (eds) Philosophy and Theory of Artificial Intelligence 2017. PT-AI 2017. Studies in Applied Philosophy, Epistemology and Rational Ethics, vol 44. Springer, Cham. https://doi.org/10.1007/978-3-319-96448-5_26

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-96448-5_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-96447-8

  • Online ISBN: 978-3-319-96448-5

  • eBook Packages: Computer Science, Computer Science (R0)
