Autonomous Weapon Systems – Dangers and Need for an International Prohibition

  • Conference paper
KI 2019: Advances in Artificial Intelligence (KI 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11793)

Abstract

Advances in ICT, robotics and sensors bring autonomous weapon systems (AWS) within reach. Shooting without control by a human operator has military advantages, but also disadvantages – human understanding of the situation and control of events would suffer. Beyond this, compliance with the law of armed conflict is in question. Would it be ethical to allow a machine to take a human life? The increased pace of battle may overburden human understanding and decision making and lead to uncontrolled escalation. An international campaign as well as IT, robotics and AI professionals and enterprises are calling for an international ban on AWS. States have discussed limitations in the UN context, but no consensus has emerged so far. Germany has argued for a ban on fully autonomous weapons, but has not joined the countries proposing an AWS ban, and is using a problematic definition.

An international ban could comprise a prohibition of AWS and a requirement that each use of force must be under meaningful human control (with very few exceptions). If remotely controlled uninhabited weapon systems remain allowed, a priori verification that they cannot attack under computer control is virtually impossible. Compliance could be proved after the fact by secure records of all communication and sensor data and of the actions of the human operator.

The AI and robotics communities could make significant contributions in teaching and by engaging the public and decision makers. Specific research projects could be directed, e.g., at dual use, proliferation risks and scenarios of interaction between two fleets of AWS. Because of high military, political and economic interests in AWS, a ban needs support by an alert public as well as the AI and robotics communities.


Notes

  1. Fratricides by US Patriot missiles in the 2003 war against Iraq [53].

  2. The US DoD definition of “semi-autonomous weapon” – “a weapon system that is intended to only engage individual targets or specific target groups that have been selected by a human operator” [5] – is more problematic in that it does not specify how targets or target groups are to be selected.

  3. For a more differentiated autonomy scale with six steps, see [55].

  4. “The transformation of International Protocols and battlefield ethics into machine-usable representations …”, “Mechanisms to ensure that the design of intelligent behaviors only provides responses within rigorously defined ethical boundaries”, “The development of effective perceptual algorithms capable of superior target discrimination capabilities …”, “The creation of techniques to permit the adaptation of an ethical constraint set and underlying behavioral control parameters that will ensure moral performance …”, “A means to make responsibility assignment clear and explicit for all concerned parties …” [41, p. 211f.].

  5. “The work … is, in fact, merely a suggestion for a computer software system for the ethical governance of robot ‘behaviour’. This is what is known as a ‘back-end system’. Its operation relies entirely on information from systems yet ‘to be developed’ by others sometime in the future. It has no direct access to the real world through sensors or a vision system and it has no means to discriminate between combatant and non-combatant, between a baby and a wounded soldier, or a granny in a wheelchair and a tank. It has no inference engine and certainly cannot negotiate the types of common sense reasoning and battlefield awareness necessary for discrimination or proportionality decisions. There is neither a method for interpreting how the precepts of the laws of war apply in particular contexts nor is there any method for resolving the ambiguities of conflicting laws in novel situations.” [50].

  6. Note that the CFE Treaty in its preamble calls for “establishing a secure and stable balance of conventional forces at lower levels” and for “eliminating disparities detrimental to stability and security” [46]. Unfortunately, the Treaty is no longer in operation with respect to Russia.

  7. Similarly unpredictable, but probably escalatory, interactions can be foreseen if offensive cyber operations were conducted under automatic/autonomous/AI control. Combined with AWS operations, the problems could intensify each other.

  8. The author was one of the founders. In the meantime, the number of members has grown to 33 [52].

  9. The full name is “Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects”. This framework convention was concluded in 1980 and has five specific protocols, the most relevant in the present context being Protocol IV that prohibits blinding laser weapons [45]. There are 125 member states, including practically all states with relevant militaries [51].

  10. What this can mean in detail is explained in [54].

  11. The Swiss Federal Office for Defence Procurement – armasuisse – has re-enacted the scene and shown that a shaped charge of 3 g explosive can penetrate a skull emulator [47].

  12. In military parlance, “lethal” is mostly understood as “destructive”, not explicitly as killing people, as e.g. in the military notions of “target kill” or “mission kill”. The use of the term LAWS for the CCW expert meetings was not intended to exclude weapons against matériel or non-lethal weapons (personal communication from Ambassador Jean-Hugues Simon-Michel of France, first chair of the expert meetings).

  13. See the respective discussion in the life sciences (e.g. [49]) and the wider German Leopoldina-DFG “Joint Committee for the Handling of Security-Relevant Research” [48].

  14. The Trump administration no longer mentions the offset strategy explicitly, but continues emphasising the need to maintain “decisive and sustained U.S. military advantages” or “overmatch” [56, p. 4], [43, p. 28].

  15. Russia: “Whoever becomes the leader in this sphere [AI] will become the ruler of the world.” (Putin) [42]. China: “[T]he PLA intends to ‘seize the advantage in military competition and the initiative in future warfare,’ seeking the capability to win in not only today’s informatized warfare but also future intelligentized warfare, in which AI and related technologies will be a cornerstone of military power.” [57, p. 13]. The USA is more circumspect: “The Trump Administration’s National Security Strategy recognizes the need to lead in artificial intelligence, and the Department of Defense is investing accordingly.” [44].

  16. As in the case of the Anti-Personnel Land Mine Convention (1997), initiated by Canada, and the Cluster Munitions Convention (2008), initiated by Norway.

References

  1. Bhuta, N., Beck, S., Geiß, R., Liu, H.-Y., Kreß, C. (eds.): Autonomous Weapons Systems. Law, Ethics, Policy. Cambridge University Press, Cambridge (2016)

  2. Scharre, P.: Army of None: Autonomous Weapons and the Future of War. Norton, New York (2018)

  3. New America Foundation (2019). https://www.newamerica.org/in-depth/world-of-drones/3-who-has-what-countries-armed-drones. Accessed 16 July 2019

  4. Wezeman, P.D., Fleurant, A., Kuimova, A., Tian, N., Wezeman, S.T.: Trends in international arms transfers, 2018, March 2019. https://www.sipri.org/sites/default/files/2019-03/fs_1903_at_2018.pdf. Accessed 16 July 2019

  5. US Department of Defense: Autonomy in Weapon Systems (incorporating Change 1, May 8, 2017), 21 November 2012. http://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf. Accessed 5 July 2019

  6. International Committee of the Red Cross: Ethics and autonomous weapon systems: An ethical basis for human control?, 3 April 2018. https://www.icrc.org/en/download/file/69961/icrc_ethics_and_autonomous_weapon_systems_report_3_april_2018.pdf. Accessed 5 July 2019

  7. Walpole, L.: The True Cost of Drone Warfare?, 8 June 2018. https://www.oxfordresearchgroup.org.uk/blog/the-true-cost-of-drone-warfare. Accessed 16 July 2019

  8. Sauer, F., Schörnig, N.: Killer drones – the silver bullet of democratic warfare? Secur. Dialogue 43(4), 353–370 (2012)

  9. US Department of Defense: Unmanned Systems Roadmap 2007-2032 (2007). http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA475002

  10. US Department of Defense: Unmanned Systems Integrated Roadmap FY2013-2038 (2013). http://www.dtic.mil/get-tr-doc/pdf?AD=ADA592015. Accessed 5 July 2019

  11. US Department of Defense: Unmanned Systems Integrated Roadmap 2017-2042, 28 August 2018. http://cdn.defensedaily.com/wp-content/uploads/post_attachment/206477.pdf. Accessed 5 July 2019

  12. Bendett, S.: Russia Is Poised to Surprise the US in Battlefield Robotics, 25 January 2018. https://www.defenseone.com/ideas/2018/01/russia-poised-surprise-us-battlefield-robotics/145439/. Accessed 8 July 2019

  13. Kania, E.B.: Battlefield Singularity: Artificial Intelligence, Military Revolution, and China’s Future Military Power, 28 November 2017. https://www.cnas.org/publications/reports/battlefield-singularity-artificial-intelligence-military-revolution-and-chinas-future-military-power. Accessed 9 July 2019

  14. Allen, G.C.: Understanding China’s AI Strategy – Clues to Chinese Strategic Thinking on Artificial Intelligence and National Security, 6 February 2019. https://www.cnas.org/publications/reports/understanding-chinas-ai-strategy. Accessed 18 February 2019

  15. Altmann, J.: Präventive Rüstungskontrolle. Die Friedens-Warte 83(2–3), 105–126 (2008)

  16. Altmann, J.: Nanotechnology and Preventive Arms Control (2005). https://bundesstiftung-friedensforschung.de/wp-content/uploads/2017/08/berichtaltmann.pdf. Accessed 16 July 2019

  17. Altmann, J.: Arms control for armed uninhabited vehicles: an ethical issue. Ethics Inf. Technol. 15(2), 137–152 (2013)

  18. Altmann, J., Sauer, F.: Autonomous weapon systems. Survival 59(5), 117–142 (2017)

  19. Campaign to Stop Killer Robots (2019). https://www.stopkillerrobots.org/members/. Accessed 11 July 2019

  20. Heyns, C.: Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, 9 April 2013. http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf. Accessed 11 July 2019

  21. Campaign to Stop Killer Robots: Country Views on Killer Robots, 22 November 2018. https://www.stopkillerrobots.org/wp-content/uploads/2018/11/KRC_CountryViews22Nov2018.pdf. Accessed 12 July 2019

  22. Computing experts from 37 countries call for ban on killer robots – Decision to apply violent force must not be delegated to machines, 15 October 2013. https://www.icrac.net/wp-content/uploads/2018/06/Scientist-Call_Press-Release.pdf. Accessed 12 July 2019

  23. Autonomous Weapons: an Open Letter from AI & Robotics Researchers, 28 July 2015. https://futureoflife.org/open-letter-autonomous-weapons. Accessed 12 July 2019

  24. The 30717 Open Letter Signatories Include (2019). http://futureoflife.org/awos-signatories/. Accessed 12 July 2019

  25. Slaughterbots (Video 7:47), November 2017. https://www.youtube.com/watch?v=9CO6M2HsoIA. Accessed 12 July 2019

  26. Future of Life Institute (2019). https://futureoflife.org/lethal-autonomous-weapons-pledge/. Accessed 12 July 2019

  27. Gesellschaft für Informatik: Tödliche autonome Waffensysteme (LAWS) müssen völkerrechtlich geächtet werden, February 2019. https://gi.de/fileadmin/GI/Allgemein/PDF/GI-Stellungnahme_LAWS_2019-02.pdf. Accessed 12 July 2019

  28. Bundesverband der Deutschen Industrie: Künstliche Intelligenz in Sicherheit und Verteidigung, January 2019. https://issuu.com/bdi-berlin/docs/20181205_position_bdi_ki. Accessed 12 July 2019

  29. O’Sullivan, L.: I Quit My Job to Protest My Company’s Work on Building Killer Robots. American Civil Liberties Union, 6 March 2019. https://www.aclu.org/blog/national-security/targeted-killing/i-quit-my-job-protest-my-companys-work-building-killer. Accessed 12 July 2019

  30. Conger, K., Metz, C.: Tech Workers Now Want to Know: What Are We Building This For? New York Times, 7 October 2018. https://www.nytimes.com/2018/10/07/technology/tech-workers-ask-censorship-surveillance.html. Accessed 12 July 2019

  31. UK Ministry of Defence: Unmanned Aircraft Systems, August 2017. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/640299/20170706_JDP_0-30.2_final_CM_web.pdf. Accessed 13 July 2019

  32. Bundesministerium der Verteidigung, Pol II 5. Definitionsentwurf deutsch/englisch: Letales Autonomes Waffensystem. Personal communication (2014)

  33. Germany: Statement delivered by Germany on Working Definition of LAWS/Definition of Systems under Consideration, April 2018. https://www.unog.ch/80256EDD006B8954/(httpAssets)/2440CD1922B86091C12582720057898F/%24file/2018_LAWS6a_Germany.pdf. Accessed 13 July 2019

  34. Maas, H. (Minister for Foreign Affairs of Germany): Speech at the general debate of the 73rd General Assembly of the United Nations, 28 September 2018. https://gadebate.un.org/sites/default/files/gastatements/73/de_en.pdf. Accessed 13 July 2019

  35. Amoroso, D., Sauer, F., Sharkey, N., Suchman, L.: Autonomy in Weapon Systems – The Military Application of Artificial Intelligence as a Litmus Test for Germany’s New Foreign and Security Policy, 23 May 2018. https://www.boell.de/sites/default/files/boell_autonomy-in-weapon-systems_v04_kommentierbar_1.pdf. Accessed 13 July 2019

  36. Gubrud, M., Altmann, J.: Compliance Measures for an Autonomous Weapons Convention, May 2013. https://www.icrac.net/wp-content/uploads/2018/04/Gubrud-Altmann_Compliance-Measures-AWC_ICRAC-WP2.pdf. Accessed 12 July 2019

  37. Work, R.: Deputy Secretary of Defense Speech, 14 December 2015. https://www.defense.gov/News/Speeches/Speech-View/Article/634214/cnas-defense-forum. Accessed 9 July 2019

  38. Tucker, P.: Russian Weapons Maker To Build AI-Directed Guns, 14 July 2017. http://www.defenseone.com/technology/2017/07/russian-weapons-maker-build-ai-guns/139452/. Accessed 9 July 2019

  39. TASS: Russia is developing artificial intelligence for military and civilian drones, 15 May 2017. http://tass.com/defense/945950. Accessed 9 July 2019

  40. Sharkov, D.: Vladimir Putin Talks Ruling the World, Future Wars And Life On Mars, 1 September 2017. https://www.newsweek.com/vladimir-putin-talks-ruling-world-future-wars-and-life-mars-658579. Accessed 9 July 2019

  41. Arkin, R.C.: Governing Lethal Behavior in Autonomous Robots. Chapman & Hall/CRC, Boca Raton (2009)

  42. Russia Today: ‘Whoever leads in AI will rule the world’: Putin to Russian children on Knowledge Day, 1 September 2017. https://www.rt.com/news/401731-ai-rule-world-putin/. Accessed 29 November 2017

  43. President of the USA: National Security Strategy of the United States of America, December 2017. https://www.whitehouse.gov/wp-content/uploads/2017/12/NSS-Final-12-18-2017-0905.pdf. Accessed 10 July 2019

  44. White House: Artificial Intelligence for the American People, 10 May 2018. https://www.whitehouse.gov/briefings-statements/artificial-intelligence-american-people/. Accessed 10 July 2019

  45. Protocol on Blinding Laser Weapons (Protocol IV), 13 October 1995. https://www.unog.ch/80256EDD006B8954/(httpAssets)/8463F2782F711A13C12571DE005BCF1A/$file/PROTOCOL+IV.pdf. Accessed 11 July 2019

  46. Organization for Security and Co-operation in Europe: Treaty on Conventional Armed Forces in Europe, 19 November 1990. http://www.osce.org/library/14087. Accessed 11 July 2019

  47. Drapela, P.: Fake news? Lethal effect of micro drones, 11 April 2018. https://www.ar.admin.ch/en/armasuisse-wissenschaft-und-technologie-w-t/home.detail.news.html/ar-internet/news-2018/news-w-t/lethalmicrodrones.html. Accessed 12 July 2019

  48. Scientific Freedom and Scientific Responsibility (2019). https://www.leopoldina.org/en/about-us/cooperations/joint-committee-on-dual-use/. Accessed 12 July 2019

  49. World Health Organization: Dual Use Research of Concern (DURC) (2019). https://www.who.int/csr/durc/en/. Accessed 12 July 2019

  50. Sharkey, N.E.: The evitability of autonomous robot warfare. Int. Rev. Red Cross 94, 787–799 (2012)

  51. United Nations Office at Geneva: High Contracting Parties and Signatories (2019). https://www.unog.ch/80256EE600585943/(httpPages)/3CE7CFC0AA4A7548C12571C00039CB0C?OpenDocument. Accessed 11 July 2019

  52. Members (2019). https://www.icrac.net/members/. Accessed 11 July 2019

  53. Report of the Defense Science Board Task Force on Patriot System Performance, January 2005. https://www.acq.osd.mil/dsb/reports/2000s/ADA435837.pdf. Accessed 16 July 2019

  54. Sharkey, N.: Staying in the loop. Human supervisory control of weapons. In: Bhuta, N., Beck, S., Geiß, R., Liu, H., Kreß, C. (eds.) Autonomous Weapons Systems. Law, Ethics, Policy, pp. 23–28. Cambridge University Press, Cambridge (2016)

  55. US Air Force: Autonomous Horizons – System Autonomy in the Air Force – A Path to the Future, Volume I, Human-Autonomy Teaming, AF/ST TR 15-01. United States Air Force, Office of the Chief Scientist, June 2015. http://www.af.mil/Portals/1/documents/SECAF/AutonomousHorizons.pdf. Accessed 22 July 2019

  56. US Department of Defense: Summary of the 2018 National Defense Strategy of the United States of America – Sharpening the American Military’s Competitive Edge (2018). https://dod.defense.gov/Portals/1/Documents/pubs/2018-National-Defense-Strategy-Summary.pdf. Accessed 10 July 2019

  57. He, L. (vice president of the PLA’s Academy of Military Science): Establish a Modern Military Theory System with Chinese Characteristics. Study Times, 19 June 2017. Cited by Kania, E.B., Battlefield Singularity: Artificial Intelligence, Military Revolution, and China’s Future Military Power, 28 November 2017. https://www.cnas.org/publications/reports/battlefield-singularity-artificial-intelligence-military-revolution-and-chinas-future-military-power. Accessed 9 July 2019

Author information

Correspondence to Jürgen Altmann.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Altmann, J. (2019). Autonomous Weapon Systems – Dangers and Need for an International Prohibition. In: Benzmüller, C., Stuckenschmidt, H. (eds) KI 2019: Advances in Artificial Intelligence. KI 2019. Lecture Notes in Computer Science, vol 11793. Springer, Cham. https://doi.org/10.1007/978-3-030-30179-8_1

  • DOI: https://doi.org/10.1007/978-3-030-30179-8_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30178-1

  • Online ISBN: 978-3-030-30179-8

  • eBook Packages: Computer Science, Computer Science (R0)
