Considerations of Artificial Intelligence Safety Engineering for Unmanned Aircraft

  • Sebastian Schirmer
  • Christoph Torens
  • Florian Nikodem
  • Johann Dauer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11094)


Unmanned aircraft systems promise to be useful for a multitude of applications such as cargo transport and disaster recovery. Research on increased autonomous decision-making capabilities is therefore growing and advancing rapidly. However, the safe use, certification, and broad airspace integration of unmanned aircraft remain unclear. Standards for the development and verification of manned aircraft are either only partially applicable, or the resulting safety and verification efforts are impractical due to the higher level of autonomy required by unmanned aircraft. Moreover, machine learning techniques are hard for a human to interpret, and their outcome depends strongly on the training data. This work presents current certification practices in unmanned aviation in the context of autonomy and artificial intelligence. Specifically, we describe the recently introduced categories of unmanned aircraft systems and the specific operations risk assessment, which provide a means of flight permission that focuses not solely on the aircraft but also incorporates the target operation. As an example, we show how the specific operations risk assessment might serve as an enabler for hard-to-certify techniques by taking the operation into account during system design.


Aerospace · Certification · AI-based system · Unmanned aircraft systems · Verification and validation
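The operation-centric idea described in the abstract can be illustrated with a runtime geofence monitor: rather than certifying the full (possibly AI-based) autonomy stack, a small, verifiable monitor confines the aircraft to the approved operational volume, mitigating the operational risk at the system level. The sketch below is hypothetical and not taken from the paper; the `GeofenceMonitor` class, its parameters, and the ray-casting containment test are illustrative assumptions.

```python
# Hypothetical sketch: a simple runtime monitor that checks whether the
# aircraft state stays inside an approved operational volume (a lateral
# polygon plus an altitude ceiling). Such a monitor is far easier to
# verify than the autonomy functions it supervises.

def point_in_polygon(x, y, polygon):
    """Ray-casting containment test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Toggle 'inside' each time a horizontal ray from (x, y) crosses an edge.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside


class GeofenceMonitor:
    """Checks aircraft states against an approved operational volume."""

    def __init__(self, boundary, max_altitude_m):
        self.boundary = boundary          # lateral geofence polygon
        self.max_altitude_m = max_altitude_m  # altitude ceiling in metres

    def check(self, x, y, altitude_m):
        """Return True if the state is within the approved volume."""
        return (altitude_m <= self.max_altitude_m
                and point_in_polygon(x, y, self.boundary))


# Example: a 100 m x 100 m operational area with a 120 m ceiling.
monitor = GeofenceMonitor(boundary=[(0, 0), (100, 0), (100, 100), (0, 100)],
                          max_altitude_m=120.0)
print(monitor.check(50, 50, 80.0))    # inside the volume -> True
print(monitor.check(150, 50, 80.0))   # outside the lateral boundary -> False
print(monitor.check(50, 50, 200.0))   # above the ceiling -> False
```

In a specific-operations risk assessment, the argument would rest on the monitor (and its triggered contingency, e.g. a flight termination system) rather than on the behaviour of the autonomy functions themselves.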



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Sebastian Schirmer (1)
  • Christoph Torens (1)
  • Florian Nikodem (1)
  • Johann Dauer (1)

  1. German Aerospace Center (DLR), Institute of Flight Systems, Braunschweig, Germany
