Designing a Proactive Risk Mitigation Environment for Integrated Autonomous Vehicle and Human Infrastructure

  • Caitlin Anne Surakitbanharn
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 722)

Abstract

Many safety and implementation issues plague the broad deployment of autonomous vehicles. A viable path to address these issues may include borrowing technology and policy design from the air traffic control (ATC) space. Air traffic currently operates in a mixed human-automation environment made possible through operational restrictions. Future plans for air traffic systems include full automation, but proactive risk mitigation will be necessary to manage system and human errors and prevent catastrophic incidents. Autonomous vehicles will likely operate in a mixed environment upon initial implementation, and the same type of forward-facing risk assessment, mitigation, and restriction will be necessary to provide a safe transportation environment. This paper evaluates the adaptation of “critical pairs” from aviation to autonomous vehicles as a proactive risk mitigation tool. The implementation of critical pairs evaluates each vehicle in relation to the others and, based on feasible errors, determines speed and position adjustments that would avoid a collision should such errors occur. This type of proactive risk assessment would help prevent collisions or other dangerous events by giving vehicles enough space and time to preemptively react to otherwise unexpected errors. This information can be used to determine if, how, and when errors may occur that would endanger other vehicles, and the assessment may remain a human-monitored function even under fully autonomous driving. This paper also addresses the infrastructure and regulatory changes, such as dedicated autonomous vehicle roadways, pedestrian infrastructure, and specialized transition areas, that would be needed to support the transition to a transportation system that includes autonomous vehicles and, eventually, a fully autonomous environment.
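To make the critical-pair idea concrete, the sketch below shows one way a pairwise check might look for two vehicles in the same lane. It is an illustrative adaptation, not the method from the paper: the worst-case error model (leader brakes hard, follower reacts late), the `Vehicle` type, and all parameter values are hypothetical assumptions chosen for the example.

```python
"""Minimal sketch of a "critical pair" check for two same-lane vehicles.

Illustrative only: the error model and all parameters are assumptions,
not the paper's implementation.
"""
from dataclasses import dataclass


@dataclass
class Vehicle:
    position_m: float   # distance along the roadway (larger = farther ahead)
    speed_mps: float    # current speed, metres per second


def required_gap_m(follower: Vehicle, leader: Vehicle,
                   reaction_s: float = 1.0,
                   follower_brake_mps2: float = 4.0,
                   leader_brake_mps2: float = 8.0,
                   margin_m: float = 2.0) -> float:
    """Gap needed so the follower can stop even under a feasible worst-case
    error: the leader brakes at full deceleration while the follower keeps
    its speed for `reaction_s` seconds before braking more gently."""
    follower_stop = (follower.speed_mps * reaction_s
                     + follower.speed_mps ** 2 / (2.0 * follower_brake_mps2))
    leader_stop = leader.speed_mps ** 2 / (2.0 * leader_brake_mps2)
    return max(0.0, follower_stop - leader_stop) + margin_m


def is_critical_pair(follower: Vehicle, leader: Vehicle) -> bool:
    """A pair is 'critical' if the current gap cannot absorb the worst-case error."""
    gap = leader.position_m - follower.position_m
    return gap < required_gap_m(follower, leader)


def safe_follower_speed_mps(follower: Vehicle, leader: Vehicle,
                            step_mps: float = 0.5) -> float:
    """Largest follower speed (searched in small decrements) at which the pair
    is no longer critical -- a simple stand-in for a proactive speed adjustment."""
    speed = follower.speed_mps
    while speed > 0.0 and is_critical_pair(Vehicle(follower.position_m, speed), leader):
        speed = max(0.0, speed - step_mps)
    return speed


if __name__ == "__main__":
    leader = Vehicle(position_m=60.0, speed_mps=25.0)
    follower = Vehicle(position_m=0.0, speed_mps=30.0)
    print("critical pair:", is_critical_pair(follower, leader))
    print("suggested follower speed (m/s):", safe_follower_speed_mps(follower, leader))
```

In a full system this check would run over every relevant vehicle pair (and over richer error sets than hard braking alone), with the resulting speed and position adjustments applied before any error actually occurs.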

Keywords

Autonomous vehicles · Human-computer interaction · Smart infrastructure · Automation safety

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Purdue Policy Research Institute, Purdue University, West Lafayette, USA