Universal Access in the Information Society, Volume 18, Issue 1, pp 89–105

Feasibility analysis of the usage of head-up display devices and speech recognition in real vehicular environments

  • José A. Sánchez
  • David Melendi
  • Laura Pozueco
  • Xabiel G. Pañeda
  • Roberto García
Long Paper


Abstract

In recent years, the number of on-board devices that provide information about the vehicle, the driving process and the environment has increased. Nevertheless, these devices can be very distracting. Head-up display devices (HUDs) and speech recognition may be good technologies to enrich the experience of drivers while keeping safety under control. Thus, the purpose of this study is to evaluate these technologies under real conditions. A total of 50 drivers participated in a study divided into two parts. In the first part, we evaluated the use of driving assistants with HUD devices under real conditions. We also compared HUDs with conventional head-down display (HDD) screens. Finally, we asked users about their opinion on methods of interaction with HUDs. Based on these results, the second part of the study evaluated interaction with HUD devices using speech recognition. Of the drivers, 65% prefer to use HUDs instead of HDDs for safety reasons. Furthermore, the participants prefer to interact with HUDs using voice commands: 86.66% of the users stated that this method of interaction improved their feeling of safety. The main conclusion is that users agree that driving assistants combined with HUDs are both useful and safe. Moreover, interaction with HUDs through voice commands is accepted by the majority of the users; it improves their sensation of safety because they do not need to look away from the road to use driving assistants.
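Although the study evaluates interaction with commercial systems rather than publishing code, the voice-command interaction it examines amounts to routing a recognized utterance to a display action so the driver never looks away from the road. A minimal illustrative sketch of that dispatch pattern (all class, function and command names here are hypothetical, not from the paper) might look like:

```python
# Illustrative sketch only: routing recognized voice commands to HUD
# actions. Speech recognition itself is assumed to happen upstream and
# deliver a plain-text transcript.
from dataclasses import dataclass, field


@dataclass
class HudAssistant:
    """Minimal stand-in for a HUD driving assistant."""
    messages: list = field(default_factory=list)

    def show_speed(self):
        # Render current speed on the windshield display.
        self.messages.append("speed")

    def show_navigation(self):
        # Render turn-by-turn guidance on the windshield display.
        self.messages.append("navigation")


# Map normalized utterances to HUD actions (unbound methods).
COMMANDS = {
    "show speed": HudAssistant.show_speed,
    "show navigation": HudAssistant.show_navigation,
}


def dispatch(hud: HudAssistant, transcript: str) -> bool:
    """Route a recognized utterance to a HUD action.

    Unknown input is ignored so that recognition errors never trigger
    an unintended action. Returns True if a command was executed.
    """
    action = COMMANDS.get(transcript.strip().lower())
    if action is None:
        return False
    action(hud)
    return True
```

The point of the table-driven design is that commands can be added or localized without touching the dispatch logic, and unrecognized speech fails safely to a no-op.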


Keywords: Efficient driving · Feeling of driving safety · Head-up display devices · Voice interaction



Acknowledgements

This work has been supported by the Council of Gijón (Asturias, Spain) in collaboration with the University Institute of Industrial Technology of Asturias (IUTA) of the University of Oviedo through the Project SV-15-GIJÓN-1.19, by the Spanish National Research Program within the Project TIN2013-41719-R, and by the Science, Technology and Innovation Plan of the Principality of Asturias, within the Project GRUPIN-14-065. We would like to thank the company AND Mobile Solutions for their assistance with the CATED system.



Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. Department of Informatics, Escuela Politécnica de Ingeniería, University of Oviedo, Gijón, Spain
