
Feasibility analysis of the usage of head-up display devices and speech recognition in real vehicular environments

  • Long Paper
  • Published in Universal Access in the Information Society

Abstract

In recent years, the number of on-board devices that provide information about the vehicle, the driving process and the environment has increased. However, these devices can also be highly distracting. Head-up display devices (HUDs) and speech recognition are promising technologies for enriching the driving experience while keeping safety under control. The purpose of this study is therefore to evaluate these technologies under real driving conditions. A total of 50 drivers participated in a study divided into two parts. In the first part, we evaluated the use of driving assistants with HUD devices under real conditions, compared HUDs with conventional head-down display (HDD) screens, and asked users for their opinion on methods of interaction with HUDs. Based on these results, the second part of the study evaluated interaction with HUD devices using speech recognition. Of the participants, 65% preferred HUDs over HDDs for safety reasons, and they also preferred to interact with HUDs using voice commands; 86.66% of the users stated that this method of interaction improved their feeling of safety. The main conclusion is that users agree that driving assistants combined with HUDs are both useful and safe. Moreover, interaction with HUDs through voice commands is accepted by the majority of the users and improves their sensation of safety, because they do not need to look away from the road to use the driving assistants.
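To make the voice-command interaction concrete, the sketch below shows one possible way a driving assistant could map recognized utterances to HUD actions so that the driver's eyes stay on the road. This is a minimal illustrative sketch, not the system evaluated in the study: the command phrases, the HUD actions and the fuzzy-matching threshold are assumptions, and in practice the recognized strings would come from a speech-to-text service rather than being hard-coded.

```python
# Minimal sketch (not the authors' implementation): dispatch recognized voice
# commands to head-up display (HUD) actions. Phrases and actions are hypothetical.

from difflib import get_close_matches

# Hypothetical HUD actions a driving assistant might expose.
HUD_ACTIONS = {
    "show speed": lambda: print("HUD: projecting current speed"),
    "next instruction": lambda: print("HUD: projecting next navigation step"),
    "hide navigation": lambda: print("HUD: clearing navigation overlay"),
    "fuel status": lambda: print("HUD: projecting fuel/consumption info"),
}

def dispatch(utterance: str) -> bool:
    """Map a recognized utterance to the closest known command, if any."""
    phrase = utterance.strip().lower()
    match = get_close_matches(phrase, HUD_ACTIONS.keys(), n=1, cutoff=0.6)
    if not match:
        return False           # below the similarity threshold: ignore it
    HUD_ACTIONS[match[0]]()    # trigger the corresponding HUD update
    return True

if __name__ == "__main__":
    # In a real system these strings would come from a speech recognizer;
    # here they are hard-coded to illustrate matching and rejection.
    for utterance in ["show speed", "next instrucion", "play music"]:
        handled = dispatch(utterance)
        print(f"'{utterance}' -> {'handled' if handled else 'ignored'}")
```

Fuzzy matching is used here only to tolerate small recognition errors; anything below the similarity threshold is ignored, so a misheard phrase cannot trigger an unintended change on the HUD.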

Notes

  1. https://play.google.com/store/apps/details?id=com.google.android.launcher.

  2. https://play.google.com/store/apps/details?id=com.google.android.apps.accessibility.voiceaccess.

  3. http://www.road-eyes.com/.

  4. https://www.r-project.org/.

  5. https://www.navdy.com.

Acknowledgements

This work has been supported by the Council of Gijón (Asturias, Spain) in collaboration with the University Institute of Industrial Technology of Asturias (IUTA) of the University of Oviedo through the Project SV-15-GIJÓN-1.19, by the Spanish National Research Program within the Project TIN2013-41719-R, and by the Science, Technology and Innovation Plan of the Principality of Asturias, within the Project GRUPIN-14-065. We would like to thank the company AND Mobile Solutions for their assistance with the CATED system.

Author information

Corresponding author

Correspondence to José A. Sánchez.

About this article

Cite this article

Sánchez, J.A., Melendi, D., Pozueco, L. et al. Feasibility analysis of the usage of head-up display devices and speech recognition in real vehicular environments. Univ Access Inf Soc 18, 89–105 (2019). https://doi.org/10.1007/s10209-017-0579-z
