Augmented Reality Based Traffic Sign Recognition for Improved Driving Safety

  • Lotfi Abdi
  • Aref Meddeb
  • Faten Ben Abdallah
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9066)

Abstract

In recent years, automotive active safety systems have become increasingly common in road vehicles, since they offer an opportunity to significantly reduce traffic fatalities through active vehicle control. Augmented Reality (AR) applications can enhance intelligent transportation systems by superimposing surrounding traffic information on the user's view, keeping drivers' and pedestrians' attention on the road. However, because of complex environmental conditions such as weather, illumination changes, and geometric distortions, Traffic Sign Recognition (TSR) has always been considered a challenging task. The aim of this paper is to evaluate the effectiveness of AR cues in improving driving safety by deploying an on-board camera-based driver alert system for approaching traffic signs such as stop, speed limit, unique, and danger signs. A new approach is presented for a marker-less AR-TSR system that superimposes virtual objects onto the real scene under all types of driving situations, including unfavorable weather conditions. Our method is composed of an offline and an online stage. The intrinsic camera parameters, which change depending on the zoom value, are calibrated. In the offline stage, Haar-like features with AdaBoost are used to train a Haar detector. In the online stage, the extrinsic camera parameters are estimated using a homography-based method. With the complete set of camera parameters, virtual objects can be coherently inserted into the video sequence captured by the camera, so that synthetic traffic signs may be added to increase safety.
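The online stage described above recovers the extrinsic camera parameters (rotation and translation) from a plane-induced homography once the intrinsics are known. As a minimal sketch of that standard decomposition — not the authors' implementation; the function name and matrices are illustrative — a planar target such as a traffic sign gives H = s·K·[r1 r2 t], so the pose follows from K⁻¹H up to a scale fixed by the unit length of r1:

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover extrinsics (R, t) from a plane-induced homography H,
    given the intrinsic matrix K.

    For a planar target, H = s * K * [r1 r2 t]; the first two rotation
    columns and the translation follow from K^-1 H, with the scale s
    fixed by requiring ||r1|| = 1."""
    A = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(A[:, 0])   # scale so r1 has unit length
    r1 = s * A[:, 0]
    r2 = s * A[:, 1]
    r3 = np.cross(r1, r2)               # complete the right-handed basis
    t = s * A[:, 2]
    R = np.column_stack([r1, r2, r3])
    # Noise makes R only approximately orthonormal; project onto the
    # closest true rotation via SVD.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

A quick round trip illustrates the idea: build H from a known pose and intrinsics, then check that the decomposition returns the same rotation and translation.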

Keywords

Augmented Reality · 3D SURF · Camera Calibration · Traffic Signs · OpenCV



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. National Engineering School of Tunis, University of Tunis-El Manar, Tunis, Tunisia
  2. National Engineering School of Sousse, University of Sousse, Sousse, Tunisia
