Evaluation of an Intelligent Collision Warning System for Forklift Truck Drivers in Industry

  • Armin Lang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10917)


The number of collisions caused by lift trucks in intralogistics is still increasing despite the availability of collision avoidance systems. Commercially available collision-avoidance products mounted on forklifts often issue false alarms whenever minimum distances are undercut during daily work; consequently, the drivers sooner or later switch those systems off. A collision warning system based on computer-vision methods combined with a time-of-flight camera delivering 2D and 3D data can overcome such excessive warnings in warehouse situations. The 3D data are used to identify objects by clustering as well as to obtain information about the movement of objects in the forklift’s path. Machine-learning algorithms use the 2D data mainly to detect people in the path. Distinguishing people from non-human objects makes it possible to establish a two-level warning system that warns earlier when humans are endangered than in collision situations in which no humans are in sight. The system’s general functionality has already been proven in lab tests. To transfer the academic results to an industrial environment, the same test procedure was executed during daily work in a warehouse of a company in the production sector. In this paper, the authors list the differences and commonalities between the academic and the industrial test runs.
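The two-level warning scheme described in the abstract can be sketched as a simple decision rule: a detected person triggers a warning at a larger distance than a non-human obstacle does. The following is a minimal illustrative sketch, not the authors' implementation; the function name and the threshold values are assumptions chosen for illustration only.

```python
def warning_level(distance_m: float, is_person: bool,
                  person_threshold_m: float = 5.0,
                  object_threshold_m: float = 2.0) -> str:
    """Two-level collision warning (illustrative sketch).

    Thresholds are hypothetical placeholders, not values from the paper:
    - a person in the path warns early, at `person_threshold_m`
    - any other obstacle warns only at the closer `object_threshold_m`
    """
    if is_person and distance_m <= person_threshold_m:
        return "early_warning"      # humans endangered: warn sooner
    if distance_m <= object_threshold_m:
        return "collision_warning"  # non-human obstacle close by
    return "none"                   # no warning issued
```

Under this sketch, an object at 4 m ahead yields `"early_warning"` if classified as a person but `"none"` otherwise, which is exactly the behavior that suppresses the excessive alarms of distance-only systems.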


Forklift safety · Warning system · Computer vision · People detection · Machine learning · Evaluation



The “PräVISION” project is funded by the German Social Accident Insurance (DGUV). The project partners involved are:

– BIBA Bremen
– Berufsgenossenschaft Handel und Warendistribution (BGHW)



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Chair of Materials Handling, Material Flow, Logistics, Technical University of Munich, Munich, Germany
