
Real-time motion detection based on SW/HW-codesign for walking rescue robots

Journal of Real-Time Image Processing (Special Issue)

Abstract

In a rescue operation, walking robots offer a great deal of flexibility in traversing uneven terrain in an uncontrolled environment. For such a rescue robot, each motion is a potential vital sign: the robot should be sensitive enough to detect such motion while maintaining high accuracy to avoid false alarms. However, existing techniques for motion detection have severe limitations in dealing with the strong ego-motion of walking robots. This paper proposes an optical-flow-based method for detecting moving objects using a single camera mounted on a hexapod robot. The proposed algorithm estimates and compensates ego-motion using a first-order-flow motion model, allowing objects to be detected from a continuously moving robot. The algorithm can deal with strong rotation and translation in 3D, with four degrees of freedom. Two alternative object-detection methods, based on 2D-histogram vector clustering and on motion-compensated frame differencing, are examined for detecting slow- and fast-moving objects, respectively. The FPGA implementation, with resource utilization optimized through SW/HW codesign, processes video frames in real time at 31 fps. The new algorithm offers a significant improvement over the state of the art under harsh conditions and performs equally well under smooth motion.
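As a rough illustration of the motion-compensated frame differencing described above, the Python sketch below warps the previous frame according to an assumed first-order-flow ego-motion model (v_x = D(x - x_c) - R(y - y_c), v_y = R(x - x_c) + D(y - y_c); see the Appendix) and flags pixels whose residual difference exceeds a threshold. The function name, the nearest-neighbour warp, and the threshold value are illustrative assumptions, not the authors' FPGA implementation.

    import numpy as np

    def compensate_and_diff(prev, curr, xc, yc, D, R, thresh=25):
        """Illustrative sketch: motion-compensated frame differencing.

        Warps `prev` by an assumed first-order-flow ego-motion model
        and differences it against `curr`; large residuals indicate
        candidate moving objects.
        """
        h, w = prev.shape
        ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
        # Per-pixel displacement predicted by the ego-motion model.
        vx = D * (xs - xc) - R * (ys - yc)
        vy = R * (xs - xc) + D * (ys - yc)
        # Nearest-neighbour backward warp of the previous frame.
        src_x = np.clip(np.rint(xs - vx), 0, w - 1).astype(int)
        src_y = np.clip(np.rint(ys - vy), 0, h - 1).astype(int)
        warped = prev[src_y, src_x]
        # Residual motion that the ego-motion model cannot explain.
        diff = np.abs(curr.astype(np.int16) - warped.astype(np.int16))
        return diff > thresh

Once the ego-motion field is subtracted in this way, a stationary background yields small residuals even while the robot itself moves, so simple thresholding suffices to isolate independently moving objects.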






Author information


Correspondence to Johny Paul.

Electronic supplementary material

Below are the links to the electronic supplementary material:

AVI (3172 KB)

AVI (18908 KB)

AVI (14408 KB)

AVI (4539 KB)

AVI (15308 KB)

AVI (18908 KB)

AVI (6442 KB)

Appendix: Estimation of motion model from flow vectors

The computation of the first-order-flow motion model from the optical flow vectors is explained below. The first-order-flow motion model in Eq. 1 can be inverted to obtain Eq. 5, which computes the model parameters $(x_c, y_c, D, R)$ from a pair of flow vectors whose velocities and positions are $(v_{x1}, v_{y1})$ at $(x_1, y_1)$ and $(v_{x2}, v_{y2})$ at $(x_2, y_2)$, respectively.

$$ \begin{aligned} x_c &=\frac{e_1+e_2+e_3}{v_{x1}(v_{x1}-2v_{x2})+v_{x2}^2+v_{y1}^2-v_{y2}(2v_{y1}-v_{y2})}\\ y_c &=\frac{e_4+e_5+e_6}{v_{x1}(v_{x1}-2v_{x2})+v_{x2}^2+v_{y1}^2-v_{y2}(2v_{y1}-v_{y2})}\\ D &= \frac{(x_1-x_2)(v_{x1}-v_{x2})+(y_1-y_2)(v_{y1}-v_{y2})}{x_1(x_1-2x_2)+x_2^2+y_1(y_1-2y_2)+y_2^2}\\ R &= \frac{(x_1-x_2)(v_{y1}-v_{y2})+(y_2-y_1)(v_{x1}-v_{x2})}{x_1(x_1-2x_2)+x_2^2+y_1(y_1-2y_2)+y_2^2}\\ \end{aligned} $$
(5)

where $e_1$ to $e_6$ are given by:

$$ \begin{aligned} e_1 &= v_{x1} (v_{x1}x_2 - v_{x2}x_1 - v_{x2}x_2 + v_{y2}y_1 - v_{y2}y_2)\\ e_2 &= v_{y1} (v_{y1}x_2 - v_{y2}x_1 - v_{y2}x_2 - v_{x2}y_1 + v_{x2}y_2)\\ e_3 &= x_1 (v_{y2}^2 + v_{x2}^2)\\ e_4 &= v_{x2} (v_{y1}x_1 - v_{y1}x_2 - v_{x1} y_1 + v_{x2}y_1 - v_{x1}y_2)\\ e_5 &= v_{y2} (v_{x1}x_2 - v_{x1}x_1 - v_{y1} y_1 + v_{y2}y_1 - v_{y1} y_2)\\ e_6 &= y_2 (v_{x1}^2 + v_{y1}^2)\\ \end{aligned} $$
(6)
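For reference, a direct Python transcription of Eqs. 5 and 6 is given below, together with a round-trip check against the assumed forward model $v_x = D(x - x_c) - R(y - y_c)$, $v_y = R(x - x_c) + D(y - y_c)$; Eq. 1 itself is not reproduced in this excerpt, so that form is an assumption, and the function name is illustrative.

    def motion_model_from_pair(p1, v1, p2, v2):
        """Recover (xc, yc, D, R) from two flow vectors via Eqs. 5 and 6.

        p1, p2: (x, y) positions; v1, v2: (vx, vy) flow vectors.
        """
        x1, y1 = p1
        x2, y2 = p2
        vx1, vy1 = v1
        vx2, vy2 = v2

        # Common denominators of Eq. 5; they vanish when the two flow
        # vectors (resp. the two positions) coincide.
        den_v = vx1 * (vx1 - 2 * vx2) + vx2**2 + vy1**2 - vy2 * (2 * vy1 - vy2)
        den_p = x1 * (x1 - 2 * x2) + x2**2 + y1 * (y1 - 2 * y2) + y2**2

        # Auxiliary terms e1..e6 of Eq. 6.
        e1 = vx1 * (vx1 * x2 - vx2 * x1 - vx2 * x2 + vy2 * y1 - vy2 * y2)
        e2 = vy1 * (vy1 * x2 - vy2 * x1 - vy2 * x2 - vx2 * y1 + vx2 * y2)
        e3 = x1 * (vy2**2 + vx2**2)
        e4 = vx2 * (vy1 * x1 - vy1 * x2 - vx1 * y1 + vx2 * y1 - vx1 * y2)
        e5 = vy2 * (vx1 * x2 - vx1 * x1 - vy1 * y1 + vy2 * y1 - vy1 * y2)
        e6 = y2 * (vx1**2 + vy1**2)

        xc = (e1 + e2 + e3) / den_v
        yc = (e4 + e5 + e6) / den_v
        D = ((x1 - x2) * (vx1 - vx2) + (y1 - y2) * (vy1 - vy2)) / den_p
        R = ((x1 - x2) * (vy1 - vy2) + (y2 - y1) * (vx1 - vx2)) / den_p
        return xc, yc, D, R

    # Round-trip check: synthesize flow from the assumed forward model
    # and verify that Eqs. 5 and 6 recover the parameters.
    xc, yc, D, R = 160.0, 120.0, 0.03, 0.01
    flow = lambda x, y: (D * (x - xc) - R * (y - yc), R * (x - xc) + D * (y - yc))
    p1, p2 = (10.0, 20.0), (300.0, 200.0)
    print(motion_model_from_pair(p1, flow(*p1), p2, flow(*p2)))
    # -> approximately (160.0, 120.0, 0.03, 0.01)

Note that the common denominator of $x_c$ and $y_c$ simplifies to $(v_{x1} - v_{x2})^2 + (v_{y1} - v_{y2})^2$, and that of $D$ and $R$ to $(x_1 - x_2)^2 + (y_1 - y_2)^2$, so the estimate degenerates exactly when the two flow vectors are identical or the two sample positions coincide.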


About this article

Cite this article

Paul, J., Laika, A., Claus, C. et al. Real-time motion detection based on SW/HW-codesign for walking rescue robots. J Real-Time Image Proc 8, 353–368 (2013). https://doi.org/10.1007/s11554-011-0239-0
