
A New Approach to Model-Free Tracking with 2D Lidar

Chapter in: Robotics Research

Part of the book series: Springer Tracts in Advanced Robotics (STAR, volume 114)

Abstract

This paper presents a unified and model-free framework for the detection and tracking of dynamic objects with 2D laser range finders in an autonomous driving scenario. A novel state formulation is proposed that captures joint estimates of the sensor pose, a local static background and the dynamic states of moving objects. In addition, we contribute a new hierarchical data association algorithm that associates raw laser measurements with observable states; within it, a new variant of the Joint Compatibility Branch and Bound (JCBB) algorithm is introduced for problems with large numbers of measurements. The system is calibrated systematically on 7.5K labeled object examples, evaluated on 6K test cases, and shown to greatly outperform an existing industry standard targeted at the same problem domain.
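
The authors' JCBB variant is not reproduced on this page. The sketch below illustrates only the standard joint-compatibility branch-and-bound idea the abstract refers to, under the simplifying assumption that measurement innovations are mutually independent, so the joint test reduces to a sum of individual Mahalanobis distances. The function names (`jcbb`, `mahalanobis2`) and the 2D measurement dimensionality are illustrative assumptions, not the authors' formulation.

```python
# Minimal joint-compatibility branch-and-bound (JCBB-style) sketch.
# NOT the paper's variant: innovations are assumed mutually independent
# (block-diagonal joint covariance), so joint compatibility is tested as a
# sum of individual Mahalanobis distances against a chi-square threshold.
import numpy as np
from scipy.stats import chi2

def mahalanobis2(nu, S):
    """Squared Mahalanobis distance of one innovation nu with covariance S."""
    return float(nu.T @ np.linalg.solve(S, nu))

def jcbb(innovations, alpha=0.95):
    """
    innovations[i][j]: (nu, S) for pairing measurement i with candidate j,
    or None if that pairing was gated out beforehand.
    Returns, per measurement, the chosen candidate index or None.
    """
    n_meas = len(innovations)
    best = {"pairings": [None] * n_meas, "count": 0, "d2": np.inf}

    def recurse(i, pairings, used, count, d2):
        if i == n_meas:
            # Prefer more pairings; break ties by smaller joint distance.
            if count > best["count"] or (count == best["count"] and d2 < best["d2"]):
                best["pairings"], best["count"], best["d2"] = list(pairings), count, d2
            return
        # Bound: even pairing every remaining measurement cannot beat the best.
        if count + (n_meas - i) < best["count"]:
            return
        for j, pair in enumerate(innovations[i]):
            if pair is None or j in used:
                continue
            nu, S = pair
            d2_new = d2 + mahalanobis2(nu, S)
            dof = 2 * (count + 1)  # 2D range-bearing measurements assumed
            if d2_new <= chi2.ppf(alpha, dof):  # joint test under independence
                recurse(i + 1, pairings + [j], used | {j}, count + 1, d2_new)
        # Also branch on leaving measurement i unassociated.
        recurse(i + 1, pairings + [None], used, count, d2)

    recurse(0, [], set(), 0, 0.0)
    return best["pairings"]
```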



Acknowledgments

This work is supported by the Clarendon Fund. Paul Newman is supported by an EPSRC Leadership Fellowship, EPSRC Grant EP/I005021/1. The authors wish to thank Jasper Snoek for making the Spearmint Bayesian optimisation package publicly available.

Author information


Correspondence to Dominic Zeng Wang.


Appendix

In this appendix, we state the exact forms of the observation models applied to boundary points on the static background and on dynamic objects, respectively. All variables involved in what follows are defined in Sect. 4.1, and the function \(\mathbf {u}\) maps a pair of 2D Cartesian coordinates into polar coordinates.

Each boundary point j on the static background may potentially generate a laser measurement \(\mathbf {z} = [r,\theta ]^T\), and hence its measurement model is the boundary point’s location in polar coordinates in the sensor’s frame of reference:

$$\begin{aligned} \mathbf {h}_j(\mathbf {x}) = \mathbf {u}(\mathbf {g}(\mathbf {x}_S,\mathbf {b}_j))\text{, } \;\mathbf {g}(\mathbf {x}_S,\mathbf {b}_j) = \mathbf {R}^T(\psi )\left( \left[ \begin{array}{c}x_j\\ y_j\end{array}\right] -\left[ \begin{array}{c}\alpha \\ \beta \end{array}\right] \right) \text{. } \end{aligned}$$
(9)
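
As a concrete illustration, the following sketch evaluates the static-background model of Eq. (9), assuming the sensor pose \(\mathbf {x}_S = (\alpha , \beta , \psi )\) and the boundary point \(\mathbf {b}_j = (x_j, y_j)\) are expressed in the same local frame, as in Sect. 4.1; the Python function names are illustrative, not from the paper.

```python
# Illustrative evaluation of the static-background observation model (Eq. 9).
# Assumes x_S = (alpha, beta, psi) and b_j = (x_j, y_j) in the same local frame.
import numpy as np

def rotation(psi):
    """2D rotation matrix R(psi)."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s], [s, c]])

def to_polar(p):
    """u(.): map 2D Cartesian coordinates to polar (range, bearing)."""
    return np.array([np.hypot(p[0], p[1]), np.arctan2(p[1], p[0])])

def h_static(x_S, b_j):
    """Predicted measurement [r, theta] of static boundary point b_j (Eq. 9)."""
    alpha, beta, psi = x_S
    g = rotation(psi).T @ (np.asarray(b_j) - np.array([alpha, beta]))
    return to_polar(g)
```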

Each boundary point j on any dynamic track i may also give rise to a laser measurement; the measurement model in this case is the boundary point's 2D polar coordinates in the sensor frame:

$$\begin{aligned} \mathbf {h}_j(\mathbf {x}) = \mathbf {u}(\mathbf {g}(\mathbf {x}_S,\mathbf {x}_T^i,\mathbf {p}_j^i))\text{, } \;\mathbf {g}(\mathbf {x}_S,\mathbf {x}_T^i,\mathbf {p}_j^i) = \mathbf {R}^T(\psi )\left( \mathbf {R}(\phi _i)\left[ \begin{array}{c}x_j^i\\ y_j^i\end{array}\right] +\left[ \begin{array}{c}\gamma _i\\ \delta _i\end{array}\right] -\left[ \begin{array}{c}\alpha \\ \beta \end{array}\right] \right) \text{. } \end{aligned}$$
(10)
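
A corresponding sketch for the dynamic-object model of Eq. (10) reuses the helpers above; the track pose \(\mathbf {x}_T^i = (\gamma _i, \delta _i, \phi _i)\) and the object-frame boundary point \(\mathbf {p}_j^i = (x_j^i, y_j^i)\) follow the appendix notation, and the function name is an assumption.

```python
# Illustrative evaluation of the dynamic-object observation model (Eq. 10),
# reusing rotation() and to_polar() from the sketch above.
def h_dynamic(x_S, x_T_i, p_j_i):
    """Predicted measurement [r, theta] of boundary point p_j_i on track i (Eq. 10)."""
    alpha, beta, psi = x_S
    gamma_i, delta_i, phi_i = x_T_i
    # Transform the point from the object frame into the local frame...
    world = rotation(phi_i) @ np.asarray(p_j_i) + np.array([gamma_i, delta_i])
    # ...then into the sensor frame, and convert to polar coordinates.
    g = rotation(psi).T @ (world - np.array([alpha, beta]))
    return to_polar(g)
```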


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Wang, D.Z., Posner, I., Newman, P. (2016). A New Approach to Model-Free Tracking with 2D Lidar. In: Inaba, M., Corke, P. (eds) Robotics Research. Springer Tracts in Advanced Robotics, vol 114. Springer, Cham. https://doi.org/10.1007/978-3-319-28872-7_32


  • DOI: https://doi.org/10.1007/978-3-319-28872-7_32


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-28870-3

  • Online ISBN: 978-3-319-28872-7

