A Resource Allocation Framework for Adaptive Selection of Point Matching Strategies

  • Quentin De Neyer
  • Christophe De Vleeschouwer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8192)

Abstract

This paper investigates how to track an object based on the matching of points between pairs of consecutive video frames. The approach is especially relevant to object tracking in close-view video shots, as encountered for example in the Pan-Tilt-Zoom (PTZ) camera autotracking problem. In contrast to many earlier related works, we consider that the matching metric of a point should be adapted to the signal observed in its spatial neighborhood, and introduce a cost-benefit framework to control this adaptation with respect to the global objective of estimating the target displacement. Hence, the proposed framework explicitly handles the trade-off between the complexity of the point-level matching metric and the contribution this metric brings to solving the target tracking problem. As a consequence, and in contrast with the common assumption that only specific points of interest should be investigated, our framework makes no a priori assumption about the points that the tracking process should consider or ignore. Instead, it states that any point may help in estimating the target displacement, provided that its matching metric is well adapted. Measuring the reliability of a point's contribution as the probability that it leads to a correct matching decision, we define a global criterion for successful target matching. It is then possible to minimize the probability of incorrect matching over the set of possible (point, metric) combinations and to find the optimal aggregation strategy. Our preliminary results demonstrate both the effectiveness and the efficiency of the approach.
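The allocation problem sketched in the abstract can be illustrated as a multiple-choice knapsack: each candidate point offers several matching metrics of increasing computational cost, each with an estimated probability of producing a correct match, and a budget limits the total cost. The sketch below is an assumption-laden simplification, not the authors' formulation — the function name `allocate_metrics`, the integer-cost model, and the objective (maximizing the expected number of correct point matches rather than directly minimizing the probability of incorrect global matching) are all hypothetical choices made for illustration.

```python
def allocate_metrics(points, budget):
    """Hypothetical cost-benefit allocation (illustrative only).

    points[i] is a list of (cost, reliability) metric options for point i,
    where reliability is the estimated probability of a correct match.
    Choose at most one option per point, with total cost <= budget, so as
    to maximize the expected number of correctly matched points.
    Returns (value, picks) where picks[i] is the chosen option index for
    point i, or None if the point is ignored by the tracker.
    """
    n = len(points)
    # dp[i][b]: best expected number of correct matches using the first
    # i points under budget b (classic multiple-choice knapsack DP).
    dp = [[0.0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best = dp[i - 1][b]  # option: skip point i-1 entirely
            for cost, rel in points[i - 1]:
                if cost <= b:
                    best = max(best, dp[i - 1][b - cost] + rel)
            dp[i][b] = best
    # Backtrace which (point, metric) combination achieved the optimum.
    picks, b = [None] * n, budget
    for i in range(n, 0, -1):
        if dp[i][b] == dp[i - 1][b]:
            continue  # point i-1 was skipped
        for j, (cost, rel) in enumerate(points[i - 1]):
            if cost <= b and abs(dp[i - 1][b - cost] + rel - dp[i][b]) < 1e-12:
                picks[i - 1] = j
                b -= cost
                break
    return dp[n][budget], picks


# Three points; the first offers a cheap weak metric and a costly strong one.
points = [[(1, 0.6), (3, 0.9)], [(2, 0.8)], [(1, 0.5)]]
value, picks = allocate_metrics(points, budget=4)
# With budget 4 it is better to match all three points cheaply (0.6+0.8+0.5)
# than to spend 3 units on the strong metric for the first point alone.
```

The paper cites Martello and Toth's knapsack monograph, which is consistent with this kind of formulation, but the exact reliability model and aggregation criterion are defined in the full text, not here.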

Keywords

active tracking · point matching · cost-benefit optimization



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Quentin De Neyer¹
  • Christophe De Vleeschouwer¹
  1. Université Catholique de Louvain, Louvain-La-Neuve, Belgium
