Abstract
Within a human motion analysis system, body parts are modeled as simple virtual 3D rigid objects. Each object's position and orientation parameters at frame t + 1 are estimated from the parameters at frame t and the image intensity variation between the two frames, under kinematic constraints. A genetic algorithm searches for the 3D parameters that minimize a goal function measuring the intensity change. The goal function is robust, so that outliers, located mostly near the borders of the virtual object's projection, have little effect on the estimate. Because the object parameters are expressed in a common reference system, they are identical across cameras; additional cameras are therefore easily added, increasing the number of constraints on the same number of variables. Several successful experiments are presented for an arm motion and a leg motion captured by two and three cameras.
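The core idea of the abstract can be sketched in a few lines: a genetic algorithm searches a 6-D pose space (three translations, three rotations) for the parameters that minimize a robust cost. The sketch below is illustrative only and assumes a toy residual function in place of the paper's image-intensity goal function; capping each squared residual stands in for the robust estimator that limits the influence of outlier pixels near projection borders.

```python
import random

def robust_cost(pose, residual_fn, cap=1.0):
    """Sum of capped squared residuals: each outlier contributes at most `cap`."""
    return sum(min(r * r, cap) for r in residual_fn(pose))

def genetic_minimize(residual_fn, dim=6, pop_size=40, generations=100,
                     bounds=(-1.0, 1.0), mutation_scale=0.1, seed=0):
    """Minimal GA: elitist selection, averaging crossover, single-gene mutation."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: robust_cost(p, residual_fn))
        elite = pop[: pop_size // 2]          # selection: keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # crossover: average genes
            k = rng.randrange(dim)                        # mutation: perturb one gene
            child[k] += rng.gauss(0.0, mutation_scale)
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda p: robust_cost(p, residual_fn))

# Toy stand-in for the intensity-change residuals: the true pose is 0.3 in
# every dimension, plus one constant "outlier" term that the cap neutralises.
target = [0.3] * 6
def residuals(pose):
    rs = [p - t for p, t in zip(pose, target)]
    rs.append(100.0)  # outlier; capped, so it cannot bias the fit
    return rs

best = genetic_minimize(residuals)
```

Because the elite half survives each generation unchanged, the best cost is monotonically non-increasing, mirroring the paper's use of the GA as a pure minimizer of the goal function.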
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Rocha, J., Mir, A. (2001). Articulated Object Tracking via a Genetic Algorithm. In: Figueiredo, M., Zerubia, J., Jain, A.K. (eds) Energy Minimization Methods in Computer Vision and Pattern Recognition. EMMCVPR 2001. Lecture Notes in Computer Science, vol 2134. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44745-8_10
Print ISBN: 978-3-540-42523-6
Online ISBN: 978-3-540-44745-0