Monocular 3D Tracking of Articulated Human Motion in Silhouette and Pose Manifolds

  • Feng Guo
  • Gang Qian
Open Access
Research Article
Part of the following topical collections:
  1. Anthropocentric Video Analysis: Tools and Applications


This paper presents a robust computational framework for monocular 3D tracking of human movement. The main innovation of the proposed framework is to exploit the underlying data structures of the body silhouette and pose spaces by constructing low-dimensional silhouette and pose manifolds, establishing intermanifold mappings, and performing tracking in these manifolds using a particle filter. In addition, a novel vectorized silhouette descriptor is introduced to achieve a low-dimensional, noise-resilient silhouette representation. The proposed articulated motion tracker is view-independent, self-initializing, and capable of maintaining multiple kinematic trajectories. By using the learned mapping from the silhouette manifold to the pose manifold, particle sampling is informed by the current image observation, resulting in improved sample efficiency. Decent tracking results have been obtained using synthetic and real videos.
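The observation-informed sampling idea above can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: both manifolds are faked as 2D curves parameterised by a 1D "phase", the learned silhouette-to-pose mapping is replaced by a nearest-neighbour lookup over hypothetical training pairs, and the particle likelihood scores each pose particle by how well its corresponding silhouette matches the observation. All names and parameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in manifolds: in the paper both are learned from data.
# Here a 1D "phase" parameterises 2D silhouette- and pose-manifold coordinates.
phase = np.linspace(0.0, 2.0 * np.pi, 200)
sil_train = np.stack([np.cos(phase), np.sin(phase)], axis=1)               # silhouette manifold
pose_train = np.stack([1.5 * np.cos(phase), 0.5 * np.sin(phase)], axis=1)  # pose manifold

def sil_to_pose(s):
    """Nearest-neighbour stand-in for the learned silhouette-to-pose mapping."""
    return pose_train[np.argmin(np.linalg.norm(sil_train - s, axis=1))]

def likelihood(particles, s_obs, obs_std=0.1):
    """Score each pose particle by how well its associated silhouette matches s_obs."""
    idx = np.argmin(
        np.linalg.norm(pose_train[None, :, :] - particles[:, None, :], axis=2), axis=1
    )
    d = np.linalg.norm(sil_train[idx] - s_obs, axis=1)
    return np.exp(-0.5 * (d / obs_std) ** 2)

def track(sil_obs_seq, n_particles=200, proposal_std=0.05):
    """Per frame: sample particles around the mapped observation, weight, average."""
    estimates = []
    for s_obs in sil_obs_seq:
        centre = sil_to_pose(s_obs)  # observation-informed proposal centre
        particles = centre + proposal_std * rng.standard_normal((n_particles, 2))
        w = likelihood(particles, s_obs)
        w /= w.sum()
        estimates.append(w @ particles)  # weighted-mean pose estimate
    return np.array(estimates)

obs_seq = sil_train[::20]  # pretend these are observed silhouette coordinates
est = track(obs_seq)
err = float(np.max(np.linalg.norm(est - pose_train[::20], axis=1)))
```

Because the proposal is centred on the pose predicted from the current silhouette, particles concentrate near plausible poses instead of being scattered by a blind dynamics model, which is the sample-efficiency gain the abstract refers to.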




Copyright information

© F. Guo and G. Qian. 2008

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Authors and Affiliations

  1. Department of Electrical Engineering, Arizona State University, Tempe, USA
  2. Arts, Media and Engineering Program, Department of Electrical Engineering, Arizona State University, Tempe, USA
