About this book
The accurate estimation of three-dimensional motion vector fields in real time remains one of the key goals of computer vision.
This important text/reference presents methods for estimating optical flow and scene flow motion with high accuracy, focusing on the practical application of these methods in camera-based driver assistance systems. Clearly and logically structured, the book builds from basic themes to more advanced concepts, covering topics from variational methods and optical flow estimation, to adaptive regularization and scene flow analysis. This in-depth discussion culminates in the development of a novel, accurate and robust scene flow method for the higher-level challenges posed by real-world applications.
Topics and features:
- Reviews the major advances in motion estimation and motion analysis, and the latest progress in dense optical flow algorithms
- Investigates the use of residual images for optical flow
- Examines methods for deriving motion from stereo image sequences
- Analyses the error characteristics for motion variables, and derives scene flow metrics for movement likelihood and velocity
- Introduces a framework for scene flow-based moving object detection and segmentation, and discusses the application of Kalman filters for propagating scene flow estimation over time
- Includes pseudocode for all important computational challenges
- Contains Appendices on data terms and quadratic optimization, and scene flow implementation using Euler-Lagrange equations, in addition to a helpful Glossary and Index
A valuable reference for researchers and graduate students working on segmentation, optical flow and scene flow, this unique book will also be of great interest to professionals involved in the development of driver assistance systems.
- DOI https://doi.org/10.1007/978-0-85729-965-9
- Copyright Information Springer-Verlag London Limited 2011
- Publisher Name Springer, London
- eBook Packages Computer Science
- Print ISBN 978-0-85729-964-2
- Online ISBN 978-0-85729-965-9