Abstract
Model selection criteria are designed to select an appropriate hypothesis for the phenomenon in question. In traditional statistics, a hypothesis has a parametric expression that combines a deterministic input-output relationship with random fluctuations. However, most of the problems encountered in computer vision do not fit this framework. In this chapter, we illustrate the difference by taking line fitting as a typical example. First, we discuss the classical regression problem and show how the Akaike Information Criterion (AIC) can be used for model selection. Then we turn to the geometric fitting problem, stated in the form that typically appears in computer vision applications. Since the two problems are different, we must modify the AIC; we call the resulting criterion the "geometric AIC." We generalize this idea to an abstract framework and compare it with other criteria such as cross-validation, the jackknife, the bootstrap, Mallows' C_P, the Bayesian Information Criterion (BIC), and Minimum Description Length (MDL). We conclude by discussing some of the fundamental issues that lie behind all these criteria.
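The abstract mentions using the AIC for model selection in the classical regression setting. As a minimal illustrative sketch (not the chapter's own derivation), the Gaussian-noise form of the AIC for a least-squares fit reduces to n·log(RSS/n) + 2k, where RSS is the residual sum of squares and k the number of fitted parameters; the function name and data below are my own assumptions for illustration:

```python
import numpy as np

def aic_for_degree(x, y, degree):
    """AIC for a polynomial least-squares fit under Gaussian noise:
    n * log(RSS / n) + 2 * k, with k = degree + 1 parameters."""
    n = len(x)
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    rss = float(residuals @ residuals)
    k = degree + 1
    return n * np.log(rss / n) + 2 * k

# Noisy samples from a straight line: the AIC should strongly prefer
# a degree-1 fit over a degree-0 (constant) fit.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, size=50)
best_degree = min(range(0, 6), key=lambda d: aic_for_degree(x, y, d))
```

The 2k penalty term is what discourages needless extra parameters; the geometric AIC discussed in the chapter modifies this penalty for the geometric fitting setting, where the model is a constraint on the data rather than an input-output map.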
Copyright information
© 2000 Springer-Verlag New York, Inc.
Cite this chapter
Kanatani, K. (2000). Model Selection Criteria for Geometric Inference. In: Bab-Hadiashar, A., Suter, D. (eds) Data Segmentation and Model Selection for Computer Vision. Springer, New York, NY. https://doi.org/10.1007/978-0-387-21528-0_4
DOI: https://doi.org/10.1007/978-0-387-21528-0_4
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4684-9508-9
Online ISBN: 978-0-387-21528-0
eBook Packages: Springer Book Archive