Robust Parameter Estimation

Part of the Advances in Computer Vision and Pattern Recognition book series (ACVPR)


The problem of robust parameter estimation in image registration is discussed, and various robust methods for estimating registration parameters in the presence of outliers and inaccurate correspondences are reviewed and compared. After a review of ordinary least-squares and weighted least-squares estimation, robust estimators such as maximum-likelihood-type (M), repeated median (RM), scale (S), least median of squares (LMS), least trimmed squares (LTS), and rank (R) estimators are described and compared.
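To illustrate the contrast between ordinary least squares and a high-breakdown robust estimator, the following is a minimal sketch (not the chapter's own code) of least-median-of-squares line fitting by random pair sampling, a simplified form of the PROGRESS-style search. The data, the trial count, and all function names are illustrative assumptions; with exact inliers and a few gross outliers, LMS recovers the true line while OLS is pulled away.

```python
import random
from statistics import median

def ols_fit(pts):
    # Ordinary least squares for y = a*x + b via the closed-form
    # normal equations; every residual contributes, so one gross
    # outlier can shift the fit arbitrarily (breakdown point 0%).
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def lms_fit(pts, trials=500, seed=0):
    # Least median of squares: draw candidate lines through random
    # point pairs and keep the one minimizing the MEDIAN squared
    # residual. The median ignores up to ~50% gross outliers, giving
    # LMS its high breakdown point.
    rng = random.Random(seed)
    best, best_med = None, float("inf")
    for _ in range(trials):
        (x1, y1), (x2, y2) = rng.sample(pts, 2)
        if x1 == x2:
            continue  # vertical pair, no finite slope
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        med = median((y - (a * x + b)) ** 2 for x, y in pts)
        if med < best_med:
            best, best_med = (a, b), med
    return best

# Inliers on y = 2x + 1 plus three gross outliers (~13% contamination).
pts = [(float(x), 2.0 * x + 1.0) for x in range(20)]
pts += [(5.0, 60.0), (10.0, -40.0), (15.0, 90.0)]

print("OLS:", ols_fit(pts))  # noticeably pulled off (2, 1)
print("LMS:", lms_fit(pts))  # recovers (2.0, 1.0)
```

Exhaustive pair enumeration replaces the random search for small point sets; for registration problems with many correspondences, the random-sampling version shown here is the practical choice.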





Copyright information

© Springer-Verlag London Limited 2012

Authors and Affiliations

  1. Dept. of Computer Science and Engineering, 303 Russ Engineering Center, Wright State University, Dayton, USA
