Abstract
Proving convergence of the various optimization algorithms is a delicate exercise. In general, it is helpful to consider local and global convergence patterns separately. The local convergence rate of an algorithm provides a useful benchmark for comparing it to other algorithms. On this basis, Newton’s method wins hands down. However, the tradeoffs are subtle. Besides the sheer number of iterations until convergence, the computational complexity and numerical stability of an algorithm are critically important. The MM algorithm is often the epitome of numerical stability and computational simplicity. Scoring lies somewhere between these two extremes. It tends to converge more quickly than the MM algorithm and to behave more stably than Newton’s method. Quasi-Newton methods also occupy this intermediate zone. Because the issues are complex, all of these algorithms survive and prosper in certain computational niches.
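The contrast in local convergence rates described above can be illustrated with a minimal sketch (not taken from the chapter). Here Newton's method is applied to the fixed-point equation cos(x) = x, while plain fixed-point iteration stands in for the linear convergence typical of MM-type algorithms; the function and starting point are chosen purely for illustration.

```python
import math

# Find the fixed point of cos(x) = x (the Dottie number).
# Newton's method on g(x) = cos(x) - x converges quadratically near
# the solution; simple fixed-point iteration x <- cos(x) stands in
# for the linear convergence typical of MM/EM-style algorithms.

def newton(x, iters=6):
    """Newton iterates for g(x) = cos(x) - x, g'(x) = -sin(x) - 1."""
    iterates = []
    for _ in range(iters):
        x = x - (math.cos(x) - x) / (-math.sin(x) - 1.0)
        iterates.append(x)
    return iterates

def fixed_point(x, iters=6):
    """Fixed-point iterates x <- cos(x)."""
    iterates = []
    for _ in range(iters):
        x = math.cos(x)
        iterates.append(x)
    return iterates

star = 0.7390851332151607  # solution of cos(x) = x

newton_errs = [abs(x - star) for x in newton(0.5)]
fp_errs = [abs(x - star) for x in fixed_point(0.5)]
# Newton's errors collapse to machine precision within a handful of
# steps, while the fixed-point errors shrink only by a factor of
# roughly |sin(star)| ~ 0.67 per iteration.
```

The error sequences make the tradeoff concrete: Newton's method needs far fewer iterations, but each step requires a derivative (a full Hessian in higher dimensions), whereas the fixed-point update is cheap and unconditionally simple, mirroring the stability-versus-speed tension the abstract describes.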
© 2010 Springer New York
Lange, K. (2010). Local and Global Convergence. In: Numerical Analysis for Statisticians. Statistics and Computing. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-5945-4_15
Print ISBN: 978-1-4419-5944-7
Online ISBN: 978-1-4419-5945-4