The MM and EM algorithms are hardly the only methods of optimization. Newton’s method is better known and more widely applied. Despite its defects, Newton’s method is the gold standard for speed of convergence and forms the basis of most modern optimization algorithms. Its many variants seek to retain its fast convergence while taming its defects. They all revolve around the core idea of locally approximating the objective function by a strictly convex quadratic function. At each iteration the quadratic approximation is optimized. Safeguards are introduced to keep the iterates from veering toward irrelevant stationary points.
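The core idea described above can be sketched in a few lines. The routine below is a minimal illustration, not code from the chapter: at each iteration it builds a strictly convex quadratic model by shifting the Hessian toward positive definiteness, minimizes that model to get the Newton step, and applies step-halving as a safeguard against veering toward irrelevant stationary points. All names and tolerances here are illustrative choices.

```python
import numpy as np

def safeguarded_newton(f, grad, hess, x0, tol=1e-8, max_iter=200):
    """Newton's method with two common safeguards (illustrative sketch):
    (1) add a multiple of the identity to the Hessian until it is positive
        definite, so the local quadratic model is strictly convex and the
        Newton step is a descent direction;
    (2) halve the step until the objective actually decreases, which keeps
        the iterates from drifting toward irrelevant stationary points.
    """
    x = np.asarray(x0, dtype=float)
    n = len(x)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # gradient small: stop
            break
        H = hess(x)
        # Safeguard 1: shift H until a Cholesky factorization succeeds,
        # certifying positive definiteness of the quadratic model.
        tau = 0.0
        while True:
            try:
                np.linalg.cholesky(H + tau * np.eye(n))
                break
            except np.linalg.LinAlgError:
                tau = max(2.0 * tau, 1e-6)
        # Minimize the quadratic model: solve (H + tau I) d = -g.
        d = -np.linalg.solve(H + tau * np.eye(n), g)
        # Safeguard 2: step-halving until the objective decreases.
        t = 1.0
        while f(x + t * d) > f(x) and t > 1e-12:
            t /= 2.0
        x = x + t * d
    return x

# Usage on the Rosenbrock function, whose minimum sits at (1, 1).
def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def hess(x):
    return np.array([[2 + 1200 * x[0]**2 - 400 * x[1], -400 * x[0]],
                     [-400 * x[0], 200.0]])

x_star = safeguarded_newton(f, grad, hess, np.array([-1.2, 1.0]))
```

Without the safeguards, pure Newton iteration on a nonconvex function like this one can stall at saddle points or diverge; with them, the fast local (quadratic) rate of convergence is retained near the minimum.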
Keywords: Positive semidefinite · Exponential family · Positive definite matrix · Descent direction · Quadratic rate