Abstract
Most practical optimization problems arise with constraints on the solutions. Nevertheless, unconstrained optimization techniques serve as a major tool in finding solutions for both unconstrained and constrained optimization problems. In this chapter we present techniques for solving the unconstrained optimization problem. In Chap. 5 we will see how the unconstrained solution methods aid in finding the solutions to constrained optimization problems.
To solve a given unconstrained optimization problem, a general procedure starts from an initial estimate (guess) of the solution and iteratively updates it so as to decrease the objective function. If we regard the solution of an N-dimensional objective function as an N-vector, each iterative update can be performed by adding a vector to the current solution vector. We call the vector added to the current solution vector the step vector, or simply the step.
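This generic procedure can be sketched in a few lines of Python. The sketch below is a hypothetical illustration (the chapter gives no code): `step_fn` stands in for whichever rule a particular method uses to produce the step vector, and the names and tolerances are assumptions.

```python
import numpy as np

def iterative_minimize(step_fn, x0, max_iter=1000, tol=1e-8):
    """Generic iterative scheme: x_{k+1} = x_k + d_k, where d_k = step_fn(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = step_fn(x)                  # step vector chosen by the method
        x = x + d
        if np.linalg.norm(d) < tol:     # stop when steps become negligible
            break
    return x
```

Every method discussed below fits this template; they differ only in how `step_fn` computes the step.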
Various unconstrained optimization methods can be classified by how the step vectors are determined. The first group consists of direct search methods, which are the simplest: the step vectors are determined using only evaluations of the objective function, for example by random search, without derivative information. The second group is known as derivative-based methods, in which the step vectors are determined from derivatives of the objective function. Gradient methods and Newton's method fall into this category.
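The two derivative-based steps can be made concrete on a hypothetical quadratic test objective (the matrix `A` and vector `b` below are assumptions for illustration). A gradient step moves a short distance against the gradient; a Newton step also uses second-derivative (Hessian) information, and for a quadratic it reaches the minimizer in one step:

```python
import numpy as np

# Hypothetical quadratic test objective f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b                    # gradient of the quadratic

def gradient_step(x, alpha=0.1):
    return -alpha * grad(x)             # steepest-descent step with fixed step size

def newton_step(x):
    # Newton step solves H d = -grad(x); the Hessian H equals A for a quadratic
    return -np.linalg.solve(A, grad(x))

x0 = np.zeros(2)
x_newton = x0 + newton_step(x0)         # one Newton step minimizes a quadratic exactly
```

The trade-off previewed here recurs throughout the chapter: Newton's method converges in far fewer iterations, but each iteration requires forming and solving a system with the Hessian.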
In practice, more sophisticated methods, such as conjugate gradient and quasi-Newton methods, achieve performance comparable to Newton's method with greatly reduced computational cost and memory requirements. A brief discussion of the conjugate gradient algorithm is also included in this chapter.
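To show where the savings come from, here is a minimal sketch of the linear conjugate gradient algorithm for a quadratic objective with a symmetric positive definite matrix; the function name and stopping tolerance are assumptions, and this is a didactic sketch rather than the chapter's own formulation:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Minimize 0.5 x^T A x - b^T x for symmetric positive definite A.

    Equivalent to solving A x = b; in exact arithmetic it converges in at
    most n steps while storing only a handful of vectors (no Hessian factorization).
    """
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                        # residual, i.e., the negative gradient
    d = r.copy()                         # first search direction
    rs = r @ r
    for _ in range(len(b)):
        Ad = A @ d
        alpha = rs / (d @ Ad)            # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d        # next direction, A-conjugate to the previous ones
        rs = rs_new
    return x
```

Unlike Newton's method, each iteration needs only matrix-vector products and vector updates, which is what makes the approach attractive for large problems.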
Notes
1. A function is called unimodal if it has a single minimum point. A function is called multimodal if it has multiple local minima.
2. The function is said to have a well-shaped minimum if it has a sharp, narrow valley in the neighborhood of the minimum point.
© 2016 Springer International Publishing Switzerland
Abidi, M.A., Gribok, A.V., Paik, J. (2016). Unconstrained Optimization. In: Optimization Techniques in Computer Vision. Advances in Computer Vision and Pattern Recognition. Springer, Cham. https://doi.org/10.1007/978-3-319-46364-3_4
Print ISBN: 978-3-319-46363-6
Online ISBN: 978-3-319-46364-3