About this book
This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space, where the algorithms are examined while taking computational errors into account. The author shows that these algorithms generate a good approximate solution provided the computational errors are bounded from above by a small positive constant. The case of known computational errors is also examined, with the aim of determining how good an approximate solution can be obtained. Researchers and students interested in optimization theory and its applications will find this book instructive and informative.
This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
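The book's central theme can be illustrated with a toy experiment (not taken from the book): run gradient descent on a simple convex function while injecting a bounded error into each gradient evaluation. The function, step size, and error model below are all illustrative assumptions; the point is only that when the per-step error is bounded by a small constant delta, the iterates still settle into a small neighborhood of the true minimizer.

```python
import random

def noisy_gradient_descent(x0, step, delta, iters, seed=0):
    """Minimize f(x) = x**2 by gradient descent, with each gradient
    evaluation corrupted by an error bounded in magnitude by delta."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        grad = 2 * x                        # exact gradient of x**2
        error = rng.uniform(-delta, delta)  # bounded computational error
        x = x - step * (grad + error)
    return x

# With no error the iterates converge to the minimizer x* = 0;
# with a small bounded error they reach a neighborhood of x*
# whose radius scales with delta.
exact = noisy_gradient_descent(5.0, step=0.1, delta=0.0, iters=200)
inexact = noisy_gradient_descent(5.0, step=0.1, delta=0.01, iters=200)
print(abs(exact), abs(inexact))
```

Here `abs(exact)` is essentially zero, while `abs(inexact)` remains small, on the order of delta, which is the qualitative behavior the book establishes rigorously for a range of algorithms.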
- DOI https://doi.org/10.1007/978-3-319-30921-7
- Copyright Information Springer International Publishing Switzerland 2016
- Publisher Name Springer, Cham
- eBook Packages Mathematics and Statistics
- Print ISBN 978-3-319-30920-0
- Online ISBN 978-3-319-30921-7
- Series Print ISSN 1931-6828
- Series Online ISSN 1931-6836