
© 2018

Lectures on Convex Optimization

  • Presents a self-contained description of fast gradient methods

  • Offers the first description in the monographic literature of the modern second-order methods based on cubic regularization

  • Provides a comprehensive treatment of the smoothing technique

  • Develops a new theory of optimization in relative scale

Textbook

Part of the Springer Optimization and Its Applications book series (SOIA, volume 137)

Table of contents

  1. Front Matter (pages i-xxiii)
  2. Black-Box Optimization
    1. Front Matter (page 1)
    2. Yurii Nesterov (pages 3-58)
    3. Yurii Nesterov (pages 59-137)
    4. Yurii Nesterov (pages 139-240)
    5. Yurii Nesterov (pages 241-322)
  3. Structural Optimization
    1. Front Matter (page 323)
    2. Yurii Nesterov (pages 325-421)
    3. Yurii Nesterov (pages 423-487)
    4. Yurii Nesterov (pages 489-570)
  4. Back Matter (pages 571-589)

About this book

Introduction

This book provides a comprehensive, modern introduction to convex optimization, a field that is becoming increasingly important in applied mathematics, economics and finance, engineering, and computer science, notably in data science and machine learning.

Written by a leading expert in the field, this book includes recent advances in the algorithmic theory of convex optimization, naturally complementing the existing literature. It contains a unified and rigorous presentation of the acceleration techniques for first- and second-order minimization schemes. It provides readers with a full treatment of the smoothing technique, which has greatly extended the capabilities of gradient-type methods. Several powerful approaches in structural optimization, including optimization in relative scale and polynomial-time interior-point methods, are also discussed in detail.
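To give a flavor of the acceleration techniques the book covers, here is a minimal sketch of a fast gradient method with the classical momentum schedule, applied to a smooth convex quadratic. The test problem, step size, and iteration count below are illustrative choices, not taken from the book's text.

```python
import numpy as np

def accelerated_gradient(grad, L, x0, iters=500):
    """Fast gradient method: gradient steps from an extrapolated point y."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                         # gradient step with step size 1/L
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2        # classical t_k schedule
        y = x_next + ((t - 1) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Example: minimize f(x) = 0.5 * x^T A x - b^T x for a positive definite A.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)                   # positive definite test matrix
b = rng.standard_normal(5)
L_const = np.linalg.eigvalsh(A).max()     # Lipschitz constant of the gradient

x_star = np.linalg.solve(A, b)            # exact minimizer, for comparison
x_hat = accelerated_gradient(lambda x: A @ x - b, L_const, np.zeros(5))
print(np.linalg.norm(x_hat - x_star))     # distance to the minimizer; small after 500 steps
```

The extrapolation step is what distinguishes this scheme from plain gradient descent: it yields the O(1/k²) worst-case rate on smooth convex functions that the book establishes and refines.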

Researchers in theoretical optimization as well as professionals working on optimization problems will find this book very useful. It presents many successful examples of how to develop very fast specialized minimization algorithms. Based on the author’s lectures, it can naturally serve as the basis for introductory and advanced courses in convex optimization for students in engineering, economics, computer science and mathematics.

Keywords

complexity, complexity theory, graphs, mathematical programming, optimization, Fast Gradient Methods, Self-Concordant Functions, Interior-Point Methods, Smoothing Technique, Cubic Regularization of Newton Method, Optimization in Relative Scale

MSC 2010: 49M15, 49M29, 49N15, 65K05, 65K10, 90C25, 90C30, 90C46, 90C51, 90C52, 90C60

Authors and affiliations

  1. CORE/INMA, Catholic University of Louvain, Louvain-la-Neuve, Belgium

About the authors

Yurii Nesterov is a well-known specialist in optimization. He is the author of pioneering works on fast gradient methods, polynomial-time interior-point methods, the smoothing technique, regularized Newton methods, and other topics. He has won several prestigious international prizes, including the George B. Dantzig Prize (2000), the John von Neumann Theory Prize (2009), the SIAM Outstanding Paper Award (2014), and the EURO Gold Medal (2016).


Reviews

“It is a must-read for both students involved in the operations research programs, as well as the researchers in the area of nonlinear programming, in particular in convex optimization.” (Marcin Anholcer, zbMATH 1427.90003, 2020)