© 2012

Stochastic Averaging and Stochastic Extremum Seeking

  • For engineers, the text develops stochastic versions of the increasingly popular deterministic extremum-seeking algorithms

  • Demonstrates to the mathematician how stochastic averaging theory can be used as a tool for studying stability rather than just approximation

  • Stochastic algorithms are intuitive and connect with the huge field of stochastic optimization

  • Shows how control ideas derived from the study of a biological system can be generalized to other, widely different fields of application


Part of the Communications and Control Engineering book series (CCE)

Table of contents

  1. Front Matter
    Pages I-XI
  2. Shu-Jun Liu, Miroslav Krstic
    Pages 1-10
  3. Shu-Jun Liu, Miroslav Krstic
    Pages 11-20
  4. Shu-Jun Liu, Miroslav Krstic
    Pages 21-55
  5. Shu-Jun Liu, Miroslav Krstic
    Pages 57-78
  6. Shu-Jun Liu, Miroslav Krstic
    Pages 79-93
  7. Shu-Jun Liu, Miroslav Krstic
    Pages 95-119
  8. Shu-Jun Liu, Miroslav Krstic
    Pages 121-128
  9. Shu-Jun Liu, Miroslav Krstic
    Pages 129-146
  10. Shu-Jun Liu, Miroslav Krstic
    Pages 181-199
  11. Back Matter
    Pages 201-224

About this book

Introduction

Stochastic Averaging and Stochastic Extremum Seeking develops methods of mathematical analysis inspired by the interest in reverse engineering and analyzing bacterial convergence to nutrient sources by chemotaxis, and in applying similar stochastic optimization techniques in other environments.

The first half of the text presents significant advances in stochastic averaging theory, necessitated by the fact that existing theorems are restricted to systems with linear growth, globally exponentially stable average models, and vanishing stochastic perturbations, and preclude analysis over an infinite time horizon.
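To fix ideas, a generic setup of the kind treated by stochastic averaging can be written as follows; the notation here is illustrative only and is not taken from the book:

    \frac{dX_t}{dt} = \varepsilon\, f(X_t, \xi_t), \qquad
    \frac{d\bar{X}_t}{dt} = \varepsilon\, \bar{f}(\bar{X}_t), \qquad
    \bar{f}(x) = \lim_{T \to \infty} \frac{1}{T} \int_0^T \mathbb{E}\!\left[f(x, \xi_t)\right] dt,

where ξ_t is an ergodic stochastic perturbation and ε > 0 is small. Roughly speaking, classical theorems guarantee that X_t stays close to the average system X̄_t over a finite time horizon; removing the restrictions above is what allows averaging to be used as a tool for concluding stability over an infinite horizon.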

The second half of the text introduces stochastic extremum seeking algorithms for model-free optimization of systems in real time using stochastic perturbations for estimation of their gradients. Both gradient- and Newton-based algorithms are presented, offering the user the choice between the simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton).
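As a rough illustration of the gradient-based scheme, the sketch below perturbs a scalar parameter with a bounded stochastic dither, demodulates the measured output, and integrates the result. The Ornstein-Uhlenbeck dither, the gains, and the Euler discretization are assumptions made for this example; it is a minimal caricature, not the book's exact algorithm.

    import numpy as np

    def stochastic_es(f, theta0, steps=20000, dt=1e-3, a=0.1, k=5.0, q=5.0, seed=0):
        """Gradient-based stochastic extremum seeking (illustrative sketch only).
        f: unknown map, available only through measurements of its value
        a: dither amplitude, k: adaptation gain, q: dither bandwidth
        """
        rng = np.random.default_rng(seed)
        theta_hat, eta = float(theta0), 0.0
        for _ in range(steps):
            # Ornstein-Uhlenbeck process; sin(eta) plays the role of the
            # sinusoidal dither used in deterministic extremum seeking
            eta += -q * eta * dt + np.sqrt(q * dt) * rng.standard_normal()
            y = f(theta_hat + a * np.sin(eta))      # probe the unknown map
            # demodulation: y*sin(eta) is, on average, proportional to the
            # gradient of f at theta_hat, so this step is gradient ascent
            theta_hat += k * dt * y * np.sin(eta)
        return theta_hat

    # usage: seek the maximum of an unknown quadratic map (true maximizer at 2.0)
    print(stochastic_es(lambda th: 1.0 - (th - 2.0) ** 2, theta0=0.0))

The estimate drifts toward the maximizer because, by a stochastic averaging argument, the zero-mean parts of the update wash out and only the gradient-proportional component survives.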

The design of algorithms for non-cooperative/adversarial games is described, and their convergence to Nash equilibria is analyzed. The algorithms are illustrated on models of economic competition and on problems of deploying teams of robotic vehicles.
Bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients. Stochastic extremum seeking is shown to be a biologically plausible interpretation for chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed analysis of convergence for models of nonholonomic robotic vehicles operating in GPS-denied environments.
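The chemotaxis-style idea can be caricatured in a few lines: move at constant forward speed, let the heading diffuse, and turn gently while the sensed signal improves but sharply when it deteriorates. The following biased-random-walk sketch is a generic illustration of gradient climbing without position information; it is not one of the specific feedback laws designed and analyzed in the book.

    import numpy as np

    def chemotaxis_like_seeker(J, steps=40000, dt=1e-3, v=1.0,
                               sigma_run=1.0, sigma_tumble=20.0, seed=0):
        """Run-and-tumble style gradient climbing (generic illustration only).
        J: scalar nutrient/signal field sampled at the current position
        v: constant forward speed; sigma_*: heading-noise intensities
        """
        rng = np.random.default_rng(seed)
        x, y, psi = -2.0, -2.0, 0.0          # position and heading
        J_prev = J(x, y)
        for _ in range(steps):
            # unicycle kinematics at constant forward speed
            x += v * np.cos(psi) * dt
            y += v * np.sin(psi) * dt
            J_now = J(x, y)
            # "run" (turn gently) while the reading improves, "tumble" otherwise
            sigma = sigma_run if J_now > J_prev else sigma_tumble
            psi += sigma * np.sqrt(dt) * rng.standard_normal()
            J_prev = J_now
        return x, y   # ends near, not exactly at, the maximizer of J

    # usage: signal field peaked at the origin; the seeker starts at (-2, -2)
    print(chemotaxis_like_seeker(lambda x, y: -(x ** 2 + y ** 2)))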

The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models.
Stochastic Averaging and Stochastic Extremum Seeking will be informative for control engineers with backgrounds in electrical, mechanical, chemical, and aerospace engineering, and for applied mathematicians. Economics researchers, biologists, biophysicists, and roboticists will find the application examples instructive.

The Communications and Control Engineering series reports major technological advances which have potential for great impact in the fields of communication and control. It reflects research in industrial and academic institutions around the world so that the readership can exploit new possibilities as they become available.

Keywords

Adaptive Control · Extremum Seeking · Game Theory · Gradient-based Algorithm · Nash Equilibria · Newton-based Algorithm · Optimization · Stochastic Averaging · Systems Biology

Authors and affiliations

  1. Department of Mathematics, Southeast University, Nanjing, People's Republic of China
  2. Department of Mechanical & Aerospace Engineering, University of California, San Diego, La Jolla, USA

About the authors

Miroslav Krstic is the author of several books on adaptive control, stochastic nonlinear control, extremum seeking, and control of PDEs. Several of these books have had a high impact in the control field, inspiring many researchers to work on the topics they cover and to apply their tools in research and in practice.
Shu-Jun Liu is a young researcher in mathematics and control theory in China with strong connections to the leading research groups in control theory at the Chinese Academy of Sciences. Her doctoral work on stochastic stability and stabilization has had considerable influence on a number of research groups in China that have taken up the topic after her initial work with her doctoral advisor, Professor Jifeng Zhang.
Much of the material in this book was developed while the first author was a postdoctoral scholar with the second author at the University of California, San Diego.

Bibliographic information

Industry Sectors
Automotive
Chemical Manufacturing
Biotechnology
Electronics
IT & Software
Telecommunications
Aerospace
Pharma
Materials & Steel
Oil, Gas & Geosciences
Engineering

Reviews

From the book reviews:

“This research monograph presents and consolidates new results on the well-known topic of stochastic averaging and in the emerging area of stochastic extremum seeking. … The monograph develops averaging from scratch for ordinary differential equations in deterministic and stochastic settings. … This book will be of interest to researchers interested in stochastic search techniques applied to a large variety of engineering systems.” (IEEE Control Systems Magazine, October, 2013)