Optimal Feedback Control

  • Book
  • © 1995

Overview

Part of the book series: Lecture Notes in Control and Information Sciences (LNCIS, volume 207)

Table of contents (4 chapters)

About this book

This book outlines a new approach to constructing optimal feedback controls for linear control systems under constantly acting bounded perturbations. The optimal synthesis problem is solved by using discrete-time systems obtained from continuous ones. Feedback and output feedback are also examined in this context. For cases where only incomplete or imprecise data are available, algorithms for optimal estimators and optimal identifiers are described, and algorithms for optimal controllers are constructed. An algorithm for optimal stabilization by bounded controls is also proposed, while the Appendix outlines the adaptive method of programming, which is the foundation for the approach used in the rest of the book.
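
To make the setting concrete, here is a minimal sketch, not the book's own algorithm, of the generic pipeline the description alludes to: a continuous-time linear system is discretized with a zero-order hold and a standard discrete-time LQR feedback gain is computed. The matrices A and B, the sampling period h, and the weights Q and R are illustrative assumptions.

    # Minimal sketch (illustrative only): zero-order-hold discretization of a
    # continuous-time linear system followed by a standard discrete-time LQR
    # feedback gain. A, B, h, Q, R are assumed values, not taken from the book.
    import numpy as np
    from scipy.linalg import expm, solve_discrete_are

    # Continuous-time double integrator: x_dot = A x + B u
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    h = 0.1  # sampling period

    # Zero-order-hold discretization via the exponential of the augmented matrix
    n, m = B.shape
    M = np.zeros((n + m, n + m))
    M[:n, :n] = A
    M[:n, n:] = B
    Md = expm(M * h)
    Ad, Bd = Md[:n, :n], Md[:n, n:]

    # Discrete-time LQR: minimize sum of x'Qx + u'Ru subject to
    # x_{k+1} = Ad x_k + Bd u_k
    Q = np.eye(n)
    R = np.eye(m)
    P = solve_discrete_are(Ad, Bd, Q, R)
    K = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)

    # Linear state-feedback law u_k = -K x_k
    print("Feedback gain K =", K)

In the book's setting the controls are bounded and the system is subject to bounded perturbations, so the actual synthesis differs from this unconstrained LQR baseline; the sketch only fixes notation for the continuous-to-discrete step.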
