Observation control problems in discrete-continuous stochastic systems

  • Boris M. Miller
  • Evgeny Ya. Rubinovich


The most common formulation of a stochastic control problem lets the control affect only the evolution of the state, while the observation program is assumed to be fixed and continuous. In many practical situations, however, we can also control the observation program in a way that affects both the timing and the composition of the observations. This leads to a control problem in which one chooses the control to maximize the information content of the observations about the state, while taking into account various constraints and possible penalties on the control effort: the observation control problem.
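To illustrate the flavor of such a problem, here is a minimal toy sketch (entirely illustrative, not the authors' formulation): a scalar linear-Gaussian state evolves in discrete time, each noisy observation carries a fixed fee, and the expected estimation-error variance of any observation schedule follows the standard Kalman-filter recursion, so the best schedule over a short horizon can be found by enumeration. All parameter values and names (`a`, `Q`, `R`, `c`) are assumptions chosen for the example.

```python
# Toy observation-control sketch (illustrative assumptions throughout).
# State:        x[k+1] = a*x[k] + w[k],  w ~ N(0, Q)
# Observation:  y[k]   = x[k]   + v[k],  v ~ N(0, R), purchased at cost c.
# The error variance P evolves deterministically under the Kalman recursion,
# so the expected cost of a schedule has a closed form and can be minimized
# by brute force over all 2^N on/off observation schedules.
from itertools import product

a, Q, R, c = 1.2, 1.0, 0.5, 2.0   # dynamics gain, noise variances, fee
N, P0 = 8, 1.0                    # horizon length, initial error variance

def schedule_cost(schedule):
    """Expected total cost: accumulated error variance + observation fees."""
    P, cost = P0, 0.0
    for observe in schedule:
        P = a * a * P + Q            # prediction step inflates the variance
        if observe:
            P = P * R / (P + R)      # measurement update shrinks the variance
            cost += c                # pay for the observation
        cost += P                    # running penalty on estimation error
    return cost

best = min(product((0, 1), repeat=N), key=schedule_cost)
print("best schedule:", best)
print("its cost: %.3f" % schedule_cost(best))
```

The trade-off named in the abstract appears directly: observing always minimizes the error variance but pays the fee every step, observing never avoids fees but lets the variance grow geometrically, and the optimum lies in between.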


Keywords: Maximum Principle · Optimal Control Problem · Auxiliary Problem · Observation Process · Sufficient Optimality Condition



Copyright information

© Springer Science+Business Media New York 2003

Authors and Affiliations

  • Boris M. Miller (1)
  • Evgeny Ya. Rubinovich (2)
  1. Institute for Information Transmission Problems, Moscow, Russia
  2. Institute of Control Sciences, Moscow, Russia
