Abstract
We study a control problem for a stochastic discrete-time system. The optimality criterion is the probability that a function of the terminal state does not exceed a given limit. To solve the problem, we use dynamic programming. The loss function is assumed to be lower semicontinuous in the terminal state vector, and the transition function from the current state to the next is assumed to be continuous in all its arguments. We establish that under these assumptions the dynamic programming algorithm yields optimal positional (feedback) control strategies, which turn out to be measurable. As an example, we consider a two-step securities portfolio construction problem. We show that in this special case the future loss function at the second step is continuous everywhere except at a single point.
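The backward-induction scheme described above can be illustrated on a toy problem. The sketch below is not the paper's portfolio model: the scalar dynamics, noise distribution, grids, and loss function are all illustrative assumptions. It computes, by dynamic programming, the maximal probability that the terminal loss stays within a given limit, with the terminal indicator playing the role of the boundary condition.

```python
# Hedged sketch: backward dynamic programming for the probability criterion
# P(Phi(x_N) <= phi) on a toy scalar system x_{k+1} = x_k + u_k + xi_k.
# The dynamics, noise, grids, and loss below are illustrative assumptions,
# not the model from the paper.
import numpy as np

PHI_LIMIT = 1.0          # loss limit phi
N_STEPS = 2              # two-step horizon, as in the portfolio example
STATES = np.linspace(-3.0, 3.0, 121)   # state grid
CONTROLS = np.linspace(-1.0, 1.0, 21)  # admissible controls
NOISE = [(-0.5, 0.25), (0.0, 0.5), (0.5, 0.25)]  # (value, probability)

def loss(x):
    # terminal loss function Phi (lower semicontinuous; here simply |x|)
    return abs(x)

def step(x, u, xi):
    # transition function, continuous in all its arguments
    return x + u + xi

# Boundary condition: indicator that the terminal loss is within the limit.
value = np.array([1.0 if loss(x) <= PHI_LIMIT else 0.0 for x in STATES])

# Backward induction: at each step, the Bellman function is the maximal
# expected "success probability" over admissible controls.
for _ in range(N_STEPS):
    new_value = np.empty_like(value)
    for i, x in enumerate(STATES):
        best = 0.0
        for u in CONTROLS:
            p = sum(pr * np.interp(step(x, u, xi), STATES, value)
                    for xi, pr in NOISE)
            best = max(best, p)
        new_value[i] = best
    value = new_value

# Maximal success probability when starting from x_0 = 0.
print(round(float(np.interp(0.0, STATES, value)), 3))
```

The maximizing control at each grid point defines a positional (feedback) strategy; in this discretized setting measurability is trivial, whereas the paper's contribution is establishing it for the continuous-state problem.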
Original Russian Text © A.I. Kibzun, A.N. Ignatov, 2017, published in Avtomatika i Telemekhanika, 2017, No. 10, pp. 139–154.
Kibzun, A.I., Ignatov, A.N. On the existence of optimal strategies in the control problem for a stochastic discrete time system with respect to the probability criterion. Autom Remote Control 78, 1845–1856 (2017). https://doi.org/10.1134/S0005117917100083