Using Informational Measures of Dependence in Statistical Linearization

Abstract

Some problems arising in the identification of stochastic systems from the use of nonlinear measures of dependence between random variables (processes) are analyzed. Recent publications on approaches based on a consistent measure of dependence, the mutual information, are discussed. A constructive procedure is proposed for building a linear input-output model that is statistically equivalent to a nonlinear dynamic stochastic system driven by a Gaussian white-noise input process. The procedure rests on the condition that the mutual information between the input and output processes of the system coincide with the mutual information between the input and output processes of the model; this condition serves as the statistical-linearization criterion. The approach yields explicit relations for the weight coefficients of the linearized model without the a priori assumption, unrealistic in an identification setting, that the joint distribution of the output processes of the system and the model is known.
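The abstract states the matching criterion only in words. As a rough illustration of the general idea, and not a reproduction of the author's derivation, the sketch below treats a hypothetical static nonlinearity with Gaussian input: the mutual information between the system's input and output is estimated from samples, and a linear-plus-noise model is then chosen so that its (Gaussian) input-output mutual information and output variance coincide with those of the system. The particular nonlinearity, the histogram estimator, and the auxiliary variance-matching condition are all assumptions introduced for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_info_hist(x, y, bins=64):
    """Crude plug-in estimate of I(X;Y) in nats from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Hypothetical "true" nonlinear system: y = tanh(2x) + n,
# with Gaussian white-noise input x and Gaussian observation noise n.
sigma_x, sigma_n = 1.0, 0.3
x = rng.normal(0.0, sigma_x, 200_000)
y = np.tanh(2.0 * x) + rng.normal(0.0, sigma_n, x.size)

I_sys = mutual_info_hist(x, y)   # mutual information of system input and output
var_y = y.var()

# Linearized model y_m = k*x + e with e ~ N(0, sigma_e^2).
# For jointly Gaussian (x, y_m): I(X; Y_m) = -0.5 * ln(1 - rho^2), where rho is
# the input-output correlation of the model. Requiring the model to reproduce
# both I_sys and the output variance gives explicit parameter relations:
rho2 = 1.0 - np.exp(-2.0 * I_sys)        # squared correlation implied by I_sys
k = np.sqrt(rho2 * var_y) / sigma_x      # model gain ("weight coefficient")
sigma_e = np.sqrt((1.0 - rho2) * var_y)  # residual-noise standard deviation

print(f"I_sys ~ {I_sys:.3f} nats, k ~ {k:.3f}, sigma_e ~ {sigma_e:.3f}")
```

In the dynamic setting of the paper the single gain k would correspond to a set of weight coefficients of the linear model; the variance-matching step is an extra assumption of this sketch, since mutual information alone leaves the scale of a linear-plus-noise model undetermined.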

Cite this article

Chernyshev, K.R. Using Informational Measures of Dependence in Statistical Linearization. Automation and Remote Control 63, 1439–1447 (2002). https://doi.org/10.1023/A:1020082120927
