
Markov-Modulated Samples and Their Applications

  • Conference paper

Part of the book series: Springer Proceedings in Mathematics & Statistics ((PROMS,volume 114))

Abstract

Samples whose elements are dependent random variables are considered. The dependence arises because all sample elements operate in a common external random environment. The environment is described by a continuous-time, finite, ergodic Markov chain. The sample elements are positive random variables, which can be interpreted as lifetimes until failure. If such a random variable exceeds a value t > 0 while the environment is in state i, then the failure rate \(\gamma _{i}(t;\beta ^{(i)})\) is a known function of unknown coefficients \(\beta ^{(i)} = (\beta _{1}^{(i)},\ldots,\beta _{m}^{(i)})\). Maximum likelihood equations are derived for estimators of \(\{\beta ^{(i)}\}\). The particular case where m = 1 and \(\gamma _{i}(t;\beta ^{(i)}) =\beta _{i}\) is considered in detail.
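For the particular case m = 1 with constant rates \(\gamma _{i}(t;\beta ^{(i)}) =\beta _{i}\), the maximum likelihood estimator reduces to the standard failures-over-exposure form \(\hat{\beta }_{i} = d_{i}/T_{i}\), where d_i counts failures observed while the environment is in state i and T_i is the total time the items spent in state i. The following simulation sketch illustrates this; the two-state environment, its transition rates, and the values of β are illustrative assumptions, not taken from the paper:

```python
import random

random.seed(1)

# Illustrative two-state environment (assumed values, not from the paper):
lam = [1.0, 2.0]   # lam[i]: rate of leaving environment state i for the other state
beta = [0.5, 3.0]  # true failure rates beta_i while the environment is in state i

def sample_lifetime():
    """Simulate one item: competing exponential clocks for failure vs. an
    environment switch. Returns (exposure per state, state at failure)."""
    state = 0
    exposure = [0.0, 0.0]
    while True:
        t_switch = random.expovariate(lam[state])   # time to next environment jump
        t_fail = random.expovariate(beta[state])    # time to failure at current rate
        if t_fail < t_switch:                       # failure occurs first
            exposure[state] += t_fail
            return exposure, state
        exposure[state] += t_switch                 # environment switches first
        state = 1 - state

# Accumulate total exposures T_i and failure counts d_i over many items.
T = [0.0, 0.0]
d = [0, 0]
for _ in range(20000):
    e, s = sample_lifetime()
    T[0] += e[0]
    T[1] += e[1]
    d[s] += 1

# MLE for the constant-rate case: failures over exposure in each state.
beta_hat = [d[i] / T[i] for i in range(2)]
print("estimated rates:", beta_hat)
```

With 20000 simulated lifetimes the estimates recover the assumed rates to within a few percent, as expected from the usual asymptotics of exposure-based rate estimators.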



Author information

Correspondence to Alexander Andronov.


Appendix

Lemma 3.1.

If the elements of the matrix G(t) are differentiable functions of t, then

$$\displaystyle\begin{array}{rcl} \frac{\partial } {\partial t}G(t)^{n}& =& \sum _{ i=0}^{n-1}G(t)^{i}{\biggl [ \frac{\partial } {\partial t}G(t)\biggr ]}G(t)^{n-1-i},\,n = 1,2,\ldots, {}\\ \frac{\partial } {\partial t}\exp (G(t))& =& \frac{\partial } {\partial t}\sum _{i=0}^{\infty }\frac{1} {i!}G(t)^{i} =\sum _{ i=1}^{\infty }\frac{1} {i!}\sum _{j=0}^{i-1}G(t)^{j}{\biggl [ \frac{\partial } {\partial t}G(t)\biggr ]}G(t)^{i-1-j}. {}\\ \end{array}$$

Proof.

The first formula of Lemma 3.1 is obviously true for n = 1 and n = 2. Suppose it holds for some n ≥ 1; then

$$\displaystyle\begin{array}{rcl} \frac{\partial } {\partial t}G(t)^{n+1}& =& {\biggl [ \frac{\partial } {\partial t}G(t)\biggr ]}G(t)^{n} + G(t) \frac{\partial } {\partial t}G(t)^{n} {}\\ & =& {\biggl [ \frac{\partial } {\partial t}G(t)\biggr ]}G(t)^{n} + G(t)\sum _{ i=0}^{n-1}G(t)^{i}{\biggl [ \frac{\partial } {\partial t}G(t)\biggr ]}G(t)^{n-1-i} {}\\ & =& \sum _{i=0}^{n}G(t)^{i}{\biggl [ \frac{\partial } {\partial t}G(t)\biggr ]}G(t)^{n-i}. {}\\ \end{array}$$

The second formula follows by applying the first one to each term of the exponential series. ⊓⊔
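The first formula of Lemma 3.1 can be verified numerically by comparing it with a central finite difference of \(G(t)^{n}\). A minimal sketch in plain Python with 2 × 2 matrices as nested lists; the particular matrix G(t) is an arbitrary illustrative choice:

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def mpow(A, n):
    """n-th matrix power (n >= 0)."""
    P = [[1.0, 0.0], [0.0, 1.0]]  # identity
    for _ in range(n):
        P = matmul(P, A)
    return P

def G(t):
    # An arbitrary differentiable matrix function of t (illustration only).
    return [[1.0, t], [t * t, 2.0]]

def dG(t):
    # Its entrywise derivative d/dt G(t).
    return [[0.0, 1.0], [2.0 * t, 0.0]]

def lemma_derivative(t, n):
    """Right-hand side of Lemma 3.1: sum_{i=0}^{n-1} G^i [dG] G^{n-1-i}."""
    total = [[0.0, 0.0], [0.0, 0.0]]
    for i in range(n):
        term = matmul(matmul(mpow(G(t), i), dG(t)), mpow(G(t), n - 1 - i))
        total = matadd(total, term)
    return total

t, n, h = 0.7, 4, 1e-6
analytic = lemma_derivative(t, n)
# Central finite difference of G(t)^n, entry by entry.
numeric = [[(mpow(G(t + h), n)[i][j] - mpow(G(t - h), n)[i][j]) / (2 * h)
            for j in range(2)] for i in range(2)]
err = max(abs(analytic[i][j] - numeric[i][j])
          for i in range(2) for j in range(2))
print("max entrywise error:", err)
```

The analytic and finite-difference derivatives agree to within the discretization error of the central difference; note that the summands do not commute, so the full sum over positions i is essential.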


Copyright information

© 2014 Springer Science+Business Media New York

About this paper

Cite this paper

Andronov, A. (2014). Markov-Modulated Samples and Their Applications. In: Melas, V., Mignani, S., Monari, P., Salmaso, L. (eds) Topics in Statistical Simulation. Springer Proceedings in Mathematics & Statistics, vol 114. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-2104-1_3
