Abstract
Reliability problems are normally concerned with systems that are discrete in space and continuous in time: they can exist in one of a number of discrete, identifiable states, and they remain continuously in one state until a transition takes them discretely to another state, where they again remain until the next transition occurs. The techniques described in this chapter pertain to systems that can be described as stationary Markov processes, i.e., systems for which the conditional probability of failure or repair during any fixed interval of time is constant. This implies that the failure and repair characteristics of the components follow (negative) exponential distributions. For a single component, or for systems composed of statistically independent components, the limiting or steady-state probabilities do not depend on the state residence time distributions, only on their mean values; this is discussed further in Chapter 12. It must be stressed, however, that very considerable differences can exist in the values of the time-dependent state probabilities, since these depend strongly on the distributional assumptions.
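The contrast the abstract draws between steady-state and time-dependent probabilities can be illustrated with the simplest case: a single repairable component with constant failure rate λ and constant repair rate μ, modelled as a two-state (up/down) continuous Markov process. The sketch below is illustrative only (the function names and the numerical rates are not from the chapter); it uses the standard closed-form solution for this model, in which the up-state probability decays exponentially toward the limiting availability μ/(λ+μ).

```python
import math

def two_state_probabilities(lam, mu, t):
    """Time-dependent state probabilities for a single repairable
    component modelled as a two-state continuous Markov process,
    assumed to start in the 'up' state at t = 0.

    lam : constant failure rate (up -> down transitions)
    mu  : constant repair rate  (down -> up transitions)
    Returns (P_up(t), P_down(t)).
    """
    s = lam + mu
    # Closed-form solution of the state differential equations:
    # P_up(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu) t)
    p_up = mu / s + (lam / s) * math.exp(-s * t)
    return p_up, 1.0 - p_up

def steady_state(lam, mu):
    """Limiting (steady-state) probabilities. Note these depend only
    on the rates, i.e., on the mean up and down times, not on the
    residence-time distributions themselves."""
    s = lam + mu
    return mu / s, lam / s

# Hypothetical rates chosen for illustration: 0.01 failures/hour,
# repairs completed at 0.5 repairs/hour on average.
lam, mu = 0.01, 0.5
print(two_state_probabilities(lam, mu, 0.0))   # starts fully 'up'
print(two_state_probabilities(lam, mu, 50.0))  # near the limit
print(steady_state(lam, mu))                   # limiting availability
```

As t grows, the exponential term vanishes and the time-dependent probabilities converge to the steady-state values; it is the transient part, not the limit, that changes when non-exponential distributions are assumed.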
© 1992 Springer Science+Business Media New York
Billinton, R., Allan, R.N. (1992). Continuous Markov processes. In: Reliability Evaluation of Engineering Systems. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-0685-4_9
Print ISBN: 978-1-4899-0687-8
Online ISBN: 978-1-4899-0685-4