Abstract
The future is largely unpredictable. Nondeterminism allows us to model some phenomena arising in reactive systems, but it does not support a quantitative estimate of how likely one event is with respect to another. We use the terms random or probabilistic for systems where such a quantitative estimate is possible. In this chapter we present well-studied models of probabilistic systems, namely random processes and, in particular, Markov chains. The latter come in two flavours, depending on the underlying model of time (discrete or continuous). Their key feature, the Markov property, allows us to develop an elegant theoretical setting in which we can conveniently estimate, e.g., how long a system will sojourn in a given state, or the probability of finding the system in a given state at a given time or in the long run. We conclude the chapter by discussing how bisimilarity equivalences can be extended to Markov chains.
The future is independent of the past, given the present. (Markov property as folklore)
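The Markov property stated above can be made concrete with a small sketch: in a discrete-time Markov chain, the distribution over states after n steps is obtained by repeatedly multiplying the initial distribution by the transition matrix, and for many chains this converges to a long-run (stationary) distribution. The two-state "sunny/rainy" chain and its numbers below are hypothetical illustration data, not taken from the chapter.

```python
# A minimal sketch of a discrete-time Markov chain (DTMC) over two
# hypothetical states, "sunny" (index 0) and "rainy" (index 1).
# By the Markov property, the next-state distribution depends only on
# the current state, so the distribution at time n is pi_0 * P^n.

P = [
    [0.8, 0.2],  # from sunny: stay sunny with 0.8, turn rainy with 0.2
    [0.4, 0.6],  # from rainy: turn sunny with 0.4, stay rainy with 0.6
]

def step(dist, P):
    """One step of the chain: multiply the row vector dist by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def distribution_at(dist0, P, n):
    """Distribution over states after n steps, starting from dist0."""
    dist = dist0
    for _ in range(n):
        dist = step(dist, P)
    return dist

# Starting surely in "sunny", the distribution approaches the stationary
# one, which for this matrix is (2/3, 1/3): check that (2/3, 1/3) * P
# gives back (2/3, 1/3).
print(distribution_at([1.0, 0.0], P, 50))
```

Iterating the matrix like this answers the "probability of finding the system in a given state at a given time or in the long run" question from the abstract for the discrete-time case; the continuous-time variant instead works with exponentially distributed sojourn times.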
© 2017 Springer International Publishing Switzerland
Cite this chapter
Bruni, R., Montanari, U. (2017). Measure Theory and Markov Chains. In: Models of Computation. Texts in Theoretical Computer Science. An EATCS Series. Springer, Cham. https://doi.org/10.1007/978-3-319-42900-7_14
Print ISBN: 978-3-319-42898-7
Online ISBN: 978-3-319-42900-7