Continuous-Time Markov Chains

Understanding Markov Chains

Part of the book series: Springer Undergraduate Mathematics Series (SUMS)

Abstract

In this chapter we start the study of continuous-time stochastic processes, which are families \((X_t)_{t\in {{\mathbb R}}_+}\) of random variables indexed by \({{\mathbb R}}_+\). Our aim is to make the transition from discrete to continuous-time Markov chains, the main difference between the two settings being the replacement of the transition matrix with the continuous-time infinitesimal generator of the process. We will start with the two fundamental examples of the Poisson and birth and death processes, followed by the construction of continuous-time Markov chains and their generators in more generality. From the point of view of simulations, the use of continuous-time Markov chains does not bring any special difficulty as any continuous-time simulation is actually based on discrete-time samples. From a theoretical point of view, however, the rigorous treatment of the continuous-time Markov property is much more demanding than its discrete-time counterpart, notably due to the use of the strong Markov property. Here we focus on the understanding of the continuous-time case by simple calculations, and we will refer to the literature for the use of the strong Markov property.
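
To make the remark about simulation concrete, here is a minimal sketch (in Python; the two-state generator, its rates and the function name are illustrative choices, not taken from the text) of the standard way to sample a path of a continuous-time chain from its infinitesimal generator: hold an exponential time with parameter \(-Q_{i,i}\) in the current state i, then jump to a state \(j \ne i\) with probability \(Q_{i,j}/(-Q_{i,i})\).

```python
import random

def simulate_ctmc(Q, x0, t_max, rng=random.Random(0)):
    """Sample one path of a continuous-time Markov chain with generator Q
    (a list of rows) started at x0, up to time t_max.  Returns the list of
    (jump time, state) pairs, starting with (0.0, x0)."""
    t, x = 0.0, x0
    path = [(t, x)]
    while True:
        rate = -Q[x][x]               # total rate of leaving state x
        if rate <= 0:                 # absorbing state: no further jumps
            break
        t += rng.expovariate(rate)    # exponential holding time in state x
        if t >= t_max:
            break
        u, acc = rng.random() * rate, 0.0
        for j, q in enumerate(Q[x]):  # pick j != x with probability q / rate
            if j == x:
                continue
            acc += q
            if u <= acc:
                x = j
                break
        path.append((t, x))
    return path

# Illustrative two-state generator; the rates are arbitrary placeholder values.
Q = [[-1.0, 1.0],
     [2.0, -2.0]]
print(simulate_ctmc(Q, x0=0, t_max=10.0))
```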


Notes

  1. The notation \(f(h) \simeq h^k\) means \(\lim _{h\rightarrow 0} f(h)/h^k = 1\), and \(f(h) = o(h)\) means \(\lim _{h\rightarrow 0} f(h)/h = 0\).

  2. Recall that by definition \(f( h ) \simeq g( h )\), \(h \rightarrow 0\), if and only if \(\lim _{h \rightarrow 0} f(h)/g(h) = 1\).

  3. Recall that a finite-valued random variable may have an infinite mean.

Exercises

Exercise 9.1

A workshop has five machines and one repairman. Each machine functions until it fails at an exponentially distributed random time with rate \(\mu = 0.20\) per hour. On the other hand, it takes an exponentially distributed random time with rate \(\lambda = 0.50\) per hour to repair a given machine. We assume that the machines behave independently of one another, and that

  (i)

    up to five machines can operate at any given time,

  (ii)

    at most one can be under repair at any time.

Compute the proportion of time the repairman is idle in the long run.
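
A hand computation for this and the later birth-and-death questions (for instance Exercises 9.9 and 9.10) can be checked numerically with a short sketch such as the following, based on the standard formula \(\pi _n \propto \prod _{k=1}^{n} \lambda _{k-1}/\mu _k\) for the stationary distribution of a finite birth-and-death chain; the function name and the rates used in the example call are placeholders, not part of the exercise.

```python
def bd_stationary(birth, death):
    """Stationary distribution of a birth-and-death chain on {0, ..., N}.

    birth[n] is the rate of n -> n+1 for n = 0, ..., N-1, and death[n] is
    the rate of (n+1) -> n, i.e. death[n] = mu_{n+1}.  Uses
    pi_n proportional to prod_{k=1}^{n} birth[k-1] / death[k-1]."""
    weights = [1.0]
    for n in range(1, len(birth) + 1):
        weights.append(weights[-1] * birth[n - 1] / death[n - 1])
    total = sum(weights)
    return [w / total for w in weights]

# Placeholder rates for a chain on {0, ..., 5}; substitute the rates of the
# model at hand to check a hand computation.
print(bd_stationary(birth=[0.4] * 5, death=[0.5] * 5))
```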

Exercise 9.2

Two types of consultations occur at a database according to two independent Poisson processes: “read” consultations arrive at the rate (or intensity) \(\lambda _R\) and “write” consultations arrive at the rate (or intensity) \(\lambda _W\).

  (a)

    What is the probability that the time interval between two consecutive “read” consultations is larger than \(t>0\)?

  (b)

    What is the probability that during the time interval [0, t], at most three “write” consultations arrive?

  (c)

    What is the probability that the next arriving consultation is a “read” consultation?

  (d)

    Determine the distribution of the number of “read” consultations that arrived during [0, t], given that a total of n consultations occurred in this interval.
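
A quick Monte Carlo check of question (d) is sketched below: it builds the empirical conditional distribution of the number of “read” consultations given the total number of consultations in [0, t]. The rates, the horizon t, the conditioning value n and Knuth's multiplication method for Poisson sampling are all choices made for the illustration, not data from the exercise.

```python
import random
from math import exp
from collections import Counter

rng = random.Random(1)

def poisson(lam):
    """Sample a Poisson(lam) variate by Knuth's multiplication method."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

# Empirical conditional distribution of the number of "read" consultations
# given that n consultations occurred in [0, t] in total.  The rates, the
# horizon t and the value of n are placeholders, not taken from the exercise.
lam_R, lam_W, t, n = 2.0, 1.0, 1.0, 6
reads = []
for _ in range(200_000):
    r, w = poisson(lam_R * t), poisson(lam_W * t)
    if r + w == n:
        reads.append(r)
freq = Counter(reads)
print(sum(reads) / len(reads))                          # empirical conditional mean
print({k: freq[k] / len(reads) for k in sorted(freq)})  # compare with question (d)
```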

Exercise 9.3

Consider two machines, operating simultaneously and independently, where both machines have an exponentially distributed time to failure with mean \(1/\mu \). There is a single repair facility, and the repair times are exponentially distributed with rate \(\lambda \).

  (a)

    In the long run, what is the probability that no machines are operating when \(\lambda = \mu =1\)?

  (b)

    We now assume that at most one machine can operate at any time. Namely, while one machine is working, the other one may be either under repair or already fixed and waiting to take over. How does this modify your answer to question (a)?

Exercise 9.4

Passengers arrive at a cable car station according to a Poisson process with intensity \(\lambda >0\). Each car contains at most 4 passengers, and a cable car arrives immediately and leaves with 4 passengers as soon as there are at least 4 people in the queue. We let \((X_t)_{t\in {{\mathbb R}}_+}\) denote the number of passengers in the waiting queue at time \(t\ge 0\).

  (a)

    Explain why \((X_t)_{t\in {\mathbb R}_+}\) is a continuous-time Markov chain with state space \(\mathbb {S}= \{ 0 , 1 , 2 , 3 \}\), and give its infinitesimal generator matrix Q.

  (b)

    Compute the limiting distribution \(\pi = [\pi _0,\pi _1,\pi _2,\pi _3]\) of \((X_t)_{t\in {\mathbb R}_+}\).

  (c)

    Compute the mean time between two departures of cable cars.
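
For question (b) here (and similarly for Exercise 9.15 below), the limiting distribution can be double-checked by solving \(\pi Q = 0\) together with \(\pi _0 + \cdots + \pi _3 = 1\) numerically once the generator of question (a) has been written down. The sketch below uses numpy with a placeholder generator that is deliberately not the answer to this exercise.

```python
import numpy as np

def stationary(Q):
    """Solve pi Q = 0 together with sum(pi) = 1 by least squares, for a
    generator Q (nonnegative off-diagonal entries, zero row sums)."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])   # stack the normalisation constraint
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Placeholder generator on four states (replace it with the generator obtained
# in question (a) to check the limiting distribution of question (b)).
Q = np.array([[-1.0,  1.0,  0.0,  0.0],
              [ 0.0, -1.0,  1.0,  0.0],
              [ 0.5,  0.0, -1.5,  1.0],
              [ 1.0,  0.0,  0.0, -1.0]])
print(stationary(Q))
```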

Exercise 9.5

[MT15] We consider a stock whose prices can only belong to the following five ticks:

$$ \$ 10.01; \ \ \$ 10.02; \ \ \$ 10.03; \ \ \$ 10.04; \ \ \$ 10.05, $$

numbered \(k=1,2,3,4,5\).

At time t, the order book for this stock contains exactly \(N^{(k)}_t\) sell orders at price tick number \(k\), \(k=1,2,3,4,5\), where \((N^{(k)}_t)_{t\in {{\mathbb R}}_+}\) are independent Poisson processes with the same intensity \(\lambda >0\). In addition,

  • any sell order can be cancelled after an exponentially distributed random time with parameter \(\mu >0\),

  • buy market orders are submitted according to another Poisson process with intensity \(\theta >0\), and are filled instantly at the lowest order price present in the book.

Order cancellations can occur as a result of various trading algorithms such as “spoofing”, “layering”, or “front running”.

  (a)

    Show that the total number of sell orders \(L_t\) in the order book at time t forms a continuous-time Markov chain, and write down its infinitesimal generator Q.

  (b)

    It is estimated that 95% of high-frequency trader orders are later cancelled. What relation does this imply between \(\mu \) and \(\lambda \)?

Exercise 9.6

The size of a fracture in a rock formation is modeled by a continuous-time pure birth process with parameters

$$\lambda _k = ( 1 + k )^\rho , \qquad k \ge 1 , $$

i.e. the growth rate of the fracture is a power of \(1+k\), where k is the current fracture length. Show that when \(\rho > 1\), the mean time for the fracture length to grow to infinity is finite. Conclude that the time to failure of the rock formation is almost surely finite (see Note 3).
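
Since the successive inter-birth times are exponential with parameters \(\lambda _k = (1+k)^\rho \), the question comes down to the convergence of the series \(\sum _{k \ge 1} (1+k)^{-\rho }\). The partial sums computed below (the values of \(\rho \) and the cutoffs are placeholders) illustrate the dichotomy between \(\rho = 1\) and \(\rho > 1\).

```python
# Partial sums of sum_{k>=1} 1/(1+k)**rho at a few cutoffs, illustrating
# convergence for rho > 1 and divergence for rho = 1 (the values of rho and
# the cutoffs are placeholders).
for rho in (1.0, 1.5, 2.0):
    partial = [sum(1.0 / (1 + k) ** rho for k in range(1, n + 1))
               for n in (10**2, 10**4, 10**6)]
    print(rho, [round(p, 4) for p in partial])
```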

Exercise 9.7

Customers arrive at a processing station according to a Poisson process with rate \(\lambda = 0.1\), i.e. on average one customer per ten minutes. Processing of customer queries starts as soon as the third customer enters the queue.

  (a)

    Compute the expected time until the start of the customer service.

  (b)

    Compute the probability that no customer service occurs within the first hour.

Exercise 9.8

Suppose that customers arrive at a facility according to a Poisson process having rate \(\lambda = 3\). Let \(N_t\) be the number of customers that have arrived up to time t and let \(T_n\) be the arrival time of the nth customer, \(n=1,2,\ldots\). Determine the following (conditional) probabilities and (conditional) expectations, where \(0<t_1<t_2<t_3<t_4\).

  (a)

    \({{\mathbb P}}( N_{t_3} = 5 \mid N_{t_1} = 1)\).

  (b)

    \({\mathrm{I}}\!{\mathrm{E}}[ N_{t_1} N_{t_4} ( N_{t_3} - N_{t_2} )]\).

  (c)

    \({\mathrm{I}}\!{\mathrm{E}}[ N_{t_2} \mid T_2 > t_1 ]\).
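
The expectations in (b) and (c) can be reduced by hand to computations with independent increments; a Monte Carlo estimate such as the one sketched below is a convenient way to check the algebra in (b). The times \(t_1<t_2<t_3<t_4\) are placeholder values, and arrival times are generated as cumulative sums of Exp(\(\lambda \)) interarrival times.

```python
import random
import bisect

rng = random.Random(2)

lam, (t1, t2, t3, t4) = 3.0, (1.0, 2.0, 3.0, 4.0)

def counts_at(times):
    """Return N_t for each t in `times` (sorted) along one sample path of a
    rate-lam Poisson process."""
    arrivals, s = [], rng.expovariate(lam)
    while s <= times[-1]:
        arrivals.append(s)
        s += rng.expovariate(lam)
    return [bisect.bisect_right(arrivals, t) for t in times]

# Monte Carlo estimate of E[ N_{t1} N_{t4} (N_{t3} - N_{t2}) ]; compare with
# the value obtained by hand in question (b).  The rate is the one of the
# exercise, while the times t1 < t2 < t3 < t4 are placeholders.
total, trials = 0.0, 100_000
for _ in range(trials):
    n1, n2, n3, n4 = counts_at([t1, t2, t3, t4])
    total += n1 * n4 * (n3 - n2)
print(total / trials)
```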

Exercise 9.9

Let \((X_t)_{t\in {\mathbb R}_+}\) be a birth and death process on \(\{0,1,2\}\) with birth and death parameters \(\lambda _0 = 2 \alpha \), \(\lambda _1 = \alpha \), \(\lambda _2 = 0\), and \(\mu _0 = 0\), \(\mu _1 = \beta \), \(\mu _2 = 2 \beta \). Determine the stationary distribution of \((X_t)_{t\in {{\mathbb R}}_+}\).

Exercise 9.10

Let \((X_t)_{t\in {\mathbb R}_+}\) be a birth and death process on \(\{0,1,\ldots , N\}\) with birth and death parameters \(\lambda _n = \alpha ( N - n )\) and \(\mu _n = \beta n\), respectively. Determine the stationary distribution of \((X_t)_{t\in {{\mathbb R}}_+}\).

Exercise 9.11

Consider a pure birth process with birth rates \(\lambda _0 = 1\), \(\lambda _1 = 3\), \(\lambda _2 = 2\), \(\lambda _3 = 5\). Compute \(P_{0,n} (t)\) for \(n=0,1,2,3\).

Exercise 9.12

Consider a pure birth process \((X_t)_{t\in {{\mathbb R}}_+}\) started at \(X_0=0\), and let \(T_k\) denote the time until the kth birth. Show that

$$ {{\mathbb P}}( T_1> t \text{ and } T_2 > t+s ) = P_{0,0} (t) ( P_{0,0} (s)+P_{0,1} (s) ) . $$

Determine the joint probability density function of \((T_1 , T_2)\), and then the joint density of \((\tau _0 ,\tau _1) : = (T_1, T_2 - T_1)\).

Exercise 9.13

Cars pass a certain street location with identical speeds, according to a Poisson process with rate \(\lambda >0\). A woman at that location needs T units of time to cross the street, i.e. she waits until it appears that no car will cross that location within the next T time units.

  (a)

    Find the probability that her waiting time is 0.

  (b)

    Find her expected waiting time.

  (c)

    Find the total average time it takes to cross the street.

  (d)

    Assume that, due to other factors, the crossing time in the absence of cars is an independent exponentially distributed random variable with parameter \(\mu >0\). Find the total average time it takes to cross the street in this case.
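
Under one natural reading of the model (the woman may start crossing at time 0, and whenever a car would pass within the next T units she waits until that car has passed and tries again), questions (a) and (b) can be estimated by the Monte Carlo sketch below; the values of \(\lambda \), T and the number of trials are placeholders.

```python
import random

rng = random.Random(3)

# Cars pass at the jump times of a rate-lam Poisson process; the pedestrian
# starts to cross at the first instant from which no car passes during the
# next T time units.  lam, T and the number of trials are placeholder values.
lam, T, trials = 1.0, 1.5, 200_000

def waiting_time():
    w, gap = 0.0, rng.expovariate(lam)
    while gap <= T:                  # a car would pass while she is crossing,
        w += gap                     # so she waits until that car has passed
        gap = rng.expovariate(lam)
    return w

samples = [waiting_time() for _ in range(trials)]
print(sum(1 for w in samples if w == 0.0) / trials)  # estimate for question (a)
print(sum(samples) / trials)                         # estimate for question (b)
```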

Exercise 9.14

A machine is maintained at random times, such that the inter-service times \((\tau _k)_{k\ge 0}\) are i.i.d. with exponential distribution of parameter \(\mu >0\). The machine breaks down if it has not received maintenance for more than T units of time. After breaking down it is automatically repaired.

  (a)

    Compute the probability that the machine breaks down before its first maintenance after it is started.

  (b)

    Find the expected time until the machine breaks down.

  (c)

    Assuming that the repair time is exponentially distributed with parameter \(\lambda >0\), find the proportion of time the machine is working.

Exercise 9.15

A system consists of two machines and two repairmen. Each machine can work until failure at an exponentially distributed random time with parameter 0.2. A failed machine can be repaired by only one repairman at a time, within an exponentially distributed random time with parameter 0.25. We model the number \(X_t\) of working machines at time \(t \in {\mathbb R}_+\) as a continuous-time Markov process.

  (a)

    Complete the missing entries in the matrix

    $$ Q = \left[ \begin{array}{ccc} \Box & 0.5 & 0 \\ 0.2 & \Box & \Box \\ 0 & \Box & -0.4 \end{array} \right] $$

    of its generator.

  (b)

    Calculate the long-run probability distribution \([\pi _0,\pi _1,\pi _2]\) of \(X_t\).

  (c)

    Compute the average number of working machines in the long run.

  (d)

    Given that a working machine can produce 100 units every hour, how many units can the system produce per hour in the long run?

  (e)

    Assume now that when a single machine has failed, both repairmen can work on it, thereby dividing the mean repair time by a factor of 2. Complete the missing entries in the matrix

    $$ Q = \left[ \begin{array}{ccc} -0.5 & \Box & \Box \\ \Box & -0.7 & \Box \\ \Box & \Box & -0.4 \end{array} \right] $$

    of the modified generator and calculate the long run probability distribution \([\pi _0,\pi _1,\pi _2]\) for \(X_t\).
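
As an alternative to solving \(\pi Q = 0\) by hand, the long-run quantities in questions (b)-(d) can be estimated by simulating the chain and recording the time spent in each state, as in the self-contained sketch below. The generator used there is a placeholder, deliberately not the completed matrix of question (a); substitute your own answer before drawing conclusions.

```python
import random

rng = random.Random(5)

def occupation_fractions(Q, x0, t_max):
    """Estimate the long-run fraction of time spent in each state by
    simulating one long path of the chain with generator Q (list of rows)."""
    n = len(Q)
    time_in = [0.0] * n
    t, x = 0.0, x0
    while t < t_max:
        rate = -Q[x][x]
        if rate <= 0:                      # absorbing state: stay until t_max
            time_in[x] += t_max - t
            break
        hold = rng.expovariate(rate)       # exponential holding time in x
        time_in[x] += min(hold, t_max - t)
        t += hold
        u, acc = rng.random() * rate, 0.0
        for j in range(n):                 # jump to j != x with prob Q[x][j]/rate
            if j != x:
                acc += Q[x][j]
                if u <= acc:
                    x = j
                    break
    return [s / t_max for s in time_in]

# Placeholder generator (NOT the completed matrix of question (a)); replace it
# with your own answer, then compare the estimated fractions with [pi_0, pi_1, pi_2].
Q = [[-0.5, 0.5, 0.0],
     [0.3, -0.8, 0.5],
     [0.0, 0.4, -0.4]]
pi = occupation_fractions(Q, x0=2, t_max=200_000.0)
print(pi)
print(sum(k * p for k, p in enumerate(pi)))  # long-run average number of working machines
```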

Exercise 9.16

Let \(X_1(t)\) and \(X_2(t)\) be two independent two-state Markov chains on \(\{0,1\}\), both having the same infinitesimal matrix

$$ \left[ \begin{array}{cc} -\lambda & \lambda \\ \mu & -\mu \end{array} \right] . $$

Argue that \(Z(t) := X_1(t) + X_2(t)\) is a Markov chain on the state space \(\mathbb {S}= \{0,1,2\}\) and determine the transition semigroup P(t) of Z(t).

Exercise 9.17

Consider a two-state discrete-time Markov chain \((\xi _n)_{n\ge 0}\) on \(\{ 0,1 \}\) with transition matrix

$$\begin{aligned} \left[ \begin{array}{cc} 0 & 1 \\ 1 - \alpha & \alpha \end{array} \right] . \end{aligned}$$
(9.8.4)

Let \((N_t)_{t\in {\mathbb R}_+}\) be a Poisson process with parameter \(\lambda > 0\), and let

$$ X_t = \xi _{N_t}, \qquad t \in {\mathbb R}_+, $$

i.e. \((X_t)_{t\in {{\mathbb R}}_+}\) is a two-state birth and death process.

  (a)

    Compute the mean return time \({\mathbb E} [ T^r_0 \mid X_0 = 0]\) of \(X_t\) to state 0, where \(T_0^r\) is defined as

    $$ T_0^r = \inf \{ t > T_1 \ : \ X_t = 0\} $$

    and

    $$ T_1 = \inf \{ t > 0 \ : \ X_t = 1\} $$

    is the first hitting time of state 1. Note that the return time \(T^r_0\) to state 0 starting from state 0 is evaluated by switching first to state 1 before returning to state 0.

  (b)

    Compute the mean return time \({\mathbb E} [ T^r_1 \mid X_0 = 1]\) of \(X_t\) to state 1, where \(T_1^r\) is defined as

    $$ T_1^r = \inf \{ t > T_0 \ : \ X_t = 1 \} $$

    and

    $$ T_0 = \inf \{ t > 0 \ : \ X_t = 0\} $$

    is the first hitting time of state 0. The return time \(T^r_1\) to state 1 starting from state 1 is evaluated by switching first to state 0 before returning to state 1.

  (c)

    Show that \((X_t)_{t\in {{\mathbb R}}_+}\) is a two-state birth and death process and determine its generator matrix Q in terms of \(\alpha \) and \(\lambda \).

Problem 9.18

Let \((N^1_t)_{t\in {{\mathbb R}}_+}\) and \((N^2_t)_{t\in {{\mathbb R}}_+}\) be two independent Poisson processes with intensities \(\lambda _1>0\) and \(\lambda _2>0\).

  (a)

    Show that \( ( N^1_t+N^2_t )_{t\in {{\mathbb R}}_+}\) is a Poisson process and find its intensity.

  (b)

    Consider the difference

    $$ M_t = N^1_t - N^2_t, \qquad t\in {{\mathbb R}}_+, $$

    and show that \((M_t)_{t\in {{\mathbb R}}_+}\) has stationary independent increments.

  (c)

    Find the distribution of \(M_t-M_s\), \(0<s<t\).

  (d)

    Compute

    $$ \lim _{t\rightarrow \infty } {{\mathbb P}}( | M_t | \le c ) $$

    for any \(c>0\).

  (e)

    Suppose that \(N_t^1\) denotes the number of clients arriving at a taxi station during the time interval [0, t], and that \(N_t^2\) denotes the number of taxis arriving at that same station during the same time interval [0, t].

    How do you interpret the value of \(M_t\) depending on its sign?

    How do you interpret the result of Question (d)?
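
The limit in question (d) can also be observed numerically: the sketch below estimates \({{\mathbb P}}( | M_t | \le c )\) for a few increasing values of t, with placeholder intensities, bound c and trial count, and Poisson samples drawn by Knuth's multiplication method.

```python
import random
from math import exp

rng = random.Random(4)

def poisson(lam):
    """Sample a Poisson(lam) variate by Knuth's multiplication method."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

# Estimate P(|M_t| <= c) for M_t = N^1_t - N^2_t at a few increasing values
# of t; the intensities, the bound c and the trial count are placeholders.
lam1, lam2, c, trials = 1.0, 1.5, 5, 10_000
for t in (1.0, 10.0, 100.0):
    hits = sum(abs(poisson(lam1 * t) - poisson(lam2 * t)) <= c
               for _ in range(trials))
    print(t, hits / trials)
```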

Problem 9.19

We consider a birth and death process \((X_t)_{t\in {{\mathbb R}}_+}\) on \(\{0,1,\ldots , N\}\) with transition semigroup \((P(t))_{t\in {{\mathbb R}}_+}\) and birth and death rates

$$ \lambda _n = (N-n) \lambda , \qquad \mu _n = n \mu , \qquad n =0,1,\ldots , N. $$

This process is used for the modeling of membrane channels in neuroscience.

  (a)

    Write down the infinitesimal generator Q of \((X_t)_{t\in {{\mathbb R}}_+}\).

  (b)

    From the forward Kolmogorov equation \(P'(t) = P(t)Q\), show that for all \(n = 0,1, \ldots , N\) we have

    $$ \left\{ \begin{array}{l} P_{n, 0}' (t) = - \lambda _0 P_{n, 0}(t) + \mu _1 P_{n, 1}(t) , \\ \\ P_{n, k}' (t) = \lambda _{k-1} P_{n, k-1}(t) - ( \lambda _k + \mu _k ) P_{n, k}(t) + \mu _{k+1} P_{n, k+1}(t), \\ \\ P_{n, N}' (t) = \lambda _{N-1} P_{n, N-1}(t) - \mu _N P_{n, N}(t) , \end{array} \right. $$

    \(k=1,2,\ldots , N-1\).

  (c)

    Let

    $$ G_k (s, t) = {\mathrm{I}}\!{\mathrm{E}}\big [ s^{X_t} \mid X_0 = k \big ] = \sum _{n=0}^N s^n {{\mathbb P}}( X_t = n \mid X_0 = k ) = \sum _{n=0}^N s^n P_{k, n} (t ) $$

    denote the generating function of \(X_t\) given that \(X_0 = k \in \{ 0 , 1 , \ldots , N \}\). From the result of Question (b), show that \(G_k(s, t)\) satisfies the partial differential equation

    $$\begin{aligned} \frac{\partial G_k }{\partial t} (s, t) = \lambda N ( s-1) G_k (s, t) + ( \mu + ( \lambda - \mu ) s - \lambda s^2 ) \frac{\partial G_k }{\partial s} (s, t) , \end{aligned}$$
    (9.8.5)

    with \(G_k(s, 0) = s^k\), \(k = 0,1,\ldots , N\).

  (d)

    Verify that the solution of (9.8.5) is given by

    $$ G_k (s, t) = \frac{1}{(\lambda + \mu )^N} ( \mu + \lambda s + \mu ( s - 1 ) \mathrm {e}^{-(\lambda + \mu )t})^k ( \mu + \lambda s - \lambda ( s - 1 ) \mathrm {e}^{-(\lambda + \mu )t})^{N-k} , $$

    \(k = 0,1,\ldots , N\).

  (e)

    Show that

    $$\begin{aligned} {\mathrm{I}}\!{\mathrm{E}}[ X_t \mid X_0 = k ] &= \frac{k}{(\lambda + \mu )^N} ( \lambda + \mu \mathrm {e}^{-(\lambda + \mu )t}) ( \mu + \lambda )^{k-1} ( \mu + \lambda )^{N-k} \\ &\quad + \frac{N-k}{(\lambda + \mu )^N} ( \mu + \lambda )^k (\lambda - \lambda \mathrm {e}^{-(\lambda + \mu )t}) ( \mu + \lambda )^{N-k-1} . \end{aligned}$$
  (f)

    Compute

    $$ \lim _{t\rightarrow \infty } {\mathrm{I}}\!{\mathrm{E}}[ X_t \mid X_0 = k ] $$

    and show that it does not depend on \(k\in \{0,1,\ldots , N\}\).
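
For a small value of N, the closed form of question (d) can be checked against the transition semigroup \(P(t) = \mathrm {e}^{tQ}\) computed numerically; the sketch below does this with scipy's matrix exponential, and all numerical values of N, \(\lambda \), \(\mu \), s, t and k are placeholders.

```python
import numpy as np
from scipy.linalg import expm

# Numerical check of the closed form of question (d) against P(t) = exp(tQ)
# for a small N.  All parameter values below are placeholders for the check.
N, lam, mu, s, t, k = 4, 0.7, 1.3, 0.4, 0.9, 2

# Generator with birth rates lambda_n = (N-n)*lam and death rates mu_n = n*mu
Q = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = (N - n) * lam
    if n > 0:
        Q[n, n - 1] = n * mu
    Q[n, n] = -Q[n].sum()

P = expm(t * Q)
G_numeric = sum(s**n * P[k, n] for n in range(N + 1))

G_closed = ((mu + lam * s + mu * (s - 1) * np.exp(-(lam + mu) * t)) ** k
            * (mu + lam * s - lam * (s - 1) * np.exp(-(lam + mu) * t)) ** (N - k)
            / (lam + mu) ** N)
print(G_numeric, G_closed)   # the two values should agree up to rounding
```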


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.


Cite this chapter

Privault, N. (2018). Continuous-Time Markov Chains. In: Understanding Markov Chains. Springer Undergraduate Mathematics Series. Springer, Singapore. https://doi.org/10.1007/978-981-13-0659-4_9
