# Long Term Behaviour of a Reversible System of Interacting Random Walks


## Abstract

This paper studies the long-term behaviour of a system of interacting random walks labelled by vertices of a finite graph. We show that the system undergoes phase transitions, with different behaviour in various regions, depending on model parameters and properties of the underlying graph. We provide the complete classification of the long-term behaviour of the corresponding continuous time Markov chain, identifying whether it is null recurrent, positive recurrent, or transient. The proofs are partially based on the reversibility of the model, which allows us to use the method of electric networks. We also provide some alternative proofs (based on the Lyapunov function method and the renewal theory), which are of interest in their own right, since they do not require reversibility and can be applied to more general situations.

## Keywords

Markov chain · Random walk · Transience · Recurrence · Lyapunov function · Martingale · Renewal measure · Return time

## Mathematics Subject Classification

60K35 · 60G50

## 1 Introduction

Let *G* be a finite non-oriented graph with \(n\ge 1\) vertices labelled by \(1,2,\dots , n\). Somewhat abusing notation, we will use *G* also for the set of the vertices of this graph. Let \(A=(a_{ij})\) be the adjacency matrix of the graph, that is \(a_{ij}=a_{ji}=1\) or \(a_{ij}=0\) according to whether vertices *i* and *j* are adjacent (connected by an edge) or not. If vertices \(i, j\in G\) are connected by an edge, i.e. \(a_{ij}=1\), call them *neighbours* and write \(i\sim j\). By definition, a vertex is not a neighbour of itself, i.e. \(a_{ii}=0\) for all \(i=1,\dots ,n\) (i.e. there are no self-loops).

Throughout, \(\mathbf{e}_i\) denotes the *i*-th unit vector, and \(||\cdot ||\) denotes the usual Euclidean norm.

It is easy to see that if \(\beta =0\), then CTMC \(\xi (t)\) is a collection of *n* independent reflected continuous-time random walks on \({\mathbb {Z}}_{+}\) (symmetric if also \(\alpha =0\)). In general, the Markov chain can be regarded as an inhomogeneous random walk on the infinite graph \({\mathbb {Z}}_{+}^G\). Alternatively, it can be interpreted as a system of *n* random walks on \({\mathbb {Z}}_{+}\) labelled by the vertices of graph *G* and evolving subject to a nearest neighbour interaction.

The purpose of the present paper is to study how the long-term behaviour of CTMC \(\xi (t)\) depends on the parameters \(\alpha \) and \(\beta \) together with properties of the graph *G*. In our main result (Theorem 2.1), we give a complete classification saying whether the Markov chain is recurrent or transient, and in the recurrent case whether it is positive recurrent or null recurrent. We find phase transitions, with different behaviour in various regions depending on the parameters \(\alpha \), \(\beta \) and properties of graph *G*. Furthermore, we give results (Theorem 6.1) on whether the Markov chain is explosive or not. (This is relevant for the transient case only, since a recurrent CTMC is always non-explosive.) These results are less complete and leave one case open.

It is obvious that CTMC \(\xi (t)\) is irreducible; hence the initial distribution is irrelevant for our results. (We may if we like assume that we start at \({{\mathbf {0}}}=(0,\dots ,0)\in {\mathbb {Z}}_+^n\).)

CTMC \(\xi (t)\) was introduced in [13], where its long term behaviour was studied in several cases. In particular, conditions for positive or null recurrence and transience were obtained in some special cases; these results are extended in the present paper. In addition, the typical asymptotic behaviour of the Markov chain was studied in some transient cases.

One example of our results is the case \(\alpha <0\) and \(\beta >0\), which is of particular interest because of the following phenomenon observed in [13] in some special cases. If \(\alpha <0\) and \(\beta =0\), then, as said above, CTMC \(\xi (t)\) is formed by a collection of independent positive recurrent reflected random walks on \({\mathbb {Z}}_{+}\), and is thus positive recurrent. If both \(\alpha <0\) and \(\beta <0\), then the Markov chain is still positive recurrent (as shown below). The interaction in this case is, in a sense, competitive, as neighbours obstruct the growth of each other. Now keep \(\alpha <0\) fixed but let \(\beta >0\). If \(\beta \) is positive but not large, one could intuitively expect that the Markov chain is still positive recurrent (“stable”), as the interaction (cooperative in this case) is not strong enough. On the other hand, if \(\beta >0\) is sufficiently large, then intuition suggests that the Markov chain becomes transient (“unstable”). It turns out that this is correct and that the phase transition in the model behaviour occurs at the critical value \(\beta =\frac{|\alpha |}{\lambda _1(G)}\), where \(\lambda _1(G)\) is the largest eigenvalue of (the adjacency matrix of) the graph *G*. Namely, if \(\beta <\frac{|\alpha |}{\lambda _1(G)}\), then the Markov chain is positive recurrent, and if \(\beta \ge \frac{|\alpha |}{\lambda _1(G)}\), then the Markov chain is transient. Moreover, it turns out that exactly in the critical regime, i.e. \(\beta =\frac{|\alpha |}{\lambda _1(G)}\), the Markov chain is non-explosive transient. We conjecture that if \(\beta >\frac{|\alpha |}{\lambda _1(G)}\), then it is explosive transient. This remains an open problem in the general case (see Remark 6.1 below). Another important contribution of this paper to the previous study of the Markov chain is a recurrence/transience classification in the case \(\alpha =0\) and \(\beta <0\).
This case was discussed in [13] only for the simplest graph with two vertices. We show that in general there are only two possible long term behaviours of the Markov chain if \(\alpha =0\) and \(\beta <0\). Namely, CTMC \(\xi (t)\) is either non-explosive transient or null recurrent, and this depends only on the independence number of the graph *G*.
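To make the critical value concrete, the sketch below (illustrative Python with NumPy; the helper names are ours, not from the paper) computes \(\lambda _1(G)\) numerically and applies the dichotomy for fixed \(\alpha <0\) described above.

```python
import numpy as np

def lambda1(adj):
    """Largest eigenvalue of a symmetric adjacency matrix."""
    return float(max(np.linalg.eigvalsh(np.asarray(adj, dtype=float))))

def classify_negative_alpha(alpha, beta, adj):
    """Phase for fixed alpha < 0: positive recurrent iff
    alpha + beta * lambda_1(G) < 0, transient otherwise."""
    assert alpha < 0
    return "positive recurrent" if alpha + beta * lambda1(adj) < 0 else "transient"

# The 4-cycle has lambda_1 = 2, so the critical value is |alpha| / 2.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
```

For \(\alpha =-1\) this places the phase boundary at \(\beta =1/2\), with the boundary itself on the transient side.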

We also consider some variations of the Markov chain defined above. First, we include in our results the Markov chain above with dynamics obtained by setting \(\beta =-\infty \) (with convention \(0\cdot \infty =0\)). In other words, a component cannot jump up (only down, when possible), if at least one of its neighbours is non-zero; this can thus be interpreted as *hard-core* interaction. See Sect. 3.3 for more details on this hard-core case.

In Sect. 5 we consider the discrete time Markov chain (DTMC) \(\zeta (t)\in {\mathbb {Z}}_{+}^n\) that corresponds to CTMC \(\xi (t)\), i.e. the corresponding embedded DTMC. We show that our main results also apply to this DTMC.

Our proofs make essential use of the method of electric networks; this is possible since the CTMC \(\xi (t)\) is reversible (see Sect. 3.1). The use of reversibility was rather limited in [13], where the Lyapunov function method and direct probabilistic arguments were the main research techniques. In addition, we provide examples of alternative proofs of some of our results based on the Lyapunov function method and renewal theory for random walks. The advantage of these alternative methods is that they do not require reversibility and can be applied in more general situations. Therefore, the alternative proofs are of interest in their own right.

### Remark 1.1

In the case \(\alpha =\beta =0\), all rates in (1.1) equal 1, and the Markov chain is a continuous-time version of a simple random walk on \({\mathbb {Z}}_+^n\). It is known that a simple random walk on the octant \({\mathbb {Z}}_+^n\) is null recurrent for \(n\le 2\) and transient for \(n\ge 3\); this is a variant of the corresponding well-known result for simple random walk on \({\mathbb {Z}}^n\), and can rather easily be shown using electric network theory, see Example 3.1 below.

### Remark 1.2

We allow the graph *G* to be disconnected. However, there is no interaction between different components of *G*, and the CTMC \(\xi (t)\) consists of independent Markov chains defined by the connected components of *G*. Hence, the case of main interest is when *G* is connected.

### Remark 1.3

The case when *G* has no edges is somewhat exceptional but also rather trivial, since then the value of \(\beta \) is irrelevant, and \(\xi (t)\) consists of *n* independent continuous-time random walks on \({\mathbb {Z}}_+\); in fact, \(\xi (t)\) then is as in the case \(\beta =0\) for any other *G* with *n* vertices. In particular, if *G* has no edges, we may assume \(\beta =0\).

### Remark 1.4

CTMC \(\xi (t)\) is a model of interacting spins and, as such, is related to models of statistical physics. The stationary distribution of a finite Markov chain with bounded components and the same transition rates is of interest in statistical physics. In particular, if the components take only values 0 and 1, then the stationary distribution of the corresponding Markov chain is equivalent to a special case of the famous Ising model. One of the main problems in statistical physics is to determine whether such a probability distribution is subject to a phase transition as the underlying graph expands indefinitely. In the present paper, we keep the finite graph *G* fixed, but allow arbitrarily large components \(\xi _i\). We then study phase transitions of this model, in the sense discussed above.

## 2 The Main Results

In order to state our results, we need two definitions from graph theory. We also let *e*(*G*) denote the number of edges in *G*.

### Definition 2.1

The *eigenvalues* of a finite graph *G* are the eigenvalues of its adjacency matrix *A*. These are real, since *A* is symmetric, and we denote them by \(\lambda _1(G)\ge \lambda _2(G)\ge \dots \ge \lambda _n(G)\), so that \(\lambda _1:=\lambda _1(G)\) is the largest eigenvalue.

Note that \(\lambda _1(G)>0\) except in the rather trivial case \(e(G)=0\) (see Remark 1.3).

### Definition 2.2

- (i) An *independent set* of vertices in a graph *G* is a set of vertices such that no two vertices in the set are adjacent.

- (ii) The *independence number* \(\kappa = \kappa (G)\) of a graph *G* is the cardinality of the largest independent set of vertices.

For example, if *G* is a cycle graph \({\mathsf {C}}_n\) with *n* vertices, then \(\kappa =\lfloor n/2 \rfloor \).
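Since \(\kappa (G)\) enters the classification below, a brute-force computation is handy for small graphs. The following is an illustrative Python sketch (the function names are ours), checked against \(\kappa ({\mathsf {C}}_n)=\lfloor n/2\rfloor \).

```python
from itertools import combinations

def independence_number(adj):
    """kappa(G) by exhaustive search -- adequate for small graphs."""
    n = len(adj)
    for size in range(n, 0, -1):
        for subset in combinations(range(n), size):
            # keep the subset only if no two of its vertices are adjacent
            if all(adj[i][j] == 0 for i, j in combinations(subset, 2)):
                return size
    return 0

def cycle(n):
    """Adjacency matrix of the cycle graph C_n (n >= 3)."""
    return [[1 if (i - j) % n in (1, n - 1) else 0 for j in range(n)]
            for i in range(n)]
```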

### Theorem 2.1

- (i)
If \(\alpha <0\) and \(\alpha +\beta \lambda _1(G)<0\), then \(\xi (t)\) is positive recurrent.

- (ii) \(\xi (t)\) is null recurrent in the following cases:
- (a)
\(\alpha =0\), \(\beta <0\) and \(\kappa (G)\le 2\),

- (b)
\(\alpha =\beta =0\) and \(n\le 2\),

- (c)
\(\alpha =0\), \(\beta >0\), \(e(G)=0\) and \(n\le 2\).

- (iii) In all other cases, \(\xi (t)\) is transient. These are the cases:
- (a)
\(\alpha >0\),

- (b)
\(\alpha =0\), \(\beta >0\) and \(e(G)>0\),

- (c)
\(\alpha =0\), \(\beta >0\), \(e(G)=0\) and \(n\ge 3\),

- (d)
\(\alpha =\beta =0\) and \(n\ge 3\),

- (e)
\(\alpha =0\), \(\beta <0\) and \(\kappa (G)\ge 3\),

- (f)
\(\alpha <0\) and \(\alpha +\beta \lambda _1(G)\ge 0\).

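The trichotomy of Theorem 2.1 can be transcribed directly into code. The sketch below (illustrative Python; our names, brute force suitable only for small graphs) combines the spectral and combinatorial quantities appearing in the statement.

```python
import numpy as np
from itertools import combinations

def classify_chain(alpha, beta, adj):
    """Classification of the CTMC xi(t) following Theorem 2.1."""
    n = len(adj)
    A = np.asarray(adj, dtype=float)
    lam1 = float(max(np.linalg.eigvalsh(A)))
    edges = int(A.sum()) // 2
    # independence number by exhaustive search
    kappa = next(size for size in range(n, 0, -1)
                 if any(all(adj[i][j] == 0 for i, j in combinations(s, 2))
                        for s in combinations(range(n), size)))
    if alpha < 0 and alpha + beta * lam1 < 0:
        return "positive recurrent"                               # case (i)
    if alpha == 0 and ((beta < 0 and kappa <= 2)                  # case (ii)(a)
                       or (beta == 0 and n <= 2)                  # case (ii)(b)
                       or (beta > 0 and edges == 0 and n <= 2)):  # case (ii)(c)
        return "null recurrent"
    return "transient"                                            # case (iii)
```

For instance, the star \({\mathsf {K}}_{1,3}\) with \(\alpha =0\), \(\beta <0\) has \(\kappa =3\) and is classified transient, matching Example 2.3 below.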

### Remark 2.1

Theorem 2.1 shows that the behaviour of the Markov chain has the following monotonicity property: if the Markov chain is transient for some given parameters \((\alpha _0, \beta _0)\), then it is also transient for all parameters \((\alpha , \beta )\) such that \(\alpha \ge \alpha _0\) and \(\beta \ge \beta _0\). This can also easily be seen directly using electric networks as in Sect. 3.2, see the proof of Lemma 4.8.

### Remark 2.2

There is a vast literature devoted to graph eigenvalues. In particular, there are well-known bounds for the largest eigenvalue \(\lambda _1\). We give two simple examples where the largest eigenvalue \(\lambda _1\) can easily be computed explicitly, which allows us to rewrite the conditions of Theorem 2.1 in the case \(\alpha <0\) in a more explicit form. These examples essentially rephrase results previously obtained in [13, Theorems 4 and 6].

### Example 2.1

Assume that *G* is a regular graph, i.e., a graph with constant vertex degrees \(\nu \), say. Then \(\lambda _1=\nu \). Hence, the Markov chain is positive recurrent if and only if \(\alpha <0\) and \(\alpha +\beta \nu <0\). If \(\alpha <0\) and \(\alpha +\beta \nu \ge 0\), then the Markov chain is transient.

### Example 2.2

Assume that the graph *G* is a star \({\mathsf {K}}_{1,m}\) with \(m=n-1\) non-central vertices, where \(m\ge 1\). A direct computation gives that \(\lambda _1=\sqrt{m}\). Hence, the Markov chain is positive recurrent if and only if \(\alpha <0\) and \(\alpha +\beta \sqrt{m}<0\). If \(\alpha <0\) and \(\alpha +\beta \sqrt{m}\ge 0\), then the Markov chain is transient.
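Both examples are easy to confirm numerically; the snippet below (illustrative Python) checks \(\lambda _1({\mathsf {K}}_{1,m})=\sqrt{m}\).

```python
import numpy as np

def star(m):
    """Adjacency matrix of the star K_{1,m}; vertex 0 is the centre."""
    A = np.zeros((m + 1, m + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    return A

lam1_K19 = float(max(np.linalg.eigvalsh(star(9))))  # should be sqrt(9) = 3
```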

We consider also two examples with \(\alpha =0\) and \(\beta <0\), when the independence number \(\kappa (G)\) is crucial.

### Example 2.3

Let, as in Example 2.2, *G* be a star \({\mathsf {K}}_{1,m}\), where \(m\ge 1\). Then \(\kappa (G)=m=n-1\). Assume that \(\alpha =0\) and \(\beta <0\). Then, the Markov chain is null recurrent if \(n\le 3\), and transient if \(n\ge 4\).

### Example 2.4

Let *G* be a cycle \({\mathsf {C}}_{n}\), where \(n\ge 3\). Then \(\kappa (G)=\lfloor n/2\rfloor \). Assume that \(\alpha =0\) and \(\beta <0\). Then, the Markov chain is null recurrent if \(n\le 5\), and transient if \(n\ge 6\).

## 3 Preliminaries

### 3.1 Reversibility of the Markov Chain

Here \(\langle \cdot , \cdot \rangle \) is the Euclidean scalar product, *E* is the \(n\times n\) identity matrix, and *A* is the adjacency matrix of the graph *G*.

The explicit formula for the invariant measure \(\mu \) enables us to easily see when \(\mu \) is summable, and thus can be normalised to an invariant distribution (i.e., a probability measure); we return to this in Lemma 4.12.

### Remark 3.1

Recall that a recurrent CTMC has an invariant measure that is unique up to a multiplicative constant, while a transient CTMC in general may have several linearly independent invariant measures (or none). We do not investigate whether the invariant measure \(\mu \) is unique (up to constant factors) for our Markov chain in the transient cases.

### 3.2 Electric Network Corresponding to the Markov Chain

Also, \(C_{{{\mathbf {0}}},{\mathbf{e}}_i}=e^{W(\mathbf{e}_i)}=1\), i.e. the edges connecting the origin \({{\mathbf {0}}}\) with \(\mathbf{e}_i\) have conductance 1, and thus resistance 1 (Ohm, say).

We denote the network consisting of \({\mathbb {Z}}_+^n\) with the conductances (3.5) by \(\varGamma _{\alpha , \beta , G}\). For convenience, we will sometimes denote an electric network by the same symbol as the underlying graph when the conductances are clear from the context.

Let \(N(\varGamma )\) be an electric network on an infinite graph \(\varGamma \). The effective resistance \(R_\infty (\varGamma )=R_\infty (N(\varGamma ))\) of the network is defined, loosely speaking, as the resistance between some fixed point of \(\varGamma \), which in our case we choose as \({{\mathbf {0}}}\), and infinity (see e.g. [2, 6] or [8] for more details). Recall that a reversible Markov chain is transient if and only if the effective resistance of the corresponding electric network is finite. Equivalently, a reversible Markov chain is recurrent if and only if the effective resistance of the corresponding electric network is infinite.

A common approach to showing either recurrence or transience of a reversible Markov chain is based on Rayleigh’s monotonicity law. In particular, if \(N(\varGamma ')\) is a subnetwork of \(N(\varGamma )\), obtained by deleting some edges, then \(R_\infty (\varGamma )\le R_\infty (\varGamma ')\). Therefore, if \(R_\infty (\varGamma ')<\infty \), then \(R_\infty (\varGamma )<\infty \) as well, and thus the corresponding Markov chain on \(\varGamma \) is transient. Similarly, if the network \(N(\varGamma '')\) is obtained from \(N(\varGamma )\) by short-circuiting one or several sets of vertices, then \(R_\infty (\varGamma '')\le R_\infty (\varGamma )\). Hence, if \(R_\infty (\varGamma '')=\infty \), then \(R_\infty (\varGamma )=\infty \) as well, and the corresponding Markov chain on \(\varGamma \) is recurrent.
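For a finite network, the effective resistance between two nodes can be computed from the pseudoinverse of the weighted Laplacian, which makes Rayleigh’s monotonicity law easy to observe numerically. The sketch below is an illustrative Python helper (our names), not part of the proofs.

```python
import numpy as np

def effective_resistance(cond, a, b):
    """R_eff between nodes a and b of a finite network with symmetric
    conductance matrix `cond`, via the pseudoinverse of the Laplacian."""
    C = np.asarray(cond, dtype=float)
    L = np.diag(C.sum(axis=1)) - C
    Lp = np.linalg.pinv(L)
    return float(Lp[a, a] + Lp[b, b] - 2.0 * Lp[a, b])

# Deleting an edge can only increase the effective resistance:
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]   # unit resistors
path     = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # edge (0, 2) deleted
```

For the triangle, \(R_{\mathrm{eff}}(0,2)=2/3\) (the direct unit resistor in parallel with two in series), while deleting the edge \((0,2)\) gives 2, consistent with Rayleigh’s law.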

### Example 3.1

We illustrate these methods, and give a flavour of later proofs, by showing how they work for a simple random walk (SRW) on \({\mathbb {Z}}_+^n\), which as said in Remark 1.1 is the special case \(\alpha =\beta =0\) of our model. The corresponding electric network has all resistances equal to 1.

There are many edges *in parallel* connecting \(V_{L-1}\) and \(V_L\). As a result, their conductances (i.e., the inverses of the resistances) sum up; hence the effective resistance \(R_L\) between \(V_{L-1}\) and \(V_L\) is \(\asymp \frac{1}{L^{n-1}}\). Now \(\varGamma ''\) consists of a sequence of resistors \(R_L\) *in series*, so we must sum them; consequently, the resistance of the modified network is

On the other hand, if \(n\ge 3\), one can show that the random walk is transient. See, for example, the description of the tree \(NT_{2.5849}\) in [2, Sect. 2.2.9], or the construction of a flow with finite energy in [8, page 41] (done there for \({\mathbb {Z}}^n\), but the argument works for \({\mathbb {Z}}_+^n\) too), for a direct proof that \(R_\infty ({\mathbb {Z}}_+^n)<\infty \). An alternative argument uses the well-known transience of SRW on \({\mathbb {Z}}^n\) (\(n\ge 3\)) as follows. Consider a unit current flow from \({{\mathbf {0}}}\) to infinity on \({\mathbb {Z}}^n\). By symmetry, for every vertex \((x_1,\dots ,x_n)\in {\mathbb {Z}}^n\), the potential is the same at all points \((\pm x_1,\dots ,\pm x_n)\). Hence we may short-circuit each such set without changing the effective resistance \(R_\infty \). The short-circuited network, \(\varGamma '\) say, is thus also transient. However, \(\varGamma '\) can be regarded as a network on \({\mathbb {Z}}_+^n\) where each edge has a conductance between 2 and \(2^n\) (depending only on the number of non-zero coordinates). Hence, by Rayleigh’s monotonicity law, \(R_\infty ({\mathbb {Z}}_+^n)\le 2^nR_\infty (\varGamma ')<\infty \), and thus the SRW is transient.
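The series sums appearing in this example are easy to check numerically. In the short-circuited network \(\varGamma ''\), the shell resistances \(R_L\asymp L^{-(n-1)}\) are summed in series, so (up to constants) recurrence for \(n\le 2\) corresponds to divergence of \(\sum _L L^{-(n-1)}\); convergence for \(n\ge 3\) does not by itself prove transience, since short-circuiting only lower-bounds the resistance. An illustrative Python check:

```python
def shell_resistance_sum(n, Lmax):
    """Partial sums of sum_L 1 / L**(n - 1): the series lower bound on the
    resistance of the short-circuited network in Example 3.1 (constants dropped)."""
    return sum(1.0 / L ** (n - 1) for L in range(1, Lmax + 1))
```

For \(n=2\) the partial sums grow like \(\log L\) without bound, while for \(n=3\) they stay below \(\pi ^2/6\).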

### 3.3 The Hard-Core Interaction

Therefore, in the hard-core case we consider the Markov chain with the state space \(\varGamma _0\). This chain on \(\varGamma _0\) is easily seen to be irreducible.

Note that \(\varGamma _0\) is the set of configurations such that \(\langle A\xi ,\xi \rangle =0\), where *A* is the adjacency matrix of graph *G*. Equivalently, a configuration \(\xi \) belongs to \(\varGamma _0\) if and only if the set \(\{i:\xi _i>0\}\) is an independent set of vertices in *G* (see Definition 2.2).

### Remark 3.2

## 4 Proof of Theorem 2.1

In this section we prove Theorem 2.1 by proving a long series of lemmas treating different cases. Note that we include the hard-core case \(\beta =-\infty \). (For emphasis we say this explicitly each time it may occur.) Recall that \(\varGamma _{\alpha , \beta , G}\) denotes \({\mathbb {Z}}_+^n\) regarded as an electrical network with conductances (3.5) corresponding to the CTMC \(\xi (t)\).

As a first application of the method of electric networks we treat the case \(\alpha >0\).

### Lemma 4.1

If \(\alpha >0\) and \(-\infty \le \beta <\infty \), then the CTMC \(\xi (t)\) is transient.

### Proof

We give similar arguments for the other transient cases. Recall that *A* is a non-negative symmetric matrix with eigenvalues \(\lambda _1,\dots ,\lambda _n\). Thus there exists an orthonormal basis of eigenvectors \(\mathbf{v}_i\) with \(A\mathbf{v}_i=\lambda _i\mathbf{v}_i\), \(i=1,\ldots , n\). By the Perron–Frobenius theorem \(\mathbf{v}_1\) can be chosen non-negative, i.e. \(\mathbf{v}_1\in {{\mathbb {R}}}_{+}^n\). (If *G* is connected, then \(\mathbf{v}_1\) is unique and strictly positive.)
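The Perron–Frobenius normalisation used here is easy to reproduce numerically: for a symmetric adjacency matrix, an eigenvector of \(\lambda _1\) can always be flipped in sign so that it is non-negative, and it is strictly positive when *G* is connected. An illustrative Python sketch (our names):

```python
import numpy as np

def perron_vector(adj):
    """Unit eigenvector for lambda_1, sign-normalised to be non-negative
    (valid for a connected graph, where lambda_1 is a simple eigenvalue)."""
    A = np.asarray(adj, dtype=float)
    _, V = np.linalg.eigh(A)   # eigenvalues are returned in ascending order
    v = V[:, -1]
    return v if v.sum() >= 0 else -v
```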

### Lemma 4.2

If \(\alpha <0\) and \(\alpha +\beta \lambda _1\ge 0\), then the CTMC \(\xi (t)\) is transient.

### Proof

The process *y*(*t*) is piecewise constant. Let \(y_0=0, y_1, y_2,\ldots \) be the sequence of different values of *y*(*t*), where at each *t* such that two or more coordinates of *y*(*t*) jump simultaneously, we insert intermediate vectors, so that only one coordinate changes at a time and \(||y_{k+1}-y_k||=1\) for all *k*. Then \(S(y_k)\), the sum of the coordinates of \(y_k\), is equal to *k*, and thus \(k/n\le ||y_k||\le k\). Furthermore, for each *k* there is a \(t_k\) such that \(||y_k-y(t_k)||\le n\), and thus

### Lemma 4.3

If \(\alpha =0\), \(\beta >0\) and \(e(G)>0\), then the CTMC \(\xi (t)\) is transient.

### Proof

for all *k*.

It follows again that the subnetwork \(\varGamma ':=\{y_k\}\) has finite effective resistance, and thus the Markov chain is transient. \(\square \)

Alternatively, several other choices of paths \(\{y_k\}\) could have been used in the proof of Lemma 4.2, for example \(\{(k,1,0,\dots ,0):k\ge 0\}\).

### Lemma 4.4

If \(\alpha =0\), \(\beta =0\) and \(n\ge 3\), then the CTMC \(\xi (t)\) is transient.

### Proof

As said in Remark 1.1 and Example 3.1, in this case, the Markov chain is just simple random walk on \({\mathbb {Z}}_+^n\), which is transient for \(n\ge 3\). \(\square \)

### Lemma 4.5

If \(\alpha =0\), \(\beta >0\), \(e(G)=0\) and \(n\ge 3\), then the CTMC \(\xi (t)\) is transient.

### Proof

When \(e(G)=0\), the parameter \(\beta \) is irrelevant and may be changed to 0. The result thus follows from Lemma 4.4. \(\square \)

### Lemma 4.6

If \(\alpha =0\), \(\beta \ge -\infty \) and \(\kappa \ge 3\), then the CTMC \(\xi (t)\) is transient.

### Proof

Since \(\kappa \ge 3\), there exist three vertices of *G* not adjacent to each other; w.l.o.g. let them be 1, 2 and 3. Consider the subnetwork

We turn to proving recurrence in the remaining cases.

### Lemma 4.7

If \(\alpha <0\), \(\alpha +\beta \lambda _1<0\) and \(\beta \ge 0\), then the CTMC \(\xi (t)\) is recurrent.

### Proof

for some constants *c* and *C*.

### Lemma 4.8

If \(\alpha <0\), \(\alpha +\beta \lambda _1<0\) and \(-\infty \le \beta \le 0\), then the CTMC \(\xi (t)\) is recurrent.

### Proof

We use monotonicity. If we replace \(\beta \) by 0, then Lemma 4.7 applies; consequently, \(R_\infty (\varGamma _{\alpha , 0, G})=\infty \). On the other hand, if \(W_0(\xi )\) is defined by (3.1) with \(\beta \) replaced by 0, then \(W(\xi )\le W_0(\xi )\) (since \(\beta \le 0\)), and thus by (3.5), each edge in \(\varGamma _{\alpha , \beta , G}\) has at most the same conductance as in \(\varGamma _{\alpha , 0, G}\). Equivalently, each resistance is at least as large in \(\varGamma _{\alpha , \beta , G}\) as in \(\varGamma _{\alpha , 0, G}\), and thus by Rayleigh’s monotonicity law, \(R_\infty (\varGamma _{\alpha , \beta , G})\ge R_\infty (\varGamma _{\alpha , 0, G})=\infty \). Hence, the Markov chain is recurrent. \(\square \)

### Lemma 4.9

If \(\alpha =0\), \(\beta =0\) and \(n\le 2\), then the CTMC \(\xi (t)\) is recurrent.

### Lemma 4.10

If \(\alpha =0\), \(-\infty \le \beta <0\) and \(\kappa \le 2\), then the CTMC \(\xi (t)\) is recurrent.

### Proof

We assume that \(n\ge 3\); the case \(n\le 2\) follows by a simpler version of the same argument (taking \(u=0\) below), or by Lemma 4.9 and Rayleigh’s monotonicity law as in the proof of Lemma 4.8.

The assumption \(\kappa \le 2\) implies that amongst any *three* vertices of the graph there are at least two which are connected by an edge.

*L* possibilities for \(x_{(2)}\), and then \(x_{(1)}=L-\sum _{i\ge 2}^n x_{(i)}\) is determined, and there are at most *n*! different orderings of the \(x_i\) for each \(x_{(1)},\dots ,x_{(n)}\).

*in parallel*, so we sum their conductances to get an effective conductance between \(V_{L-1}\) and \(V_L\), which is thus bounded above by

### Lemma 4.11

If \(\alpha =0\), \(\beta >0\), \(e(G)=0\) and \(n\le 2\), then the CTMC \(\xi (t)\) is recurrent.

### Proof

Since \(e(G)=0\), we may replace \(\beta \) by 0; the result then follows from Lemma 4.9. \(\square \)

### Remark 4.1

In general, a CTMC may have an invariant distribution and be explosive (and thus transient), see e.g. [10, Sect. 3.5]; we will see that this does not happen in our case. In other words, our CTMC is positive recurrent exactly when \(Z_{{\alpha , \beta , G}}<\infty \). See also Sect. 5.

### Lemma 4.12

Let \(-\infty<\alpha <\infty \) and \(-\infty \le \beta <\infty \). Then \(Z_{{\alpha , \beta , G}}<\infty \) if and only if \(\alpha <0\) and \(\alpha +\beta \lambda _1<0\).

### Proof

We consider four different cases.

*Case 1:* \(\alpha \ge 0\). By (4.1), \(e^{W(k\mathbf{e}_1)}=e^{\frac{\alpha }{2}k(k-1)}\ge 1\), and thus \(Z_{{\alpha , \beta , G}}\ge \sum _{k=1}^\infty e^{W(k\mathbf{e}_1)}=\infty .\)

*Case 2:* \(\alpha <0\) and \(\alpha +\beta \lambda _1\ge 0\). Let \(y_k\) be as in Lemma 4.2. Then (4.5) applies and implies in particular \(W(y_k)\ge -C\) for some constant *C*. Hence,

*Case 3:* \(\alpha <0\), \(\alpha +\beta \lambda _1<0\) and \(\beta \ge 0\). The estimate (4.11) applies for every \(\xi \in V_L\), and since the number of vertices in \(V_L\) is \(O(L^{n-1})\) for \(L\ge 1\), we have

*Case 4:* \(\alpha <0\), \(\alpha +\beta \lambda _1<0\) and \(-\infty \le \beta \le 0\). We use monotonicity as in the proof of Lemma 4.8. Let again \(W_0(\xi )\) be given by (3.1) with \(\beta \) replaced by 0. Then, since \(\beta \le 0\), \(W(\xi )\le W_0(\xi )\) and thus \(Z_{{\alpha , \beta , G}}\le Z_{{\alpha , 0, G}}\). Furthermore, \(Z_{{\alpha , 0, G}}<\infty \) by Case 3. Hence, \(Z_{{\alpha , \beta , G}}<\infty \). \(\square \)
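The dichotomy of Lemma 4.12 can be observed numerically by truncating the sum defining \(Z_{\alpha ,\beta ,G}\). The explicit potential *W* of (3.1) is not reproduced in this excerpt, so the sketch below (illustrative Python) *assumes* the form \(W(\xi )=\frac{\alpha }{2}\sum _i\xi _i(\xi _i-1)+\beta \sum _{i\sim j}\xi _i\xi _j\), which is consistent with \(e^{W(k\mathbf{e}_1)}=e^{\frac{\alpha }{2}k(k-1)}\) used in Case 1.

```python
import numpy as np
from itertools import product

def W(xi, alpha, beta, adj):
    """Assumed potential (see lead-in): quadratic self-interaction plus a
    beta-weighted sum over edges; x'Ax double-counts edges, hence the 1/2."""
    x = np.asarray(xi, dtype=float)
    A = np.asarray(adj, dtype=float)
    return 0.5 * alpha * float(np.sum(x * (x - 1.0))) + 0.5 * beta * float(x @ A @ x)

def Z_partial(alpha, beta, adj, K):
    """Partial sum of Z_{alpha, beta, G} over the box {0, ..., K}^n."""
    n = len(adj)
    return float(sum(np.exp(W(xi, alpha, beta, adj))
                     for xi in product(range(K + 1), repeat=n)))
```

On a single edge (\(\lambda _1=1\)) with \(\alpha =-1\), the partial sums stabilise for \(\beta =1/2\) (subcritical) but keep growing for \(\beta =1\) (critical), in line with the lemma.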

### Lemma 4.13

- (i)
If \(\alpha <0\) and \(\alpha +\beta \lambda _1<0\), then the CTMC \(\xi (t)\) is positive recurrent.

- (ii)
If \(\alpha =0\), \(-\infty \le \beta <0\) and \(\kappa \le 2\), then the CTMC \(\xi (t)\) is null recurrent.

- (iii)
If \(\alpha =0\), \(\beta =0\) and \(n\le 2\), then the CTMC \(\xi (t)\) is null recurrent.

- (iv)
If \(\alpha =0\), \(\beta >0\), \(e(G)=0\) and \(n\le 2\), then the CTMC \(\xi (t)\) is null recurrent.

### Proof

In all four cases, the Markov chain is recurrent, by Lemmas 4.7, 4.8, 4.9, 4.10, 4.11. Hence the chain is non-explosive, and the invariant measure is unique up to a constant factor; furthermore, the chain is positive recurrent if and only if this measure has finite total mass so that there exists an invariant distribution. In other words, in these recurrent cases, the chain is positive recurrent if and only if \(Z_{{\alpha , \beta , G}}<\infty \). By Lemma 4.12, this holds in case (i), but not in (ii)–(iv). \(\square \)

## 5 The Corresponding Discrete Time Markov Chain

In this section we consider the discrete time Markov chain (DTMC) \(\zeta (t)\in {\mathbb {Z}}_{+}^n\) that corresponds to the CTMC \(\xi (t)\), i.e. the corresponding embedded DTMC. Note that we use *t* to denote both the continuous and the discrete time, although the two chains are related by a random change of time.

### Theorem 5.1

The conclusions in Theorem 2.1 hold also for the DTMC \(\zeta (t)\).

### Lemma 5.1

Let \(-\infty<\alpha <\infty \) and \(-\infty \le \beta <\infty \). Then \({{\widehat{Z}}}_{{\alpha , \beta , G}}<\infty \) if and only if \(\alpha <0\) and \(\alpha +\beta \lambda _1<0\).

### Proof of Theorem 5.1

As said above, \(\zeta (t)\) is transient precisely when \(\xi (t)\) is. A DTMC is positive recurrent if and only if it has an invariant distribution, and then every invariant measure is a multiple of the stationary distribution. Hence, \(\zeta (t)\) is positive recurrent if and only if the invariant measure \({{\widehat{\mu }}}(\xi )\) has finite mass, i.e., if \({{\widehat{Z}}}_{{\alpha , \beta , G}}<\infty \). Lemma 5.1 shows that this holds precisely in case (i) of Theorem 2.1, i.e., when \(\xi (t)\) is positive recurrent. \(\square \)

### Remark 5.1

We can use the DTMC \(\zeta (t)\) to give an alternative proof of Lemma 4.13(i) without Lemmas 4.7–4.8. Assume \(\alpha <0\) and \(\alpha +\beta \lambda _1<0\). Then, by Lemma 5.1, \({{\widehat{Z}}}_{{\alpha , \beta , G}}<\infty \). Hence, the DTMC \(\zeta (t)\) has a stationary distribution and is thus positive recurrent. (Recall that this implication holds in general for a DTMC, but not for a CTMC, see Remark 4.1.) Hence \(\xi (t)\) is recurrent, and thus non-explosive. Furthermore, Lemma 4.12 shows that also \(Z_{{\alpha , \beta , G}}<\infty \), and thus also \(\xi (t)\) has a stationary distribution. Since \(\xi (t)\) is non-explosive, this implies that \(\xi (t)\) is positive recurrent.

## 6 Explosions

It was shown in [13] that in most of the transient cases in Theorem 2.1, the CTMC \(\xi (t)\) is explosive. (Recall that a recurrent CTMC is non-explosive.) We complement this by exhibiting in Lemma 6.1 one non-trivial transient case where \(\xi (t)\) is non-explosive.

Recall also the standard fact that if, as above, \(q_\xi :=\sum _\eta q_{\xi ,\eta }\) is the total rate of leaving \(\xi \), and \(\zeta (t)\) is the DTMC in Sect. 5, then \(\xi (t)\) is explosive if and only if \(\sum _{t=1}^\infty q_{\zeta (t)}^{-1}<\infty \) with positive probability. In particular, \(\xi (t)\) is non-explosive when the rates \(q_{\xi }\) are bounded.
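This criterion is easy to illustrate on a pure birth chain (an example of ours, not the model itself): with birth rates \(q_k=e^{\alpha k}\), the expected total passage time \(\sum _k q_k^{-1}\) is finite precisely when \(\alpha >0\), which is exactly when the chain explodes. A hedged Python sketch:

```python
import math

def expected_passage_time(alpha, kmax):
    """Partial sum of the expected holding times 1/q_k for a pure birth chain
    with rates q_k = exp(alpha * k) -- an illustrative chain only."""
    return math.fsum(math.exp(-alpha * k) for k in range(kmax + 1))
```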

### Theorem 6.1

- (i)\(\xi (t)\) is non-explosive in the following cases:
- (a)
\(\alpha <0\) and \(\alpha +\beta \lambda _1(G)\le 0\),

- (b)
\(\alpha =0\) and \(\beta \le 0\),

- (c)
\(\alpha =0\), \(\beta >0\) and \(e(G)=0\).

- (a)
- (ii)\(\xi (t)\) explodes a.s. in the following cases:
- (a)
\(\alpha >0\),

- (b)
\(\alpha =0\), \(\beta >0\) and \(e(G)>0\),

- (c)
\(\alpha <0\) and \(\alpha +\beta \min _i \nu _i>0\).

- (a)

### Remark 6.1

Theorem 6.1 gives a complete characterization of explosions when the graph *G* is regular, i.e., \(\nu _i\) is constant, since then \(\min _i \nu _i=\lambda _1\), see (6.1).

For a general graph *G*, however, Theorem 6.1 leaves one case open, viz. \(\alpha <0\) with \(\alpha +\beta \min _i \nu _i\le 0<\alpha +\beta \lambda _1(G)\).

### Lemma 6.1

If \(\alpha <0\) and \(\alpha +\beta \lambda _1(G)=0\), then the CTMC \(\xi (t)\) is transient and non-explosive.

We prove first an elementary lemma.

### Lemma 6.2

### Proof

Note that \(\phi (u)>0\) and \(\psi (u)\ge 0\) for all \(u\in {{\mathbb {R}}}\), and that \(\psi (u)/\phi (u)\rightarrow +\infty \) as \(u\rightarrow \pm \infty \).

Hence, for any \(B>0\), \(\psi (u)\ge B\phi (u)\) when |*u*| is large enough, and thus there exists a constant \(C=C(B)\ge 0\) such that \(\psi (u)-B\phi (u) \ge -C\) for all \(u\in {{\mathbb {R}}}\). Consequently, for any \(\mathbf{u}\in {{\mathbb {R}}}^n\),

Since *B* is arbitrary, this completes the proof. \(\square \)

### Proof of Lemma 6.1

Let \(\mathbf{v}_1,\dots ,\mathbf{v}_n\) be an orthonormal basis of eigenvectors of *A* with \(A\mathbf{v}_k=\lambda _k\mathbf{v}_k\). The assumptions imply \(\beta >0\) and thus, for any \(k\le n\), \(\alpha +\beta \lambda _k\le \alpha +\beta \lambda _1=0\). Hence, for any vector \(\mathbf{x}=\sum _{k=1}^nc_k\mathbf{v}_k\),

Its *i*-th component is, by (1.2) and (3.3),

We have shown that a.s. \(|Q_{\zeta }(t)|=|Q(\zeta (t))|\) does not converge to \(\infty \). In other words, a.s. there exists a (random) constant *M* such that \(|Q(\zeta (t))|\le M\) infinitely often. By (6.18) and (6.13), there exists for each \(M<\infty \) a constant \({}C_{3}(M)<\infty \) such that \(|Q(\xi )|\le M\) implies \(q_\xi \le C_{3}(M)\). Consequently, a.s., \(q_{\zeta (t)}\le C_{3}(M)\) infinitely often, and thus \(\sum _{t=0}^\infty q_{\zeta (t)}^{-1}=\infty \), which implies that \(\xi (t)\) does not explode. \(\square \)

### Remark 6.2

Note that if \(\alpha <0\), \(\beta >0\) and \(\alpha +\lambda _1\beta <0\), then the function *Q* defined in (6.6) is negative definite, so that \({{\tilde{Q}}}(\mathbf{x}):=-Q(\mathbf{x})\rightarrow \infty \) as \(||\mathbf{x}||\rightarrow \infty \). Therefore, it follows from equation (6.14) that the CTMC \(\xi (t)\) is positive recurrent by Foster’s criterion for positive recurrence (see, e.g., [9, Theorem 2.6.4]). In other words, the function \({{\tilde{Q}}}\) can be used as the Lyapunov function in Foster’s criterion for showing positive recurrence of the DTMC \(\zeta (t)\), which in this case implies positive recurrence of the CTMC \(\xi (t)\) by (5.4). In fact, the function \({{\tilde{Q}}}\) was used in Foster’s criterion to show positive recurrence of the DTMC \(\zeta (t)\) in the special case \(\alpha <0\), \(\alpha +\beta \max _i\nu _i<0\) in [13, Sect. 4.1.1].

### Proof of Theorem 6.1

The non-explosive case (i)(a) follows from Theorem 2.1(i) when \(\alpha +\beta \lambda _1(G)<0\) (then the chain is positive recurrent), and from Lemma 6.2 when \(\alpha +\beta \lambda _1(G)=0\). The other non-explosive cases (i)(b) and (i)(c) are trivial because in these cases (1.2) implies \(q_{\xi ,\eta }\le 1\), and thus \(q_\xi \le 2n\) is bounded.

For explosion, we may assume that *G* is connected, since we otherwise may consider the components of *G* separately, see Remark 1.2. Then, [13, Theorem 1(3) and its proof] show that if \(\alpha +\beta \min _i \nu _i>0\) and \(\beta \ge 0\), then \(\xi (t)\) explodes a.s.; this includes the cases (ii)(b) and (ii)(c) above, and the case \(\alpha >0\), \(\beta \ge 0\). Furthermore, [13, Theorem 2] shows that if \(\alpha >0\) and \(\beta \le 0\), then \(\xi (t)\) a.s. explodes; together with the result just mentioned, this shows explosion when \(\alpha >0\). \(\square \)

### Remark 6.3

It is shown in [13] that explosion may occur in several different ways, depending on both the parameters \(\alpha ,\beta \) and the graph *G*. For example, if *G* is a star, then there are (at least) three possibilities, each occurring with probability 1 when \((\alpha ,\beta )\) lies in a certain region: a single component \(\xi _i\) explodes (tends to infinity in finite time); two adjacent components explode simultaneously; or all components explode simultaneously.

Furthermore, the results in [13] show that in the explosive cases in Theorem 6.1, the Markov chain asymptotically evolves as a pure birth process, in the sense that, with probability one, there is a random finite time after which none of the components decreases, i.e. there are no “death” events after this time. Consequently, the corresponding discrete time Markov chain can be regarded as a growth process on a graph similar to interacting urn models (e.g., see models in [1, 11] and [12]). One of the main problems in such growth processes is the same as in the urn models. Namely, it is of interest to understand how exactly the process escapes to infinity, i.e. whether all components grow indefinitely, or the growth localises in a particular subset of the underlying graph.

We do not discuss this sort of problem here and hope to address it elsewhere.

## 7 A Modified Model

In this section, we study the CTMC \({{\widetilde{\xi }}}(t)\) with the rates \({{\widetilde{q}}}_{\xi ,\eta }\) in (1.3), and the corresponding DTMC \({{\widetilde{\zeta }}}(t)\). This model is interesting since we have “decoupled” \(\alpha \) and \(\beta \), with birth rates depending on \(\alpha \) and death rates depending on \(\beta \).

Since \({{\widetilde{q}}}_{\xi ,\xi \pm \mathbf{e}_i}\) differ from \(q_{\xi ,\xi \pm \mathbf{e}_i}\) by the same factor \(e^{-\beta \sum _{j:j\sim i}\xi _j}\), which furthermore does not depend on \(\xi _i\), the balance equation (3.4) holds for \({{\widetilde{q}}}_{\xi ,\eta }\) too, and thus \({{\widetilde{\xi }}}(t)\) has the same invariant measure \(\mu (\xi )=e^{W(\xi )}\) as \(\xi (t)\).

### Remark 7.1

If \(\beta >0\), then \({{\widetilde{C}}}_{\xi ,\eta }\le C_{\xi ,\eta }\), and if \(\beta <0\), then \({{\widetilde{C}}}_{\xi ,\eta }\ge C_{\xi ,\eta }\). (If \(\beta =0\), the two models are obviously identical.)

### Theorem 7.1

The results in Theorem 2.1 hold for \({{\widetilde{\xi }}}(t)\) too, with a single exception: If \(e(G)=1\), \(\alpha <0\) and \(\alpha +\beta \lambda _1(G)=0\), then \({{\widetilde{\xi }}}(t)\) is null recurrent while \(\xi (t)\) is transient.

Here, as above, \(\lambda _1(G)\) is the largest eigenvalue of *G*. If \(e(G)=1\), then \(\lambda _1(G)=1\); thus the exceptional case is \(e(G)=1\), \(\alpha =-\beta <0\).
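As a quick numerical sanity check of this last observation (our illustration only, not part of the paper's argument), the adjacency matrix of a single edge has eigenvalues \(\pm 1\), so its largest eigenvalue is indeed 1:

```python
import numpy as np

# Adjacency matrix of K2, the graph with one edge: its eigenvalues
# are +1 and -1, so lambda_1(G) = 1 whenever e(G) = 1 and G = K2.
A = np.array([[0, 1],
              [1, 0]])
lambda_1 = max(np.linalg.eigvalsh(A))  # eigvalsh: eigenvalues of a symmetric matrix
# lambda_1 is (numerically) 1
```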

### Proof

The lemmas in Sect. 4 all hold for \({{\widetilde{\xi }}}(t)\) too by the same proofs with no or minor modifications, except Lemma 4.2 in the case \(\alpha <0\), \(\alpha +\beta \lambda _1=0\); we omit the details. This exceptional case is treated in Lemmas 7.2 and 7.3 below. \(\square \)

A few cases alternatively follow by Remark 7.1 and the Rayleigh monotonicity law.

Before treating the exceptional case, we give a simple combinatorial lemma.

### Lemma 7.1

Suppose that *G* is a connected graph with \(e(G)\ge 2\), and let, as above, \(\mathbf{v}_1=(v_{11},\dots ,v_{1n})\) be a positive eigenvector of *A* with eigenvalue \(\lambda _1\). Then, for each *i*,

### Proof

*i*,

*i*. Consequently, if \(j\ne i\), then

*G* has at least 3 vertices, and thus (7.4) implies \(\sum _{j\ne i} v_{1j} \ge 2 v_{1i}>v_{1i}\), so (7.2) holds in this case too. \(\square \)

### Lemma 7.2

If \(\alpha <0\), \(\alpha +\beta \lambda _1\ge 0\) and \(e(G)\ge 2\), then the CTMC \({{\widetilde{\xi }}}(t)\) is transient.

### Proof

If *G* is connected, then \(\mathbf{v}_1\) satisfies (7.2) by Lemma 7.1. On the other hand, if *G* is disconnected and has a component with at least two edges, it suffices to consider that component.

In the remaining case, *G* consists only of isolated edges and vertices. There are at least two edges, which we w.l.o.g. may assume are 12 and 34. Then \(\lambda _1=1\) and \(\mathbf{v}_1:=\frac{1}{2}(\mathbf{e}_1+\mathbf{e}_2+\mathbf{e}_3+\mathbf{e}_4)=\frac{1}{2}(1,1,1,1,0,\dots )\) is an eigenvector satisfying (7.2).
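This eigenvector claim is easy to verify numerically; the following sketch (ours, not from the paper) checks it for a graph consisting of the two edges 12 and 34 together with one isolated vertex:

```python
import numpy as np

# Graph with edges {1,2} and {3,4}, plus one isolated vertex 5
# (0-indexed below). Isolated edges each contribute eigenvalue 1.
A = np.zeros((5, 5))
A[0, 1] = A[1, 0] = 1
A[2, 3] = A[3, 2] = 1

# v1 = (e1 + e2 + e3 + e4)/2, as in the text.
v1 = 0.5 * np.array([1, 1, 1, 1, 0])
Av = A @ v1  # equals v1, i.e. v1 is an eigenvector with eigenvalue 1
```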

*i*, again writing \(a':=-\alpha /2>0\), and using (4.3),

*i*,

### Lemma 7.3

If \(\alpha <0\), \(\alpha +\beta \lambda _1\ge 0\) and \(e(G)=1\), then the CTMC \({{\widetilde{\xi }}}(t)\) is null recurrent.

### Proof

The invariant measure \(e^{W(\xi )}\) is the same as for \(\xi (t)\) and has total mass \(Z_{{\alpha , \beta , G}}=\infty \) by Lemma 4.12; hence \({{\widetilde{\xi }}}(t)\) is not positive recurrent.

This completes the proof when *G* is connected. If *G* is disconnected, then *G* consists of one edge and one or several isolated vertices. By Remark 1.2, \({{\widetilde{\xi }}}(t)\) then consists of \(n-1\) independent parts: one part is the CTMC in \({\mathbb {Z}}_+^2\) defined by the graph \({\mathsf {K}}_2\), which is null recurrent by the first part of the proof; the other parts are independent copies of the CTMC in \({\mathbb {Z}}_+\) defined by a single vertex, and these are positive recurrent since \(\alpha <0\). It is now easy to see that the combined \({{\widetilde{\xi }}}(t)\) is null recurrent. \(\square \)

### Lemma 7.4

Let \(-\infty<\alpha <\infty \) and \(-\infty \le \beta <\infty \). Then \({{\widetilde{Z}}}_{{\alpha , \beta , G}}<\infty \) if and only if \(\alpha <0\) and \(\alpha +\beta \lambda _1<0\).

### Proof

By the proof of Lemma 4.12 with minor modifications. In particular, in the case \(\alpha <0\) and \(\alpha +\beta \lambda _1=0\), we argue also as in (7.5)–(7.7) in the proof of Lemma 7.2 (but now allowing \(\delta =0\)). We omit the details. \(\square \)

### Theorem 7.2

Theorem 7.1 holds for the DTMC \({{\widetilde{\zeta }}}(t)\) too.

### Proof

By Theorem 7.1 for recurrence vs transience, and by Lemma 7.4 for positive recurrence vs null recurrence. \(\square \)

We are not going to analyse the modified model any further.

## 8 Alternative Proofs Using Lyapunov Functions

In this section we give alternative proofs of some parts of Theorem 2.1. These proofs do not use reversibility, and therefore have potential extensions to cases where electric networks are not applicable. They are based on the following recurrence criterion for countable Markov chains using Lyapunov functions; see, e.g., [9, Theorem 7.2.1].

### Recurrence criterion 8.1

A CTMC with values in \({\mathbb {Z}}_{+}^n\) is recurrent if and only if there exists a positive function *f* (the Lyapunov function) on \({\mathbb {Z}}_{+}^n\) such that \(f(\xi )\rightarrow \infty \) as \(\xi \rightarrow \infty \) and \(\mathsf{L}f(\xi )\le 0\) for all \(\xi \notin D\), where \(\mathsf{L}\) is the Markov chain generator, and *D* is a finite set.

Note that the Lyapunov function \(f(\xi )\) is far from unique. The idea of the method is to find some explicit function *f* for which the conditions can be verified. There is also a related criterion for transience [4, Theorem 2.2.2], but we will not use it here.

We give only some examples. (See also [13] for further examples.) It might be possible to give a complete proof of Theorem 2.1 using these methods, but this seems rather challenging. Note that, since our Markov chains have bounded steps, the Lyapunov function *f* can be changed arbitrarily on a finite set; hence it suffices to define \(f(\xi )\) (and verify its properties) for \(||\xi ||\) large. We do so, usually without comment, in the examples below.

### Example 8.1

*Proof of the hard-core case of Theorem* 2.1(ii)(a) *by the recurrence criterion* 8.1. Assume that \(\alpha =0\), \(\beta =-\infty \) and \(k_{\max }(G)\le 2\). As said in Sect. 3.3, we may assume that the Markov chain lives on \(\varGamma _0\) defined in (3.8); since \(\kappa \le 2\), this implies that no more than two components of the process can be non-zero. Therefore, the Markov chain evolves as a simple random walk on a certain finite union of quadrants of \({\mathbb {Z}}_+^2\) and half-lines \({\mathbb {Z}}_+\) glued along the axes. Each of these random walks is null recurrent, and, hence, the whole process should be null recurrent as well. We provide a rigorous justification of this heuristic argument by using the recurrence criterion 8.1.

*x* and

*y* can increase as well as decrease. A direct computation gives that

*some* of the other components may also increase by 1, and assume there are \(m\ge 0\) such components. A direct computation gives that

*G*, using modifications of the Lyapunov function (8.1) used in the hard-core case.

Recurrence in the case \(\alpha =0\), \(b:=-\beta >0\) and \(G={\mathsf {K}}_2\), the graph with just 2 vertices and a single edge, was shown in [13] by applying the recurrence criterion 8.1 with the Lyapunov function \(f(\xi )=\log (\xi _1+\xi _2+1)\). Alternatively, one could use e.g. \(f(\xi )=\log (\xi _1+\xi _2)\) or \(\log (\xi _1^2+\xi _2^2)\). We extend this to the case \(G={\mathsf {K}}_n\), the complete graph with *n* vertices, for any \(n\ge 2\).
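As a small symbolic check of why such logarithmic functions are natural candidates (our illustration only; the actual drift computation also involves the jump rates (1.2)), the second difference of \(u\mapsto \log (u+1)\) along unit steps is negative and of order \(u^{-2}\):

```python
import sympy as sp

s = sp.symbols('s', positive=True)  # s plays the role of xi_1 + xi_2

f = sp.log(s + 1)
# Second difference along a unit step in one coordinate:
# f(s+1) + f(s-1) - 2 f(s) = log(1 - 1/(s+1)^2) < 0
d2 = f.subs(s, s + 1) + f.subs(s, s - 1) - 2 * f
# Leading asymptotics: d2 ~ -1/s^2 as s -> oo
leading = sp.limit(d2 * s**2, s, sp.oo)  # -1
```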

### Example 8.2

*Recurrence in the case* \(\alpha =0\), \(\beta =-b<0\) *and* \(G={\mathsf {K}}_n\). We use the function \(f(\xi ):=\log \Vert \xi \Vert \). (Similar arguments work for variations such as \(\log \bigl (||\xi ||^2\pm 1\bigr )\) and \(\log (\xi _1+\dots +\xi _n)\).)

*f* are

*i*, and thus (8.4) implies, since \(r\le \sum _i\xi _i\),

*r*, as required.

*x* is large.

Hence, in both cases, \({\mathsf {L}}f(\xi )\le 0\) when \(||\xi ||\) is large, and recurrence follows by the recurrence criterion 8.1.

The argument in Example 8.2 used the fact that *G* is a complete graph so that \((A\xi )_i\ge 1\) unless only \(\xi _i\) is non-zero. Similar arguments work for some other graphs.

### Example 8.3

Let again \(\alpha =0\), \(\beta <0\) and let \(G={\mathsf {K}}_{1,2}\), a star with 2 non-central vertices, which is the same as a path of 3 vertices. Number the vertices with the central vertex as 3, and write \(\xi =(x,y,z)\). Taylor expansions similar to the one in (8.4), but going further, show that \(f(\xi ):=\log ||\xi ||\) is not a Lyapunov function. (The problematic case is \(\xi =(x,x,0)\), with \({\mathsf {L}}f(\xi )=\frac{1}{2}r^{-4}+O(r^{-6})\).) However, similar calculations also show that \(f(\xi ):=\log (||\xi ||^2-1)\) is a Lyapunov function, showing recurrence by the recurrence criterion 8.1. We omit the details.

## Notes

### Acknowledgements

We thank James Norris for helpful comments.

## References

- 1. Costa, M., Menshikov, M., Shcherbakov, V., Vachkovskaia, M.: Localisation in a growth model with interaction. J. Stat. Phys. **171**(6), 1150–1175 (2018)
- 2. Doyle, P.G., Snell, J.L.: Random Walks and Electric Networks. Mathematical Association of America, Washington, DC (1984)
- 3. Erdős, P., Taylor, S.J.: Some problems concerning the structure of random walk paths. Acta Math. Acad. Sci. Hungar. **11**, 137–162 (1960)
- 4. Fayolle, G., Malyshev, V., Menshikov, M.: Topics in the Constructive Theory of Countable Markov Chains. Cambridge University Press, Cambridge (1995)
- 5. Karlin, S., Taylor, H.: A First Course in Stochastic Processes, 2nd edn. Academic Press Inc., Cambridge (1975)
- 6. Kelly, F.: Reversibility and Stochastic Networks. Wiley Series in Probability and Mathematical Statistics. Wiley, New York (1979)
- 7. Liggett, T.: Continuous Time Markov Processes: An Introduction. Graduate Studies in Mathematics, American Mathematical Society, Providence (2010)
- 8. Lyons, R., Peres, Y.: Probability on Trees and Networks. Cambridge University Press, Cambridge (2016)
- 9. Menshikov, M.V., Popov, S., Wade, A.R.: Non-homogeneous Random Walks: Lyapunov Function Methods for Near-Critical Stochastic Systems. Cambridge University Press, Cambridge (2017)
- 10. Norris, J.: Markov Chains. Cambridge University Press, Cambridge (1997)
- 11. Shcherbakov, V., Volkov, S.: Stability of a growth process generated by monomer filling with nearest neighbour cooperative effects. Stoch. Process. Appl. **120**, 926–948 (2010)
- 12. Shcherbakov, V., Volkov, S.: Queueing with neighbours. In: Bingham, N.H., Goldie, C.M. (eds.) Probability and Mathematical Genetics. Papers in Honour of Sir John Kingman. LMS Lecture Notes Series **378**, 463–481 (2010)
- 13. Shcherbakov, V., Volkov, S.: Long term behaviour of locally interacting birth-and-death processes. J. Stat. Phys. **158**(1), 132–157 (2015)
- 14. Volkov, S.: Vertex-reinforced random walk on arbitrary graphs. Ann. Probab. **29**, 66–91 (2001)

## Copyright information

**OpenAccess**This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.