Let G be a finite graph. We consider a random walk on the vertices of G of the following type. Start at a vertex u. (The vertex u could be chosen randomly according to some probability distribution or could be specified in advance.) Among all the edges incident to u, choose one uniformly at random (i.e., if there are k edges incident to u, then each of these edges is chosen with probability 1∕k). Travel to the vertex v at the other end of the chosen edge and continue as before from v. Readers with some familiarity with probability theory will recognize this random walk as a special case of a finite-state Markov chain. Many interesting questions may be asked about such walks; the basic one is to determine the probability of being at a given vertex after a given number ℓ of steps.
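The question above can be answered exactly by propagating probabilities one step at a time. A minimal sketch, assuming a small example graph given as an adjacency list (the 4-cycle below is an illustrative choice, not from the text); `fractions.Fraction` keeps the arithmetic exact:

```python
from fractions import Fraction

# An assumed example graph: the 4-cycle, as an adjacency list.
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}

def step_distribution(graph, dist):
    """One step of the walk: from each vertex u, follow each of the
    deg(u) incident edges with probability 1/deg(u)."""
    new = {v: Fraction(0) for v in graph}
    for u, p in dist.items():
        deg = len(graph[u])
        for v in graph[u]:
            new[v] += p * Fraction(1, deg)
    return new

def walk_distribution(graph, start, steps):
    """Exact probability of being at each vertex after `steps` steps,
    starting from vertex `start`."""
    dist = {v: Fraction(1 if v == start else 0) for v in graph}
    for _ in range(steps):
        dist = step_distribution(graph, dist)
    return dist

# After 2 steps from vertex 0 on the 4-cycle, the walk is at
# vertex 0 or vertex 2, each with probability 1/2.
print(walk_distribution(graph, start=0, steps=2))
```

This is just repeated multiplication by the transition matrix of the Markov chain; for larger graphs one would typically use a matrix library instead of dictionaries.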