
Random walks

April 18, 2014


1 Markov chains
Exercise 1.1. Describe the Poisson process as a Markov chain in continuous
time.
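As a warm-up for the exercise, one can simulate this chain directly: viewed as a continuous-time Markov chain on {0, 1, 2, . . .}, the Poisson process of rate λ jumps only from n to n + 1, after an Exponential(λ) holding time. A minimal sketch (the function name and parameters are illustrative, not from the text):

```python
import random

def simulate_poisson_chain(rate, t_end, seed=0):
    """Simulate the Poisson process as a continuous-time Markov chain on
    {0, 1, 2, ...}: from state n the only transition is to n + 1, taken
    after an Exponential(rate) holding time."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    jump_times = []
    while True:
        t += rng.expovariate(rate)  # exponential holding time in state n
        if t > t_end:
            break
        n += 1
        jump_times.append(t)
    return n, jump_times

# N(t_end) is Poisson with mean rate * t_end, so a run with rate 2 over
# time 1000 should produce a count near 2000.
count, times = simulate_poisson_chain(rate=2.0, t_end=1000.0)
print(count)
```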
2 Walks on graphs
A graph is a pair G = (V, E), where the edge set E ⊆ V × V is symmetric. The symmetry assumption is usually phrased by saying that the graph is undirected or that its edges are unoriented.
There is a well-known and easily established correspondence between electrical networks and random walks that holds for all graphs. Namely, given a finite connected graph G with conductances assigned to the edges, we consider the random walk that can go from a vertex only to an adjacent vertex and whose transition probabilities from a vertex are proportional to the conductances along the edges to be taken. That is, if x is a vertex with neighbors y_1, . . . , y_d, and the conductance of the edge (x, y_i) is c_i, then

p(x, y_j) = c_j / ∑_{i=1}^{d} c_i.
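The normalization above is a one-liner; a small sketch (illustrative names, not from the text):

```python
def transition_probs(conductances):
    """Given conductances c_1, ..., c_d on the edges from a vertex x to
    its neighbors y_1, ..., y_d, return the transition probabilities
    p(x, y_j) = c_j / (c_1 + ... + c_d)."""
    total = sum(conductances)
    return [c / total for c in conductances]

# A vertex with three neighbors and conductances 1, 2, 3:
probs = transition_probs([1.0, 2.0, 3.0])
print(probs)  # [1/6, 1/3, 1/2]
```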
In fact, we are interested only in reversible Markov chains, where we call a Markov chain reversible if there is a positive function x ↦ π(x) on the state space such that the transition probabilities satisfy π(x) p_{xy} = π(y) p_{yx} for all pairs of states x, y. (Such a function π(·) will then provide a stationary measure. Note that π(·) is not generally a probability measure.)
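The detailed-balance condition π(x) p_{xy} = π(y) p_{yx} is easy to check numerically on a finite chain. A sketch, using the random walk on the path 0–1–2 with unit conductances as the example (function and variable names are illustrative):

```python
def is_reversible(P, pi, tol=1e-12):
    """Check detailed balance: pi(x) * p_xy == pi(y) * p_yx for all x, y."""
    n = len(P)
    return all(abs(pi[x] * P[x][y] - pi[y] * P[y][x]) <= tol
               for x in range(n) for y in range(n))

# Random walk on the path 0 - 1 - 2 with unit conductances: it is
# reversible with pi(x) = degree of x.
P = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 1.0, 0.0]]
pi = [1.0, 2.0, 1.0]
print(is_reversible(P, pi))  # True
```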
In this case, make a graph G by taking the states of the Markov chain for vertices and joining two vertices x, y by an edge when p_{xy} > 0. Assign weight

c(x, y) := π(x) p_{xy}
to that edge; note that the condition of reversibility ensures that this weight
is the same no matter in what order we take the endpoints of the edge. With
this network in hand, the Markov chain may be described as a random walk
on G: when the walk is at a vertex x, it chooses randomly among the vertices
adjacent to x with transition probabilities proportional to the weights of the
edges. Conversely, every connected graph with weights on the edges such that the sum of the weights incident to every vertex is finite gives rise to a random walk with transition probabilities proportional to the weights. Such a random walk is an irreducible reversible Markov chain: define π(x) to be the sum of the weights incident to x.
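The converse direction can also be checked by hand on a small example: build the walk from a symmetric weight matrix and verify that π(x) = ∑_y c(x, y) is stationary, since (πP)(y) = ∑_x π(x) p_{xy} = ∑_x c(x, y) = π(y). A sketch with illustrative names:

```python
def walk_from_weights(c):
    """Turn a symmetric weight matrix c into the transition matrix of the
    associated random walk, together with pi(x) = sum_y c(x, y)."""
    n = len(c)
    pi = [sum(c[x]) for x in range(n)]
    P = [[c[x][y] / pi[x] for y in range(n)] for x in range(n)]
    return P, pi

# Triangle with weights 1, 2, 3 on its three edges.
c = [[0, 1, 2],
     [1, 0, 3],
     [2, 3, 0]]
P, pi = walk_from_weights(c)
# pi is stationary: (pi P)(y) = sum_x c(x, y) = pi(y).
piP = [sum(pi[x] * P[x][y] for x in range(3)) for y in range(3)]
print(pi, piP)  # both equal [3, 4, 5] up to rounding
```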
Exercise 2.1. Show that random walk on a connected weighted graph G is positive recurrent (i.e., has a stationary probability distribution) if and only if ∑_{x,y} c(x, y) < ∞, in which case the stationary probability distribution is proportional to π(·). Show that if the random walk is not positive recurrent, then π(·) is a stationary infinite measure.
3 Queues
Exercise 3.1. Describe a simple Markov queue as a random walk.
Recall that the queue receives a Poisson inflow of rate λ and the service time is exponentially distributed with service rate μ. The queue behaves differently in the cases λ > μ, λ = μ, and λ < μ.

Note that in all these cases the chain is called reversible, though for λ = μ and λ > μ there is no equilibrium probability distribution! Still there exists an equilibrium measure on the state space {0, 1, . . .}.
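The different behavior in the cases λ < μ and λ > μ is visible already in the embedded jump chain of the queue: from a state n > 0 the next event is an arrival with probability λ/(λ + μ) and a departure otherwise, while from 0 it is always an arrival. A short simulation sketch (names and parameters are illustrative):

```python
import random

def simulate_queue(lam, mu, steps, seed=0):
    """Jump chain of the simple Markov queue: from n > 0 the next event
    is an arrival with probability lam / (lam + mu) (queue length goes to
    n + 1) and a departure otherwise (n - 1); from 0 the next event is
    always an arrival."""
    rng = random.Random(seed)
    n = 0
    for _ in range(steps):
        if n == 0 or rng.random() < lam / (lam + mu):
            n += 1
        else:
            n -= 1
    return n

# With lam < mu the queue keeps returning to 0; with lam > mu it drifts
# off to infinity, so no equilibrium probability distribution can exist.
n_sub = simulate_queue(lam=1.0, mu=2.0, steps=10_000)
n_super = simulate_queue(lam=2.0, mu=1.0, steps=10_000)
print(n_sub, n_super)
```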
Exercise 3.2. Find this measure.