This post explores the idea of lifted Markov chains by examining **Example 1.1** from
Lifting Markov Chains to Speed up Mixing
by Chen, Lovász, and Pak. A GitHub repo with the source code and some additional colorful schematic
diagrams is here. The key idea is that on some
well-behaved graphs
(e.g. those with lots of symmetry) we can sometimes construct faster mixing random walks by lifting to a larger
graph with even nicer properties. Although this is unintuitive at first glance, since lifting increases the number of possible
states, the first example
in the paper nicely highlights the type of results we can expect.

Before considering the specific example, a few words about the general lifting procedure for walks on graphs. Given a graph \(G\), our current interest is in drawing samples from the stationary distribution associated with a specified walk on the graph. However, the chosen walk may be slow to converge (i.e. it takes many steps/samples to get to stationarity), and we might desire a more rapid procedure for sampling from the corresponding stationary distribution. The technique requires us to find a different graph, \(H\), along with a projection \(\pi\) from the nodes of the new graph to those of the old. We will then form a new chain by lifting from \(G\) to \(H\) using \(\pi^{-1}\), taking a step on a faster mixing chain \(M\) on \(H\), and then projecting back to \(G\) using \(\pi\):

\( \newcommand{\ra}[1]{\!\!\!\!\!\!\!\!\!\!\!\!\xrightarrow{\quad#1\quad}\!\!\!\!\!\!\!\!} \newcommand{\da}[1]{\left\downarrow{\scriptstyle#1}\vphantom{\displaystyle\int_0^1}\right.} \newcommand{\dra}[1]{\!\!\!\!\!\!\!\!\!\!\!\!\dashrightarrow{\quad#1\quad}\!\!\!\!\!\!\!\!} \newcommand{\ua}[1]{\left\uparrow{\scriptstyle#1}\vphantom{\displaystyle\int_0^1}\right.} \) \begin{array}{ccccc} H & & \ra{M} & &H\\ \ua{\pi^{-1}}& & & & \da{\pi}\\ G& & \ra{\hat{M}} & &G \end{array}

This procedure defines a Markov chain \(\hat{M}=\pi M\pi^{-1}\) on \(G\), and if we choose \(H\), \(\pi\), and \(M\) cleverly we can both guarantee that the stationary distribution of \(\hat{M}\) is equal to that of our desired walk on \(G\) and favorably bound the mixing time.
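The projected chain \(\hat{M}=\pi M\pi^{-1}\) can be written out concretely: lift uniformly over the fiber \(\pi^{-1}(i)\), step with \(M\), and project with \(\pi\). A minimal Python sketch (my own code, not from the paper's repo; the function name is invented) that builds \(\hat{M}\) from a transition matrix on \(H\):

```python
import numpy as np

def lift_project(M, pi):
    """Projected chain M_hat = pi M pi^{-1} on G.

    M  : transition matrix of the fast walk on H (rows sum to 1)
    pi : list where pi[p] is the node of G that node p of H maps to
    """
    n_H = len(pi)
    n_G = max(pi) + 1
    M_hat = np.zeros((n_G, n_G))
    for i in range(n_G):
        fiber = [p for p in range(n_H) if pi[p] == i]  # pi^{-1}(i)
        for p in fiber:
            for q in range(n_H):
                # lift uniformly to the fiber, step with M, project back
                M_hat[i, pi[q]] += M[p, q] / len(fiber)
    return M_hat
```

For instance, folding the 4-cycle onto the path on 3 nodes with `pi = [0, 1, 2, 1]` turns the simple walk on \(C_4\) into the simple walk on \(P_3\).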

For the specific example in the paper \(G\) will be the path graph on \(n\) nodes \(P_n\) and the corresponding \(H\) is the cycle graph on \(2n-2\) nodes \(C_{2n-2}\). The figures below show the graphs for \(n=6\): \(P_6\) and \(C_{10}\).

The simple random walk on the path, where at each step the walker moves to one of the neighboring nodes uniformly, has stationary distribution proportional to the vector \( (1,2,2,\ldots,2,1) \) and requires \(\Theta(n^2)\) steps to mix. On the cycle, the same walk has a uniform stationary distribution and the same mixing time. More generally, any walk on the cycle where the probability of moving to the left or right doesn't depend on the current state also has a uniform stationary distribution. For this example, we will consider the walk where we move to the right with probability \(\frac{1}{3}\) and to the left with probability \(\frac{2}{3}\), which mixes in \(\Theta(n)\) steps. Details of these computations can be found in Chapter 12 of this book, Chapter 5 of this book, or these notes.
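Both stationarity claims are easy to check numerically. The sketch below (my own, using \(n=6\) as in the figures) verifies that \((1,2,\ldots,2,1)\), normalized, is a left fixed vector of the path walk and that the uniform distribution is fixed by the biased cycle walk:

```python
import numpy as np

n = 6
m = 2 * n - 2  # number of nodes in the cycle C_{2n-2}

# Simple random walk on the path P_n: move to a uniform neighbor.
P = np.zeros((n, n))
P[0, 1] = P[n - 1, n - 2] = 1.0
for i in range(1, n - 1):
    P[i, i - 1] = P[i, i + 1] = 0.5

# Candidate stationary distribution, proportional to (1, 2, ..., 2, 1).
pi_path = np.array([1.0] + [2.0] * (n - 2) + [1.0])
pi_path /= pi_path.sum()
print(np.allclose(pi_path @ P, pi_path))  # prints True

# Biased walk on the cycle: right w.p. 1/3, left w.p. 2/3.
C = np.zeros((m, m))
for p in range(m):
    C[p, (p + 1) % m] = 1 / 3
    C[p, (p - 1) % m] = 2 / 3
uniform = np.full(m, 1 / m)
print(np.allclose(uniform @ C, uniform))  # prints True
```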

The lifting procedure described above requires us to choose a projection \(\pi: C_{2n-2}\rightarrow P_n\) so that the uniform distribution on the cycle is mapped to the stationary distribution of the simple random walk on the path. Summing the respective probabilities shows that any projection that maps exactly two nodes of the cycle to the interior nodes of the path and exactly one node each to the endpoints of \(P_n\) has this property. Mapping the nodes on the cycle to the node(s) of the corresponding color in the figures above gives one such projection. Note that there does not need to be any connection between the topology and the projection - all that matters is that the distribution is projected properly. The schematic below shows an example step of the lifted chain.
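One concrete projection with the required fiber sizes (my own guess at the coloring in the figures; the "fold" construction is just one valid choice) sends cycle node \(p\) to path node \(p\) for \(p < n\) and to \(2n-2-p\) otherwise:

```python
n = 6
m = 2 * n - 2  # nodes of the cycle C_{2n-2}

# "Fold" the cycle onto the path: the endpoints of P_n get one
# preimage each, the interior nodes get two.
pi = [p if p < n else m - p for p in range(m)]

# Fibers pi^{-1}(i), and the projection of the uniform measure on the
# cycle: node i of the path receives mass |pi^{-1}(i)| / (2n - 2).
fibers = {i: [p for p in range(m) if pi[p] == i] for i in range(n)}
projected = [len(fibers[i]) / m for i in range(n)]
print(projected)  # [0.1, 0.2, 0.2, 0.2, 0.2, 0.1]
```

The projected masses are proportional to \((1,2,2,2,2,1)\), exactly the stationary distribution of the simple walk on \(P_6\).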

At every step, the following procedure is carried out (given that we are at node \(i\) in the path at step \(t\), so \(X_t = i\) ):

- Select a node \(p\) in the cycle uniformly from \(\pi^{-1}(i)\)
- Draw a uniform number \(z\) on \([0,1]\)
- Move to a neighbor \(q\) of \(p\), where \(\sigma\) is the rightwards probability (\(\frac{1}{3}\) above): \[q =\begin{cases} p+1 \pmod{2n-2}& \textrm{if}\ z< \sigma \\ p-1 \pmod{2n-2} &\textrm{otherwise}\end{cases}\]
- Project back to the path \(X_{t+1}=\pi(q)\)
- Repeat!
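The loop above translates almost line-for-line into code. Here is a minimal sketch (function and variable names are my own, not from the repo), wired up with the fold projection for \(n=6\):

```python
import random

def lifted_step(i, pi_map, fibers, sigma, m):
    """One step of the lifted chain on the path.

    i      : current path node (X_t)
    pi_map : pi_map[p] = path node that cycle node p projects to
    fibers : fibers[i] = pi^{-1}(i), the cycle nodes mapping to i
    sigma  : rightwards probability on the cycle
    m      : number of cycle nodes, 2n - 2
    """
    p = random.choice(fibers[i])   # lift: uniform over pi^{-1}(i)
    z = random.random()            # uniform draw on [0, 1]
    q = (p + 1) % m if z < sigma else (p - 1) % m  # step on the cycle
    return pi_map[q]               # project: X_{t+1} = pi(q)

# Example setup for n = 6 with the fold projection.
n = 6
m = 2 * n - 2
pi_map = [p if p < n else m - p for p in range(m)]
fibers = {i: [p for p in range(m) if pi_map[p] == i] for i in range(n)}

x = 0
for _ in range(5):
    x = lifted_step(x, pi_map, fibers, 1 / 3, m)
```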

Click the "Start Walking!" button below to experiment with these Markov chains. You can choose a value for \(n\) and select a specific projection or a random permutation. The widget will display the path and color the associated nodes in the cycle to help you visualize the projection. Next, the steady state distribution and the transition matrices for the chosen projection are displayed. Notice that the correct stationary distribution on the path is returned regardless of the projection.

Finally, the software also compares the distributions obtained from three simulated walks of length \(k\). The first walk is the
simple random walk on \(P_n\), where at each step the walker moves to a neighboring vertex. The second is the lifted walk, using the projection
shown by the colorings of the graphs and the input rightwards probability \(\sigma\), while the final walk uses the same projection but keeps track of its position on the cycle.
By varying \(n\) and \(k\) you
can experiment to verify the conclusion of the example: the lifted walks mix more rapidly,
even with twice as many "nodes". **For faster display speeds, turn off "Display Matrices" for larger \(n\).**