
## Knight’s tour revisited

Once again I have found food for thought in a post on John D Cook’s blog.

The puzzle posed:

Start a knight at a corner square of an otherwise-empty chessboard. Move the knight at random by choosing uniformly from the legal knight-moves at each step. What is the mean number of moves until the knight returns to the starting square?

As discussed there, this can be solved by considering a random walk on the Knight’s tour graph, which is a connected graph.

I found it instructive to count the edges of the Knight’s tour graph (though I am disappointed that it was not as ‘clear’ to me as suggested). Consider the $n\times n$ chessboard: the sum of the degrees of all the vertices is twice the number of edges.

Consider the following 6 $\times$ 6 board. The color distinguishes the degree of each vertex.

Using insights from this board, tabulating the number of vertices of each degree lets us derive the general formula:

Therefore, the number of edges for an $n\times n$ Knight’s tour graph is $4(n-1)(n-2)$. This can be generalized to an $m\times n$ chessboard to yield $4 m n -6(m+n)+8$. Thus, for the $8\times8$ board the number of edges is 168.
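
The edge count is easy to verify by brute force. A short Python sketch (the post itself used Mathematica; this is just an illustration) that enumerates the knight-move edges and compares them with the closed form:

```python
# Count edges of the knight's move graph on an m x n board by brute force,
# then compare with the closed form 4mn - 6(m+n) + 8 derived above.
KNIGHT_MOVES = [(1, 2), (2, 1), (-1, 2), (-2, 1),
                (1, -2), (2, -1), (-1, -2), (-2, -1)]

def knight_edges(m, n):
    """Number of edges of the knight's graph on an m x n board."""
    edges = set()
    for r in range(m):
        for c in range(n):
            for dr, dc in KNIGHT_MOVES:
                r2, c2 = r + dr, c + dc
                if 0 <= r2 < m and 0 <= c2 < n:
                    # frozenset: an undirected edge, counted once
                    edges.add(frozenset([(r, c), (r2, c2)]))
    return len(edges)

def closed_form(m, n):
    return 4 * m * n - 6 * (m + n) + 8

print(knight_edges(8, 8), closed_form(8, 8))  # 168 168
```

The brute-force count and the formula agree on all boards with $m,n\geq 3$.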

Now, to address the puzzle. The Knight’s tour graph is connected, and the random walk on this graph is reversible. A unique stationary vector, $\mathbf{w}$, can be found satisfying $\mathbf{w}\mathbf{P}=\mathbf{w}$, where $\mathbf{P}$ is the transition matrix:

$w_j=\frac{d_j}{\sum^n_{k=1} d_k}$, where $d_j$ is the degree of vertex $j$.

Note the denominator of $w_j$ is the sum of the degrees of the graph (or twice the number of edges).
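
This stationary vector can be checked numerically. A minimal numpy sketch (again Python rather than the post's Mathematica) that builds the $8\times 8$ transition matrix and confirms $\mathbf{w}\mathbf{P}=\mathbf{w}$:

```python
# Check that w_j = d_j / (2|E|) is stationary for the random walk
# on the 8x8 knight's graph, i.e. that wP = w.
import numpy as np

moves = [(1, 2), (2, 1), (-1, 2), (-2, 1),
         (1, -2), (2, -1), (-1, -2), (-2, -1)]
n = 8
squares = [(r, c) for r in range(n) for c in range(n)]
index = {sq: i for i, sq in enumerate(squares)}

A = np.zeros((n * n, n * n))            # adjacency matrix
for (r, c) in squares:
    for dr, dc in moves:
        if 0 <= r + dr < n and 0 <= c + dc < n:
            A[index[(r, c)], index[(r + dr, c + dc)]] = 1

d = A.sum(axis=1)                       # vertex degrees
P = A / d[:, None]                      # row-stochastic transition matrix
w = d / d.sum()                         # candidate stationary vector

print(np.allclose(w @ P, w))            # True
print(int(d.sum()))                     # 336 = 2|E|
```

The degree sum is $336 = 2\times 168$, matching the edge count above.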

Let $\mathbf{M}$ be the matrix of mean first-passage times, with the convention $m_{ii}=0$. Consider distinct states $i$ and $j$. Conditioning on the first step gives the recursion:

$m_{ij}=1+\sum_{k=1}^n p_{ik}m_{kj},\quad \text{for } i\neq j$

Similarly, for the mean recurrence time $r_i$,

$r_i=1+\sum_{k=1}^n p_{ik}m_{ki}$

Let $\mathbf{C}$ be the $n\times n$ matrix of all 1’s, and $\mathbf{R}$ the diagonal matrix with $r_j$ on the diagonal.

Combining these relations in matrix form:

$\mathbf{M}=\mathbf{P}\mathbf{M}+\mathbf{C}-\mathbf{R}$

Since $\mathbf{w}$ is the stationary distribution, $\mathbf{w}\mathbf{P}=\mathbf{w}$, so

$\begin{array}{ccc} (\mathbf{I}-\mathbf{P})\mathbf{M}&=&\mathbf{C}-\mathbf{R}\\\mathbf{w}(\mathbf{I}-\mathbf{P})\mathbf{M}&=&\mathbf{w}(\mathbf{C}-\mathbf{R})=\mathbf{0}\end{array}$

Hence,

$\mathbf{w}\mathbf{C}=\mathbf{w}\mathbf{R}$

Now,

$\mathbf{w}$ is a row vector such that $\sum_{j=1}^n w_j =1$, so $\mathbf{w}\mathbf{C} =\overbrace{(1 1 1\ldots 1)}^{n}$

$\mathbf{w}\mathbf{R}=(w_1 r_1\ldots w_jr_j\ldots w_nr_n)$

Therefore,

$r_j=\frac{\sum_{k=1}^n d_k}{d_j}=\frac{2|E|}{d_j}$

where $|E|$ is the cardinality of the edge set (the number of edges), as explained by John D Cook.
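
Plugging in the numbers for the puzzle: the corner square has degree 2, and $2|E| = 336$, so the expected return time is $336/2 = 168$ moves. A tiny Python check:

```python
# Apply r_j = 2|E| / d_j on the 8x8 board, computing the corner's
# degree directly and using |E| = 168 from earlier.
moves = [(1, 2), (2, 1), (-1, 2), (-2, 1),
         (1, -2), (2, -1), (-1, -2), (-2, -1)]

def degree(r, c, n=8):
    """Number of legal knight moves from square (r, c)."""
    return sum(0 <= r + dr < n and 0 <= c + dc < n for dr, dc in moves)

E = 168
print(2 * E / degree(0, 0))  # 168.0 -- mean recurrence time from a corner
```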

I found the above approach here; it was easier for me to understand.

A 1000-run simulation yielded a mean of 163 (cf. the predicted 168):
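
The simulation is straightforward. A minimal Python sketch (not the original Mathematica code) of the random walk from the corner:

```python
# Simulate the knight's random walk from the corner of an 8x8 board and
# average the return times over many runs; the theoretical mean is 168.
import random

moves = [(1, 2), (2, 1), (-1, 2), (-2, 1),
         (1, -2), (2, -1), (-1, -2), (-2, -1)]

def legal(sq, n=8):
    r, c = sq
    return [(r + dr, c + dc) for dr, dc in moves
            if 0 <= r + dr < n and 0 <= c + dc < n]

def return_time(start=(0, 0)):
    """Steps until the walk first returns to its starting square."""
    sq, steps = start, 0
    while True:
        sq = random.choice(legal(sq))
        steps += 1
        if sq == start:
            return steps

random.seed(1)
runs = [return_time() for _ in range(1000)]
print(sum(runs) / len(runs))   # close to the theoretical 168
```

Note that the knight’s graph is bipartite (the knight alternates square colors), so every return time is even.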

An example walk is displayed below: