A drunkard meets Euler and Markov on the Königsberg Bridge
At least two of the lives of those mentioned in the title did not intersect. In this post I explore an example used in Professor Blitzstein’s STAT110 course.
Professor Blitzstein describes the random walk on an undirected graph as an example of a reversible Markov chain: there exists a probability vector such that, in each time step, the probability of going from state i to state j equals the probability of going from state j to state i, for all states i and j. This vector is (or can be rescaled/normalized to be) the stationary distribution, as was shown in the lecture. In the lecture's notation, writing s for this vector and q_ij for the one-step transition probabilities, the reversibility condition is s_i q_ij = s_j q_ji for all i and j.
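A sketch of why the degrees give the stationary distribution (using s for the stationary vector, q_ij for the transition probabilities, and d_i for the degree of node i; the lecture's symbols may differ slightly):

```latex
% Reversibility condition:
s_i \, q_{ij} = s_j \, q_{ji} \quad \text{for all } i, j.
% For the random walk on an undirected graph, q_{ij} = 1/d_i whenever
% i and j are adjacent, so s_i \propto d_i satisfies the condition:
\frac{d_i}{\sum_k d_k} \cdot \frac{1}{d_i}
  \;=\; \frac{1}{\sum_k d_k}
  \;=\; \frac{d_j}{\sum_k d_k} \cdot \frac{1}{d_j}.
```

In other words, each node's stationary probability is its degree divided by the sum of all degrees (twice the number of edges).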
In the following animated GIF, the underlying graph (network) has 4 nodes/vertices (states), with edges indicated by the bidirectional blue arrows. This is the example used in the lecture.
At each node is a 10 x 10 array, and each black square in the array represents an object in that state at that time step.
I have color-coded the states by the background of the array: state 1 is red, state 2 green, state 3 blue, and state 4 purple.
At each step, an object moves to a connected state with probability 1/(degree of its current node), i.e. it is equally likely to move along any available edge.
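The move rule is short to state in code. Below is a minimal sketch; the adjacency list is a hypothetical 4-node graph chosen for illustration (the lecture's exact edges may differ):

```python
import random

# Hypothetical adjacency list for a 4-node undirected graph
# (an assumption for illustration; the lecture's edges may differ).
neighbors = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}

def step(state):
    """Move along a uniformly chosen edge: each neighbor is
    selected with probability 1/degree(state)."""
    return random.choice(neighbors[state])

# From state 4 (degree 1) the walk must move to state 3.
print(step(4))  # 3
```

Because `random.choice` picks uniformly from the neighbor list, the probability of each individual move is exactly 1/(degree of the current node).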
The initial state has all 100 objects in state 1. The plot on the right shows the evolution over time: the end of each colored line corresponds to the number of objects in that state at the displayed time step. The dashed horizontal lines mark the stationary distribution values (since the stationary values for states 1 and 2 are equal, those two lines are superimposed).
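The convergence toward the dashed lines can be reproduced numerically. This sketch assumes a hypothetical 4-node graph in which states 1 and 2 have equal degree (edges 1-2, 1-3, 2-3, 3-4), consistent with the superimposed dashed lines; the lecture's graph may differ:

```python
import numpy as np

# Hypothetical edges (0-indexed nodes); states 1 and 2 get equal degree.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency matrix of the undirected graph
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

deg = A.sum(axis=1)          # node degrees: [2, 2, 3, 1]
P = A / deg[:, None]         # transition matrix: prob 1/degree per edge
s = deg / deg.sum()          # stationary distribution, proportional to degree

# Detailed balance holds: s_i * P_ij is symmetric in i and j.
flow = s[:, None] * P
assert np.allclose(flow, flow.T)

# Start with 100 objects in state 1 and evolve the expected counts.
counts = np.array([100.0, 0.0, 0.0, 0.0])
for _ in range(50):
    counts = counts @ P

print(np.round(counts, 1))   # close to 100 * s = [25, 25, 37.5, 12.5]
```

Scaling the stationary distribution by 100 (the number of objects) gives the levels the colored lines settle toward, which is also why those values can be read as percentages.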
The tabulated results show the number of objects in each state (colored entries) alongside the stationary distribution in black (scaled to 100 objects so the values can be read as percentages).