Following is a question from Arora-Barak’s Computational Complexity, a simple exercise in probability.

Please note that a random step here is defined as: either *stay back* or *go to a neighbour* randomly.

I get the idea for each of the steps, but I am only left with proving the bound $\Delta(\mathbf{p^k}) \leq (1 - n^{-10n})\,\Delta(\mathbf{p})$ for some $k$.

It is easy to show that $\Delta(\mathbf{p^k})$ is a non-increasing sequence, and furthermore, since $G$ is connected, it should strictly decrease after $n$ steps (that is, $\Delta(\mathbf{p^n}) < \Delta(\mathbf{p})$). The question also assumes that $G$ is non-bipartite, but I don't think that is required.

I have a hunch that $\Delta(\mathbf{p^n}) \leq \left(1 - \frac1n\right)\Delta(\mathbf{p})$. If I am able to prove this then I am done, because $\Delta(\mathbf{p^{kn}}) \leq \left(1 - \frac1n\right)^k\Delta(\mathbf{p}) \leq \left(1 - \frac{1}{n^k}\right)\Delta(\mathbf{p})$ and so $k = 10n$ gives us the desired inequality.

I shall first show that the inequality holds true for $\mathbf{p} = \mathbf{e_u}$, where $\mathbf{e_u}$ is the probability vector with $1$ at vertex $u$ and $0$ elsewhere.

Let $P$ be the transition matrix ($P_{xx} = \frac{1}{d+1}$, and $P_{xy} = \frac{1}{d+1}$ iff $(x,y) \in E$) for our random walk, so that $P^k$ is the $k$-step random walk.

Let $M_k$ denote the maximum entry of the $u^{\text{th}}$ row of $P^k$. Then notice that for every vertex $v$:

$$(P^{k+1})_{uv} = \sum_w (P^{k})_{uw} (P^{1})_{wv} \leq M_k \sum_w P_{wv} = M_k$$

Thus $M_{k+1} \leq M_k$, and so in particular $M_n \leq M_1 \leq \frac{1}{d+1}$.
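As a quick numerical sanity check of this monotonicity (the 5-cycle is my own choice of test graph, not from the exercise):

```python
import numpy as np

# Lazy random walk on the 5-cycle: a connected 2-regular graph.
n, d = 5, 2
P = np.zeros((n, n))
for x in range(n):
    for y in (x, (x + 1) % n, (x - 1) % n):  # stay put, or move to a neighbour
        P[x, y] = 1 / (d + 1)

u = 0
Pk = P.copy()
M = [Pk[u].max()]            # M_1
for _ in range(n - 1):
    Pk = Pk @ P
    M.append(Pk[u].max())    # M_2, ..., M_n

# M_k is non-increasing, and M_n <= M_1 = 1/(d+1)
assert all(a >= b - 1e-12 for a, b in zip(M, M[1:]))
assert M[-1] <= 1 / (d + 1) + 1e-12
```
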

Finally, $\mathbf{p^n} = \mathbf{p}P^n$ is the $u^{\text{th}}$ row of $P^n$ (since $\mathbf{p} = \mathbf{e_u}$), and so $\max_{v} \mathbf{p^n_v} = M_n \leq \frac{1}{d+1} \leq \left(1 - \frac{1}{n}\right)^2$, because $\frac{1}{d+1} + \frac{2}{n} \leq 1$ for all $n \geq 3$ and $d \geq 2$.

(*Note*: for $n < 3$, $\mathbf{p^k}$ is the uniform distribution for all $k > 1$, so the result follows immediately. And for $d \leq 1$, since $G$ is connected, it follows that $n \leq 2$, so again we are done.)

Hence $\Delta(\mathbf{p^n}) = \max_{v} \left\{\mathbf{p^n_v} - \frac1n\right\} \leq \max_{v} \mathbf{p^n_v} \leq \left(1-\frac1n\right)\Delta(\mathbf{p})$ (since $\Delta(\mathbf{p}) = 1-\frac1n$ here), as desired.
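The conjectured decay $\Delta(\mathbf{p^n}) \leq \left(1 - \frac1n\right)\Delta(\mathbf{p})$ can also be checked numerically for the $\mathbf{e_u}$ case; again the 5-cycle is just my own test graph:

```python
import numpy as np

# Lazy random walk on the 5-cycle, started at e_u with u = 0.
n, d = 5, 2
P = np.zeros((n, n))
for x in range(n):
    for y in (x, (x + 1) % n, (x - 1) % n):  # stay put, or move to a neighbour
        P[x, y] = 1 / (d + 1)

def delta(p):
    # Delta(p) = max_v (p_v - 1/n), as in the argument above.
    return np.max(p - 1 / n)

p = np.zeros(n)
p[0] = 1.0                    # p = e_u
deltas = [delta(p)]
for _ in range(3 * n):
    p = p @ P
    deltas.append(delta(p))

# Delta is non-increasing, and drops by at least a (1 - 1/n) factor after n steps.
assert all(a >= b - 1e-12 for a, b in zip(deltas, deltas[1:]))
assert deltas[n] <= (1 - 1 / n) * deltas[0] + 1e-12
```

On this example the decay after $n$ steps is in fact far stronger than the $(1 - \frac1n)$ factor, which is consistent with the bound being quite loose.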

Now for any general distribution $\mathbf{p}$, we can express it as $\mathbf{p} = \sum_{v} \mathbf{p_v}\mathbf{e_v}$. I am stuck at this point. I had in mind that once I had shown the claim for the "good" probability vectors $\mathbf{e_u}$, this case would follow immediately, but it doesn't seem to.

Any help with proceeding further would be really appreciated.

**Note**: I know that $\mathbf{p^k}$ converges to the uniform distribution (by the theorem from Markov chains that, for irreducible aperiodic chains, the distribution converges to the stationary distribution), which directly gives that $\Delta(\mathbf{p^k})$ converges to $0$. But the proof of that general theorem requires sophisticated methods like a coupling argument or the spectral theorem, and the question, as it is asked, suggests to me that it can be done without that theorem.

Another question I wanted to ask: the above bound shows that $\Delta(\mathbf{p^k})$ converges to $0$, so the maximum entry of each day's distribution converges to $\frac1n$. But can we conclude from this that $\mathbf{p^k}$ converges to the uniform distribution?