Markov Chains

MATH 4221

Fall 2013

Skiles 254

Instructor

Will Perkins

wperkins3@math.gatech.edu

Office: Skiles 017

Topics
  • Sections: 6.1 - 6.6, 6.14
  • Definitions:
    1. Markov Chain
    2. Homogeneous Markov Chain
    3. Transition Matrix
    4. n-step Transition Matrix
    5. Recurrent State
    6. Transient State
    7. Null and positive recurrent
    8. Periodic
    9. Intercommunicating States
    10. Closed
    11. Irreducible
    12. Stationary Distribution
    13. Reversible
    14. Total Variation Distance
    15. Mixing Time
    16. Coupling of Two Markov Chains
    17. Random Walk on a Graph
  • Theorems: 6.1.5, 6.1.8, 6.2.3, 6.2.4, 6.2.5, 6.2.9, 6.3.2, 6.3.3, 6.3.4, 6.3.5, 6.4.3, 6.4.6, 6.4.17, 6.5.4, 6.6.1, 6.14.9, Coupling Collision Time Bound on Mixing Time
  • Sample Questions:
    1. What is the mixing time for a random walk on the complete graph with self loops? (See the mixing-time sketch after this list.)
    2. Give upper and lower bounds for the mixing time of a random walk on two complete graphs of size n connected by a single edge.
    3. Can a single Markov Chain have more than one stationary distribution? Why or why not?
    4. Consider a two-state Markov chain with states A and B, where p(A,A) = 1/2, p(A,B) = 1/2, p(B,B) = 1/4, and p(B,A) = 3/4. What is the stationary distribution? What are the eigenvalues of the transition matrix? (See the numerical check after this list.)
    5. Is a branching process a reversible Markov chain? How about simple random walk (SRW)?
    6. Decompose the state space of the branching process Markov chain.
    7. Prove that simple random walk on Z^3 is transient.
    8. Prove that simple random walk on Z^2 is null recurrent.
    9. What is the stationary distribution of the random walk on the complete bipartite graph whose left partition has n vertices and whose right partition has 2n, where at each step there is probability 1/2 of remaining at the current vertex?
    10. Give a graph whose random walk has period 3.
    11. Give a Markov chain whose state space is not irreducible.
    12. Give a Markov chain whose state space can be partitioned into two closed sets of states.
    13. Consider m balls lying in n bins. At each step we pick a ball at random out of a bin and throw it into a bin chosen uniformly at random. Is this a Markov chain? What is the state space? Classify the chain. What is the stationary distribution? Give upper and lower bounds for its mixing time. (See the simulation sketch after this list.)
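
The sketch below is a rough study aid, not part of the course materials. It is a minimal Python implementation (assuming numpy is available) of the total variation distance and the mixing time, under the common convention that the mixing time is the smallest t with max_x ||P^t(x, .) - pi||_TV <= 1/4. It is then applied to sample question 1, reading "complete graph with self loops" as: from any vertex the next vertex is uniform over all n vertices.

    import numpy as np

    def tv_distance(mu, nu):
        # ||mu - nu||_TV = (1/2) * sum_x |mu(x) - nu(x)|
        return 0.5 * np.abs(mu - nu).sum()

    def mixing_time(P, eps=0.25, t_max=10000):
        # Smallest t with max_x ||P^t(x, .) - pi||_TV <= eps, found by powering P.
        n = P.shape[0]
        w, V = np.linalg.eig(P.T)                      # left eigenvectors of P
        pi = np.real(V[:, np.argmin(np.abs(w - 1))])   # eigenvalue-1 left eigenvector
        pi = pi / pi.sum()                             # normalize to a distribution
        Pt = np.eye(n)
        for t in range(1, t_max + 1):
            Pt = Pt @ P
            if max(tv_distance(Pt[x], pi) for x in range(n)) <= eps:
                return t
        return None

    # Question 1: random walk on the complete graph with a self loop at each
    # vertex, read as "the next vertex is uniform over all n vertices", so every
    # row of P is already the uniform (stationary) distribution after one step.
    n = 10
    P = np.full((n, n), 1.0 / n)
    print(mixing_time(P))   # prints 1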
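
The next sketch is a numerical check for sample question 4 (again assuming numpy), taking the question to be about the transition matrix. Solving pi P = pi by hand gives pi = (3/5, 2/5), and the eigenvalues of P are 1 and -1/4; the code just confirms this.

    import numpy as np

    # Transition matrix of the two-state chain, rows and columns ordered (A, B).
    P = np.array([[1/2, 1/2],
                  [3/4, 1/4]])

    # Eigenvalues of P (the same as those of P.T): expected 1 and -1/4.
    w, V = np.linalg.eig(P.T)
    print("eigenvalues:", np.round(np.real(w), 6))

    # Stationary distribution: left eigenvector of P for eigenvalue 1,
    # normalized to sum to 1. Expected: (3/5, 2/5) = (0.6, 0.4).
    pi = np.real(V[:, np.argmin(np.abs(w - 1))])
    print("stationary distribution:", pi / pi.sum())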
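
The last sketch is a simulation for sample question 13. It reads "pick a ball at random out of a bin" as picking one of the m labeled balls uniformly at random (a different reading, e.g. picking a bin first and then a ball from it, gives a different chain). Under this reading, placing the balls independently and uniformly at random is stationary, so in the long run the number of balls in a fixed bin should look Binomial(m, 1/n); the simulation compares the empirical and Binomial distributions.

    import random
    from collections import Counter
    from math import comb

    m, n, steps = 20, 5, 200000
    bins = [random.randrange(n) for _ in range(m)]   # current bin of each labeled ball
    counts = Counter()

    for t in range(steps):
        ball = random.randrange(m)          # pick one of the m balls uniformly
        bins[ball] = random.randrange(n)    # throw it into a uniformly chosen bin
        if t > steps // 10:                 # crude burn-in before recording
            counts[bins.count(0)] += 1      # record the number of balls in bin 0

    total = sum(counts.values())
    for k in range(m + 1):
        empirical = counts[k] / total
        binomial = comb(m, k) * (1 / n) ** k * (1 - 1 / n) ** (m - k)
        print(k, round(empirical, 4), round(binomial, 4))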