Markov Chains - University of Cambridge
www.statslab.cam.ac.uk/~rrw1/markov/M.pdf
These notes contain material prepared by colleagues who have also presented this course at Cambridge, especially James Norris. The material mainly comes from the books of Norris, Grimmett & Stirzaker, Ross, Aldous & Fill, and Grinstead & Snell. Many of the examples are classic and ought to occur in any sensible course on Markov chains ...
Chapter 8: Markov Chains - Auckland
https://www.stat.auckland.ac.nz/~fewster/325/notes/ch8.pdf
The Markov chain is the process X_0, X_1, X_2, .... Definition: The state of a Markov chain at time t is the value of X_t. For example, if X_t = 6, we say the process is in state 6 at time t. Definition: The state space of a Markov chain, S, is the set of values that each X_t can take. For example, S = {1, 2, 3, 4, 5, 6, 7}. Let S have size N (possibly ...
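The definitions in this snippet (states, state space, the process X_0, X_1, X_2, ...) can be made concrete with a short simulation. The sketch below is illustrative and not taken from the cited notes: it runs a Markov chain on the example state space S = {1, ..., 7}, using a hypothetical transition matrix P (a simple random walk with reflecting boundaries) chosen only for demonstration.

```python
import random

# State space S = {1, ..., 7} from the example above; N = |S|.
N = 7

# Hypothetical transition matrix for illustration: a random walk on
# {1,...,7} with reflecting boundaries. P[i][j] is the probability of
# moving from state i+1 to state j+1.
P = [[0.0] * N for _ in range(N)]
P[0][1] = 1.0          # from state 1, always move to state 2
P[N - 1][N - 2] = 1.0  # from state 7, always move to state 6
for i in range(1, N - 1):
    P[i][i - 1] = 0.5
    P[i][i + 1] = 0.5

def simulate(x0, steps, rng=random.Random(0)):
    """Return the path X_0, X_1, ..., X_steps starting from state x0."""
    path = [x0]
    for _ in range(steps):
        i = path[-1] - 1  # current state, 0-indexed row of P
        # Sample the next state from row i of P by inverse transform.
        r, acc = rng.random(), 0.0
        for j in range(N):
            acc += P[i][j]
            if r < acc:
                path.append(j + 1)
                break
    return path

path = simulate(x0=4, steps=10)
print(path)  # a path of 11 states, each in {1, ..., 7}
```

Each X_t here takes a value in S, matching the definition of the state space, and the next state depends only on the current one, which is exactly the Markov property.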
Lecture Notes Markov Chains
blogs.post-gazette.com › lecture-notes-markov
Semi-Markov Chains and Hidden Semi-Markov Models toward Applications; Dependence in Probability and Statistics. Markov Chain Monte Carlo (MCMC) originated in statistical physics but has spilled over into various application areas, leading to a corresponding variety of techniques and methods.
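The MCMC idea mentioned in this snippet can be sketched in a few lines: build a Markov chain whose stationary distribution is the target, then read off samples. Below is a minimal Metropolis sketch, not drawn from any of the cited notes; the target is a hypothetical unnormalized distribution on the integers {0, ..., 9}, chosen only for illustration.

```python
import random

# Hypothetical target: probability of state k is proportional to weights[k].
weights = [1, 2, 3, 4, 5, 5, 4, 3, 2, 1]

def metropolis(steps, rng=random.Random(1)):
    """Random-walk Metropolis chain targeting the distribution above."""
    x = 0
    samples = []
    for _ in range(steps):
        # Symmetric random-walk proposal: step to a neighbouring state.
        y = x + rng.choice([-1, 1])
        if 0 <= y < len(weights):
            # Accept with probability min(1, w(y)/w(x)); the symmetric
            # proposal means the proposal ratio cancels, and the
            # normalizing constant of the target is never needed.
            if rng.random() < min(1.0, weights[y] / weights[x]):
                x = y
        # On rejection (including out-of-range proposals) the chain stays put.
        samples.append(x)
    return samples

samples = metropolis(50_000)
# Empirical state frequencies should roughly track weights / sum(weights).
```

The key point, consistent with the snippet's description of MCMC, is that the chain only ever evaluates ratios of the target, which is why the method works when the normalizing constant is intractable.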