This document provides an introduction to Markov chains: discrete-value random sequences in which the next value depends only on the current value, not on earlier values. Formally, a Markov chain is a discrete-value random sequence in which Xn+1 depends on Xn but not on the earlier values X0, . . . , Xn-1. The focus is on the case where each Xn is a discrete random variable taking values in {0, 1, 2, . . .}.
Professor, EEE Department, Islamic University of Technology

Introduction:
• We consider a discrete-value random sequence {Xn, n = 0, 1, 2, . . .} that is not an iid random sequence.
• In a Markov chain, Xn+1 depends on Xn, but not on the earlier values X0, . . . , Xn-1 of the random sequence.
• To keep things reasonably simple, we restrict our attention to the case where each Xn is a discrete random variable with range SX = {0, 1, 2, . . .}.
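The defining property above — that Xn+1 is drawn using only the current state Xn — can be sketched in a short simulation. This is an illustrative example, not taken from the text: the three-state transition matrix P below is a made-up example whose entry P[i][j] plays the role of P(Xn+1 = j | Xn = i), and the function and variable names are hypothetical.

```python
import random

# Hypothetical transition matrix for a 3-state chain on SX = {0, 1, 2}.
# P[i][j] stands for P(X_{n+1} = j | X_n = i); each row must sum to 1.
# The numbers are illustrative only.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def step(state, rng):
    """Draw the next state using only the current state (the Markov property)."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return j
    return len(P[state]) - 1  # guard against floating-point round-off

def simulate(x0, n, seed=0):
    """Generate the sequence X_0, X_1, ..., X_n starting from state x0."""
    rng = random.Random(seed)
    chain = [x0]
    for _ in range(n):
        # Note: step() receives only chain[-1], never the earlier history —
        # this is exactly the dependence structure described above.
        chain.append(step(chain[-1], rng))
    return chain

path = simulate(x0=0, n=10)
```

Because `step` takes only the current state as input, the earlier values X0, . . . , Xn-1 cannot influence the draw of Xn+1, which mirrors the definition in the text.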