
Markov Chains

Ayush Jha – 22B0051


Mentor – Shikhar Moondra
Summer of Science

Plan of Action
Week 0: Probability Primer

• Review basic concepts of probability theory (e.g., sample space, events, probability axioms, conditional probability, independence).

Week 1: Introduction to Discrete-Time Markov Chains (DTMC)

• Transition Probability Matrix (TPM) and its properties.
• Definition and properties of DTMC.
• Stopping Time and its significance in DTMC.
• Strong Markov Property and its implications.
• Visualization of DTMC with examples.
• Gambler’s Ruin problem and other examples (a small numerical sketch follows this list).
• Communicating Classes and their properties.
• Proofs for the classification of states in communicating classes.
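The Gambler’s Ruin problem is a convenient place to see a transition probability matrix concretely. The following is a minimal Python/NumPy sketch, not part of the original plan: the chain size N = 5, win probability p = 0.4, and starting state are illustrative assumptions chosen only to show how a TPM is built and simulated.

import numpy as np

# Gambler's Ruin: states 0..N, where 0 (ruin) and N (goal) are absorbing.
# N and p are illustrative assumptions, not values from the plan.
N, p = 5, 0.4

# Build the (N+1) x (N+1) transition probability matrix.
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0          # ruin is absorbing
P[N, N] = 1.0          # reaching the goal is absorbing
for i in range(1, N):
    P[i, i + 1] = p        # win one unit
    P[i, i - 1] = 1 - p    # lose one unit

# Each row of a TPM must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Estimate the probability of ruin starting from state 2 by simulation.
rng = np.random.default_rng(0)
start, runs, ruined = 2, 100_000, 0
for _ in range(runs):
    s = start
    while 0 < s < N:
        s = rng.choice(N + 1, p=P[s])   # one step of the chain
    ruined += (s == 0)
print("Estimated ruin probability from state 2:", ruined / runs)

For a chain this small the absorption probabilities can also be computed exactly from the TPM, which makes a useful cross-check on the simulation.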

Week 2: Advanced Topics in DTMC

• Further exploration of the Strong Markov Property.
• Examples illustrating the application of DTMC with visualization.
• Frequency of visits to a particular state in DTMC.
• Criteria for determining transient/recurrent DTMC.
• Theorem of stationarity in irreducible positive recurrent DTMC.
• Finding and interpreting stationary measures (see the sketch after this list).
• Practical considerations and limitations in finding stationary measures.
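For an irreducible positive recurrent DTMC, the stationary distribution π satisfies πP = π with the entries of π summing to 1. The sketch below solves this linear system for a small 3-state chain; the matrix entries are assumptions picked purely for illustration.

import numpy as np

# An illustrative irreducible 3-state transition matrix (rows sum to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.4, 0.5],
])

# Solve pi P = pi together with sum(pi) = 1,
# i.e. (P^T - I) pi = 0 plus one normalisation equation.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("Stationary distribution:", pi)
print("Check pi P == pi:", np.allclose(pi @ P, pi))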

Week 3: Continuous-Time Markov Chains (CTMC) and beyond

• Introduction to Continuous-Time Markov Chains and their differences from DTMC.
• Renewal theory and its connection to Markov Chains.
• Analysis of regeneration processes and their applications.
• Extensions of Markov Chains in various fields (e.g., queuing theory, reliability analysis, machine learning).

Week 4: Continuous-Time Markov Chains (CTMC)

• Introduction to Continuous-Time Markov Chains and their properties.
• Transition Rate Matrix (Q-matrix) and its interpretation.
• Exponential distributions and their relevance to CTMC.
• Poisson processes and their connection to CTMC.
• Chapman-Kolmogorov equations for CTMC.
• Stationary distributions and their computation in CTMC (a short numerical sketch follows this list).
• Markovian arrival processes and their applications.
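For a CTMC described by a transition rate matrix Q, the stationary distribution solves πQ = 0 with the entries of π summing to 1, and the chain can be simulated by holding in each state for an exponential time before jumping. The Python sketch below uses an illustrative 3-state Q-matrix (the rates are assumptions, not taken from any source in this plan) and compares the computed stationary distribution with empirical occupation fractions.

import numpy as np

# Illustrative Q-matrix for a 3-state CTMC (each row sums to 0).
Q = np.array([
    [-2.0,  2.0,  0.0],
    [ 1.0, -3.0,  2.0],
    [ 0.0,  1.0, -1.0],
])

# Stationary distribution: solve pi Q = 0 with sum(pi) = 1.
n = Q.shape[0]
A = np.vstack([Q.T, np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("Stationary distribution:", pi)

# Simulate the CTMC: hold in state i for an Exp(-Q[i, i]) time,
# then jump to j with probability Q[i, j] / (-Q[i, i]).
rng = np.random.default_rng(0)
state, t, horizon = 0, 0.0, 10_000.0
time_in_state = np.zeros(n)
while t < horizon:
    rate = -Q[state, state]
    time_in_state[state] += (hold := rng.exponential(1.0 / rate))
    t += hold
    jump_probs = Q[state].copy()
    jump_probs[state] = 0.0
    state = rng.choice(n, p=jump_probs / rate)
print("Empirical occupation fractions:", time_in_state / time_in_state.sum())
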
Week 5: Renewal Theory

• Introduction to Renewal Theory and its importance.
• Renewal processes and their characteristics.
• Distribution of renewal times and their properties.
• Renewal reward processes and their applications.
• Key theorems in Renewal Theory and their proofs (the elementary renewal theorem is illustrated numerically after this list).
• Renewal equation and its solution methods.
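One of the key theorems referenced above, the elementary renewal theorem, states that N(t)/t converges to 1/E[X] as t grows, where X is the inter-renewal time. The sketch below checks this numerically; the Gamma inter-renewal distribution and its parameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

# Inter-renewal times ~ Gamma(shape=2, scale=1.5), so E[X] = 3.0 (illustrative).
shape, scale = 2.0, 1.5
mean_x = shape * scale

# Simulate one long path and count renewals up to time t.
t = 50_000.0
arrivals = np.cumsum(rng.gamma(shape, scale, size=int(2 * t / mean_x) + 1000))
N_t = np.searchsorted(arrivals, t)   # number of renewals in [0, t]

print("N(t)/t   :", N_t / t)
print("1 / E[X] :", 1.0 / mean_x)    # elementary renewal theorem limit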

Week 6: Regeneration Process Analysis

• Definition and properties of Regenerative Processes.
• Regenerative cycle length and its distribution.
• Renewal reward processes in the context of Regeneration (illustrated in the sketch after this list).
• Regenerative processes in Markov Chains and their analysis.
• Practical applications of Regenerative Process Analysis.
• Advanced topics and recent developments in Regenerative Process Theory.
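For a renewal reward (regenerative) process, the long-run average reward per unit time equals E[reward per cycle] / E[cycle length]. The short sketch below verifies this numerically; the distributions chosen for cycle lengths and rewards are assumptions used only for illustration.

import numpy as np

rng = np.random.default_rng(2)
cycles = 200_000

# Illustrative assumptions: cycle lengths ~ Exp(mean 4), rewards ~ Uniform(0, 10).
lengths = rng.exponential(4.0, size=cycles)
rewards = rng.uniform(0.0, 10.0, size=cycles)

# Long-run average reward rate over the simulated path ...
empirical_rate = rewards.sum() / lengths.sum()
# ... versus the renewal reward theorem prediction E[R] / E[L] = 5 / 4.
predicted_rate = 5.0 / 4.0

print("Empirical rate:", empirical_rate)
print("E[R] / E[L]  :", predicted_rate)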

End Goal
To develop a good understanding of Markov chains because of their importance in other subjects.

They are crucial in economics for modeling market trends and predicting economic behavior. In finance, they are essential for option pricing and risk management. Operations research leverages Markov chains to optimize supply chains and improve logistics. In computer science, they underpin algorithms in machine learning, particularly in reinforcement learning, as well as natural language processing and search engine ranking.

They are used in statistical mechanics in physics, population dynamics in biology, and decision-making processes in the social sciences. The mathematical elegance and practical utility of Markov chains make them an intellectually stimulating subject, providing critical insights into both theoretical and real-world problems. My end goal is to develop a good conceptual understanding of Markov chains and of their applications in different domains.
