A Markov chain with a finite number of states is a probabilistic process in which, if Xt denotes the state at time t, the chain moves from Xt to Xt+1 in one step. These moves are governed by a transition matrix, whose entries give the probabilities of passing from each state to every other state. Markov chains have many real-world applications; one of them is modeling baseball. The Markov chain framework can be used to simulate one inning of baseball, to predict the number of runs scored in an inning, and to produce many other results useful to the manager of a baseball team.
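As a rough illustration of the idea, the following sketch propagates a probability distribution through a small transition matrix. The states here are the number of outs in a half-inning, and the transition probabilities are purely illustrative assumptions, not estimates from the thesis or from real baseball data.

```python
# Minimal Markov chain sketch: states are the number of outs in a
# half-inning (0, 1, 2, 3), with 3 outs treated as an absorbing state.
# The probabilities in P are illustrative assumptions only.

# P[i][j] = probability of moving from state i to state j
# in one plate appearance.
P = [
    [0.7, 0.3, 0.0, 0.0],  # 0 outs: batter reaches (0.7) or is put out (0.3)
    [0.0, 0.7, 0.3, 0.0],  # 1 out
    [0.0, 0.0, 0.7, 0.3],  # 2 outs
    [0.0, 0.0, 0.0, 1.0],  # 3 outs: absorbing, the half-inning is over
]

def step(dist, P):
    """Propagate a distribution over states through one transition."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def distribution_after(n_steps, P, start=0):
    """Distribution over states after n_steps transitions from `start`."""
    dist = [0.0] * len(P)
    dist[start] = 1.0
    for _ in range(n_steps):
        dist = step(dist, P)
    return dist

# Probability of having 0, 1, 2, or 3 outs after 10 plate appearances,
# starting from 0 outs.
print(distribution_after(10, P))
```

Because the 3-out state is absorbing, repeated steps push all of the probability mass toward it, which matches the intuition that every half-inning eventually ends.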
Mancine, Jarrod A., "Applying Markov Chains to Baseball" (2014). Senior Independent Study Theses. Paper 5742.
Statistics and Probability
Markov Chains, probability, baseball
Bachelor of Arts
Senior Independent Study Thesis
© Copyright 2014 Jarrod A. Mancine