High School Mathematics Extensions/Markov Chains

Markov Chains

[Figure: Example Markov Chain #1]

In short, Markov chains are models of a random process in which the probability of moving to the next state depends only on the current state, not on the sequence of states that came before it. This requirement is known as the Markov property: the future of a random process (a stochastic process in math lingo!) may depend only on its present state.
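Stated symbolically (a standard formulation, not taken from the image), for a chain of states \(X_0, X_1, X_2, \dots\) the Markov property says:

```latex
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
```

In words: conditioning on the entire history gives exactly the same probabilities as conditioning on the present state alone.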

For example, in the Example Markov Chain image to the right, the probability that a creature eats a certain food (grapes, lettuce, or cheese) depends only on the food it ate last. If it just ate lettuce, it has a 40% chance of eating grapes next, a 60% chance of eating cheese next, and a 0% chance of eating lettuce again.
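To make the example concrete, here is a minimal Python sketch that simulates this chain. Only the lettuce row of the transition table (grapes 0.4, cheese 0.6, lettuce 0.0) is given in the text above; the probabilities out of grapes and cheese are invented placeholders, so treat those rows as assumptions.

```python
import random

# Transition probabilities for the example chain. Only the lettuce row
# comes from the text; the grapes and cheese rows are made-up
# placeholders for illustration.
TRANSITIONS = {
    "lettuce": {"grapes": 0.4, "cheese": 0.6, "lettuce": 0.0},
    "grapes":  {"grapes": 0.1, "cheese": 0.5, "lettuce": 0.4},  # assumed
    "cheese":  {"grapes": 0.3, "cheese": 0.2, "lettuce": 0.5},  # assumed
}

def next_food(current):
    """Pick the next food using only the current state (the Markov property)."""
    foods = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][food] for food in foods]
    return random.choices(foods, weights=weights)[0]

def simulate(start, steps):
    """Return the sequence of foods eaten, starting from `start`."""
    chain = [start]
    for _ in range(steps):
        chain.append(next_food(chain[-1]))
    return chain

print(simulate("lettuce", 10))
```

Notice that `next_food` looks only at the current food, never at the earlier history; that is exactly the Markov property in code.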