What is a Markov chain?
The simplest example is the drunkard's walk (also called a random walk): the drunk may stumble in any direction, but each step moves only one unit from the current position. Crucially, the next step depends only on where the drunk stands now, not on the path taken to get there.
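A minimal sketch of this idea in Python (the one-dimensional case, with the step direction and step count chosen here purely for illustration):

```python
import random

def drunkards_walk(steps):
    """Simulate a 1-D random walk: each step moves +1 or -1
    from the current position, chosen at random."""
    position = 0
    path = [position]
    for _ in range(steps):
        position += random.choice([-1, 1])  # next step depends only on the current position
        path.append(position)
    return path

random.seed(0)  # fixed seed so the run is reproducible
print(drunkards_walk(10))
```

Each call produces a different wandering path, but every step is still just one unit from wherever the walker currently stands.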
So to make a "chain", we feed tomorrow's result back in as today's input. Repeating this produces a long sequence like: rain rain rain, no rain no rain no rain, rain rain rain, no rain no rain no rain. A pattern emerges: long runs ("chains") of rain or no rain appear, and their typical lengths depend on how we set up the "chances", i.e. the transition probabilities.
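The feedback loop above can be sketched as a tiny two-state Markov chain. The transition probabilities below (0.7 to stay in the same weather, 0.3 to switch) are assumptions chosen only to make runs of the same state likely:

```python
import random

# Assumed transition probabilities, for illustration only:
# if it rained today, it rains again tomorrow with probability 0.7;
# if it was dry today, it stays dry with probability 0.7.
P = {"rain":    {"rain": 0.7, "no rain": 0.3},
     "no rain": {"rain": 0.3, "no rain": 0.7}}

def next_state(state):
    """Pick tomorrow's state using today's state as the input."""
    return "rain" if random.random() < P[state]["rain"] else "no rain"

def simulate(start, days):
    """Feed each day's result back in to generate the next day."""
    chain = [start]
    for _ in range(days - 1):
        chain.append(next_state(chain[-1]))
    return chain

random.seed(1)
print(" ".join(simulate("rain", 15)))
```

Because staying put is more likely than switching, the printed sequence tends to show exactly the long runs of rain or no rain described above.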
- Markov chains reveal hidden blueprints of nature and the objects around us
- However a sequence starts, the long-run ratio of its states converges to a single fixed ratio (the stationary distribution)
- Claude Shannon defined first-order and second-order models, where the next state depends on the previous one state or the previous two states, respectively
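The convergence claim in the list can be checked empirically. The transition probabilities below are again assumptions for illustration (rain persists with probability 0.6; a dry day turns rainy with probability 0.2), which gives a stationary rain fraction of 0.2 / (1 - 0.6 + 0.2) = 1/3:

```python
import random

# Assumed transition probabilities, for illustration only:
# P(rain tomorrow | rain today) = 0.6
# P(rain tomorrow | dry today)  = 0.2
P_RAIN_GIVEN_RAIN = 0.6
P_RAIN_GIVEN_DRY = 0.2

def rain_fraction(days, seed, start_rainy=False):
    """Simulate the chain for `days` steps and return the
    fraction of days on which it rained."""
    random.seed(seed)
    rainy = start_rainy
    rainy_days = 0
    for _ in range(days):
        p = P_RAIN_GIVEN_RAIN if rainy else P_RAIN_GIVEN_DRY
        rainy = random.random() < p
        rainy_days += rainy
    return rainy_days / days

# Whether we start rainy or dry, the long-run rain fraction
# settles near the stationary value of 1/3.
print(rain_fraction(100_000, seed=42, start_rainy=True))
print(rain_fraction(100_000, seed=42, start_rainy=False))
```

The starting state only affects the first few days; over a long enough run, both simulations report roughly the same ratio.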