"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

February 10, 2017

Day #55 - Markov chains Basics

This post is from my notes. I had bookmarked some interesting answers on understanding Markov chains.

What is a Markov chain?
The simplest example is a drunkard's walk (also called a random walk): the drunk may stumble in any direction, but moves only one step at a time from the current position.
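The drunkard's walk above can be sketched in a few lines. This is a minimal illustration, not from the original post; the function name and parameters are my own:

```python
import random

def drunkards_walk(steps, seed=None):
    """Simulate a 1-D random walk: each step moves +1 or -1
    from the current position, chosen at random."""
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice([-1, 1])  # stumble one step left or right
        path.append(position)
    return path

print(drunkards_walk(10, seed=42))
```

Note that each step depends only on the current position, never on the path taken to reach it; that is the Markov property in its simplest form.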

Another classic example is an ink drop diffusing in a glass of water: where a particle moves next depends only on where it is now, not on how it got there.

Imagine a traffic light with three states: green, yellow, red. Instead of cycling Green -> Yellow -> Red at fixed intervals, it switches to any color at any time, randomly. Picture a die with the three colors on its faces: you throw it, and the result decides the next color. Alternatively, suppose you are at a certain color, say green, and the light is not allowed to stay on the same color: flip a coin, and if it is heads go to red, if tails go to yellow.
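The coin-flip version of the traffic light can be sketched directly. A minimal sketch, with names of my own choosing, assuming the light may never repeat its current color:

```python
import random

def next_color(current, rng):
    """From the current color, flip a fair coin to pick
    one of the other two colors (never stay the same)."""
    others = [c for c in ("green", "yellow", "red") if c != current]
    return rng.choice(others)  # heads -> one remaining color, tails -> the other

rng = random.Random(0)
color = "green"
sequence = [color]
for _ in range(9):
    color = next_color(color, rng)
    sequence.append(color)
print(sequence)
```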

So to make a "chain", we simply feed tomorrow's result back in as today's state. We can then generate a long sequence like: rain rain rain, no rain no rain no rain, rain rain rain, no rain no rain no rain. A pattern emerges: there are long runs of rain or no rain, and their lengths depend on how we set up our "chances", i.e. the transition probabilities.
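Feeding tomorrow's result back into today can be sketched with a two-state weather chain. The `p_stay` parameter is my own assumption for illustration: it controls how "sticky" each state is, and hence how long the runs are.

```python
import random

def simulate_weather(days, p_stay=0.8, seed=None):
    """Simulate a two-state Markov chain: each day, with probability
    p_stay tomorrow repeats today's weather, otherwise it flips.
    Higher p_stay produces longer runs of rain / no rain."""
    rng = random.Random(seed)
    state = "rain"
    chain = [state]
    for _ in range(days - 1):
        if rng.random() >= p_stay:  # flip states with probability 1 - p_stay
            state = "no rain" if state == "rain" else "rain"
        chain.append(state)
    return chain

print(" ".join(simulate_weather(15, p_stay=0.8, seed=7)))
```

Try lowering `p_stay` toward 0.5: the long runs disappear and the sequence looks like independent coin flips.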

Markov Chain - Khan Academy
  • Hidden blueprints of nature / the objects around us
  • No matter where you begin, each sequence converges to the same ratio of states
  • First-order and second-order models were defined by Claude Shannon
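The convergence point above can be checked empirically. A small sketch (transition probabilities and names are my own, chosen for illustration): however you start, the long-run fraction of time spent in each state settles to a fixed ratio, the stationary distribution.

```python
import random

def state_frequencies(steps, p_rain_to_rain=0.9, p_dry_to_rain=0.3, seed=0):
    """Run a two-state chain for many steps and count how often
    each state occurs."""
    rng = random.Random(seed)
    state = "rain"
    counts = {"rain": 0, "no rain": 0}
    for _ in range(steps):
        counts[state] += 1
        p = p_rain_to_rain if state == "rain" else p_dry_to_rain
        state = "rain" if rng.random() < p else "no rain"
    return {s: c / steps for s, c in counts.items()}

# For these numbers the stationary ratio works out to
# pi_rain = 0.3 / (0.1 + 0.3) = 0.75, regardless of the starting state.
print(state_frequencies(100_000))
```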
Happy Learning!!!
