# Model Markov Processes

In this lesson, we will discuss the Markov process, which is a completely different kind of distribution: it is neither entirely discrete nor continuous.

So far in this course, we’ve very briefly looked at continuous distributions on doubles and spent a lot of time looking at discrete distributions with small supports.

## Introduction to the Markov Model

Let’s take a look at a kind of distribution that doesn’t easily fit into our “small discrete” bucket but isn’t continuous either.

### Some Applications of Markov Models

Some motivating scenarios:

- Imagine you find yourself placed at random in a city and you want to take a stroll. At every intersection, you make a random choice of which direction to go next. The random choice of starting points has a particular distribution, and the distribution of possible direction choices could be different at each intersection. The result of each stroll is a path through the city. What is the distribution of possible strolls?
- Imagine a different kind of random walk: you are on a page in a wiki, like Wikipedia or TV Tropes. You read the page and then select at random a link to another page in the site, if there is one. What is the distribution of possible sequences of pages?
- Imagine you are a gambler who starts with a certain random amount of cash in your pocket and a target for how much you want to win and how much you are willing to lose. If you’ve won or lost too much, you go home; otherwise, you make a random-sized bet on a slot machine with certain odds, where the distribution of bet sizes and machine choices is a function of how much money you have in your pocket right now. What is the distribution of possible sequences of cash in your pocket?

Each of these cases has several things in common:

- We start in a randomly chosen state.
- We randomly move to a new state depending *solely on our current state* and *not* on any “memory” of how we got to that state.
- The new state might be a “stop state”, in which case the sequence of states is over.
- The question is: what is the distribution of *possible state sequences*?
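The properties above can be sketched in a few lines of code. Here is a minimal, hypothetical Python sampler (the function and variable names are my own, not from this course): each state maps to a weighted list of successor states, a state with no successors is a “stop state”, and each step depends only on the current state.

```python
import random

def sample_markov_walk(initial_states, transitions, rng=random.Random(42)):
    """Sample one state sequence from a Markov process.

    initial_states: list of (state, weight) pairs for the starting distribution.
    transitions: dict mapping a state to a list of (next_state, weight) pairs;
                 a state absent from the dict is a "stop state".
    """
    # Start in a randomly chosen state drawn from the initial distribution.
    states, weights = zip(*initial_states)
    current = rng.choices(states, weights=weights)[0]
    path = [current]
    # Each move depends solely on the current state, never on the history.
    while current in transitions:
        states, weights = zip(*transitions[current])
        current = rng.choices(states, weights=weights)[0]
        path.append(current)
    return path

# Toy version of the gambler scenario: quit at $0 (broke) or $3 (target met).
transitions = {
    1: [(0, 0.5), (2, 0.5)],
    2: [(1, 0.5), (3, 0.5)],
}
walk = sample_markov_walk([(1, 0.5), (2, 0.5)], transitions)
print(walk)
```

Each call produces one random path; the distribution we are after is the distribution over all such paths.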

A random process that has these properties is called a Markov process, after the Russian mathematician **Andrey Markov**. Such processes are extremely useful and have been heavily studied for the last hundred years or so.
