# Introduction to asymptotic notation

So far, the way we analyzed linear search and binary search has been by counting the maximum number of guesses we need to make. But what we really want to know is how long these algorithms take. We're interested in time, not just guesses. The running times of linear search and binary search include the times to make and check guesses, but there's more to these algorithms than making and checking guesses.

The running time of an algorithm depends on how long it takes a computer to run the lines of code of the algorithm. And that depends on the speed of the computer, the programming language, and the compiler that translates the program from the programming language into code that runs directly on the computer, as well as other factors.

Let's think about the running time of an algorithm more carefully. We use a combination of two ideas. First, we determine how long the algorithm takes, in terms of the size of its input. This idea makes intuitive sense, doesn't it? We've already seen that the maximum number of guesses in linear search and binary search increases as the length of the array increases. Or think about a GPS. If it knew about only the interstate highway system, and not about every little road, it should be able to find routes more quickly, right? So we think about the running time of the algorithm as a function of the size of its input.
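To make this concrete, here's a minimal sketch (the function name and counter are our own, not from the original) that instruments linear search to count guesses, showing that the worst-case count grows with the input size $n$:

```python
# Hypothetical sketch: instrument linear search with a guess counter,
# illustrating that the worst-case guess count grows with the array length n.
def linear_search_guesses(arr, target):
    """Return (index, guesses); index is -1 if target is absent."""
    guesses = 0
    for i, value in enumerate(arr):
        guesses += 1  # one guess: "is the target at index i?"
        if value == target:
            return i, guesses
    return -1, guesses

# Worst case (target absent): the guess count equals the input size n.
for n in [10, 100, 1000]:
    _, guesses = linear_search_guesses(list(range(n)), -1)
    print(n, guesses)
```

Running this prints a guess count equal to $n$ each time, which is exactly the "running time as a function of input size" idea: double the array, double the worst-case work.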

The second idea is that we focus on how fast this function grows with the input size. We call that the rate of growth of the running time. To keep things manageable, we simplify the function to distill the most important part and cast aside the less important parts. For example, suppose that an algorithm, running on an input of size $n$, takes $6n^2 + 100n + 300$ machine instructions. The $6n^2$ term becomes larger than the remaining terms, $100n + 300$, once $n$ becomes large enough ($20$ in this case). Here's a chart showing values of $6n^2$ and $100n + 300$ for values of $n$ from $0$ to $100$.
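You can verify that crossover point yourself. This short sketch (our own illustration, not part of the original analysis) searches for the smallest $n$ at which the $6n^2$ term overtakes $100n + 300$:

```python
# Sketch: find the smallest n where the quadratic term 6n^2
# exceeds the remaining terms 100n + 300 of the running time.
def crossover():
    n = 0
    while 6 * n**2 <= 100 * n + 300:
        n += 1
    return n

print(crossover())  # 20
```

At $n = 19$ we have $6 \cdot 19^2 = 2166 \le 2200 = 100 \cdot 19 + 300$, but at $n = 20$ we get $6 \cdot 20^2 = 2400 > 2300$, so the quadratic term dominates from $n = 20$ onward.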