# The Big-O Notation

In this lesson, we compare the running times of different algorithms.

To figure out how long a program would take to run on a real computer, we would need to know things like the speed of the computer, the system architecture, the compiler being used, details of the memory hierarchy, and so on. Therefore, carefully estimating the running time is a rather difficult task. Moreover, in practice, we might not even know some of these details. That is why computer scientists use the big-$O$ notation to estimate the running time of an algorithm without knowing anything about all these details!

## Big-$O$ notation: Comparing running times

Consider two algorithms, and denote by $f(n)$ and $g(n)$ their running times on an input of size $n$. We say that $f$ grows no faster than $g$ if there exists a constant $c$ such that for every positive integer $n$, $f(n)\leq c\cdot g(n)$ (equivalently, $\frac{f(n)}{g(n)} \leq c$). In this case, we write $f = O(g)$ or $f \preceq g$. The notation $f = O(g)$ is the standard one, though some learners find the notation $f \preceq g$ more intuitive.
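One way to build intuition for this definition is to check the inequality $f(n)\leq c\cdot g(n)$ numerically over a range of inputs. The helper below is a hypothetical sketch (`ratio_bounded` is not part of the lesson's code): sampling finitely many $n$ only gives evidence, since the definition quantifies over every positive integer $n$.

```python
# Hypothetical helper: empirically check whether f(n) <= c * g(n)
# holds for n = 1, ..., n_max. This is evidence, not a proof --
# the formal definition requires the bound for every positive integer n.

def ratio_bounded(f, g, c, n_max=1000):
    """Return True if f(n) <= c * g(n) for all n in 1..n_max."""
    return all(f(n) <= c * g(n) for n in range(1, n_max + 1))

# Example from later in this lesson: 2n^2 + 5n + 3 <= 10 * n^2.
print(ratio_bounded(lambda n: 2 * n**2 + 5 * n + 3,
                    lambda n: n**2, 10))  # prints True
```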

To give an example, let $f(n) = 4n\log_{2}{n}$ and $g(n) = n^2$. To visualize the order of growth of the two functions, let’s plot them for $1 \leq n \leq 30$. Here is a simple Python program for generating this plot that you can use as a template for plotting other functions.

Run the code below to generate the plot.

```python
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 30)
plt.plot(n, 4 * n * np.log2(n), label=r'$4n\log_2(n)$')
plt.plot(n, n ** 2, label=r'$n^2$')
plt.legend()
plt.savefig('output/to.png')
```

As the plot reveals, $4n\log_{2}n\geq n^2$ for $2\leq n\leq16$, but then the two functions switch: $4n\log_{2}n\leq n^2$ for $n\geq16$. In particular,

$4n\log_{2}n\leq 16n^2$ for all $n\geq1$ (indeed, $\log_{2}n\leq n$ for $n\geq1$, so $4n\log_{2}n\leq 4n^2\leq 16n^2$).

We conclude that $4n\log_{2}n=O(n^2)$.
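As a quick sanity check (an illustrative snippet, not part of the lesson's code), note that $\frac{4n\log_{2}n}{n^2} = \frac{4\log_{2}n}{n}$, so we can confirm numerically that the ratio never comes close to the constant $16$ chosen above:

```python
import math

# Illustrative check: the ratio 4n*log2(n) / n^2 = 4*log2(n) / n
# stays well below the constant 16 used above; over the integers
# it peaks near n = 3 at roughly 2.1.
max_ratio = max(4 * math.log2(n) / n for n in range(1, 10**5))
print(max_ratio <= 16)  # prints True
```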

Now, let $f(n)=2n^2+5n+3$ and $g(n)=n^2$. On one hand, $f(n)$ is larger than $g(n)$ for all positive $n$. On the other hand,

$f(n)=2n^2+5n+3 \leq 2n^2+5n^2+3n^2=10n^2=10g(n)$

for all positive integers $n$. That is, $f(n)$ is at most ten times larger than $g(n)$. We conclude that $f$ grows no faster than $g$ (and write $f=O(g)$ or $f\preceq g$).
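The chain of inequalities above can be spot-checked in a couple of lines (an illustrative snippet, assuming a sample range of $n$ up to $10{,}000$):

```python
# Illustrative check: f(n) = 2n^2 + 5n + 3 is at most 10 * g(n) = 10n^2
# for every positive integer n in the sampled range.
f = lambda n: 2 * n**2 + 5 * n + 3
g = lambda n: n**2
print(all(f(n) <= 10 * g(n) for n in range(1, 10001)))  # prints True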

Run the code below to generate the plot.

```python
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 100)
plt.plot(n, 2 * n ** 2 + 5 * n + 3, label=r'$2n^2+5n+3$')
plt.plot(n, n ** 2, label=r'$n^2$')
plt.plot(n, 10 * n ** 2, label=r'$10n^2$')
plt.legend(loc='upper left')
plt.savefig('output/to.png')
```

We can also plot the fraction $\frac{f(n)}{g(n)}$. Run the code below to generate a graph.

```python
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 100)
plt.plot(n, (2 * n ** 2 + 5 * n + 3) / (n ** 2))
plt.savefig('output/to.png')
```

This plot shows that (at least, in the considered range $1\leq n\leq 100$) the fraction $\frac{f(n)}{g(n)}$ is at most $10$ and, in fact, approaches $2$ as $n$ grows.
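The limiting value $2$ is easy to see algebraically: $\frac{2n^2+5n+3}{n^2} = 2 + \frac{5}{n} + \frac{3}{n^2}$, and the last two terms vanish as $n$ grows. A short illustrative snippet (not part of the lesson's plots) makes the convergence visible:

```python
# Illustrative: the ratio (2n^2 + 5n + 3) / n^2 = 2 + 5/n + 3/n^2
# tends to 2 as n grows.
for n in (10, 100, 10**4, 10**6):
    print(n, (2 * n**2 + 5 * n + 3) / n**2)
```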

Now, let’s compare $f(n)=11n$ and $g(n)=2n^2+5n+3$. First, let’s look at their plots by running the code below.

```python
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 100)
plt.plot(n, 11 * n, label=r'$11n$')
plt.plot(n, 2 * n ** 2 + 5 * n + 3, label=r'$2n^2+5n+3$')
plt.legend()
plt.savefig('output/to.png')
```

We notice that both functions grow (as $n$ grows) but $11n$ grows slower. This can be made formal as follows.

For two functions $f$ and $g$, we say that $f$ grows slower than $g$ (and write $f = o(g)$ or $f\prec g$) if the fraction $\frac{f(n)}{g(n)}$ goes to zero as $n$ grows.

Exercise break: Plot the fraction $\frac{11n}{2n^2+5n+3}$ to ensure that it goes to zero as $n$ grows.

Of course, if $f\prec g$ (equivalently, $f=o(g)$), then also $f\preceq g$ (equivalently, $f=O(g)$). In plain English: if $f$ grows slower than $g$, then certainly $f$ grows no faster than $g$.
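To see the $f\prec g$ definition in action on an even simpler pair than the exercise above, take $f(n)=n$ and $g(n)=n^2$: the ratio is $\frac{n}{n^2}=\frac{1}{n}$, which clearly goes to zero. A brief illustrative snippet:

```python
# Illustrative: f(n) = n grows slower than g(n) = n^2, i.e. n = o(n^2),
# since the ratio n / n^2 = 1/n goes to zero as n grows.
for n in (10, 1000, 10**6):
    print(n, n / n**2)
```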