Introduction to Asymptotic Analysis and Big O

In this lesson, you will learn about asymptotic notation, an important tool for analyzing algorithms.

You have seen that the time complexity of an algorithm can be expressed as a polynomial. To compare two algorithms, you can compare their respective polynomials. However, the analysis performed in the previous lessons is cumbersome and would become intractable for the bigger algorithms normally encountered in practice.

Asymptotic analysis

One observation that may help is that you only need to worry about large input sizes. If the input size is really small, how bad can a poorly designed algorithm get, right? Mathematicians have a tool for this sort of analysis called asymptotic notation, which compares two functions, say f(n) and g(n), for large values of n. This fits in nicely with your need to compare algorithms for very large input sizes.

Big O notation

One of the asymptotic notations is the “Big O” notation. A function f(n) is considered O(g(n)), read as “Big Oh of g(n)”, if there exist a positive real constant c and an integer n_0 > 0 such that the following inequality holds for all n ≥ n_0:

f(n) ≤ c·g(n)

The following graph shows a plot of a function f(n) and c·g(n) that demonstrates this inequality.

Note that the above inequality does not have to hold for all n. For n < n_0, f(n) is allowed to exceed c·g(n), but for all values of n beyond n_0, f(n) is not allowed to exceed c·g(n).

What good is this? It tells you that for large values of n, f(n) will be within a constant factor of g(n) at most. In other words, f(n) will not grow faster than a constant multiple of g(n). Put yet another way, the rate of growth of f(n) is within a constant factor of that of g(n).
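The definition can be spot-checked numerically. The sketch below picks an illustrative f, g, and witness pair (c, n_0) that are assumptions for this example, not taken from the lesson, and checks the inequality over a finite range of n (a sanity check, not a proof):

```python
# Informal numeric check of the Big O inequality f(n) <= c * g(n) for n >= n0.
# f, g, c, and n0 are illustrative choices, not derived from the lesson's example.

def f(n):
    return 2 * n + 3          # a sample running-time function

def g(n):
    return n                  # candidate bound: f is claimed to be O(n)

c, n0 = 5, 1                  # witness constants for the claim

# The inequality must hold for every n >= n0; spot-check a range of values.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

A finite check like this can only refute a bad (c, n_0) pair; confirming the claim for all n still requires the algebraic argument shown in the next section.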

People tend to write f(n) = O(g(n)), which isn’t technically accurate. Many functions can satisfy the O(g(n)) conditions; thus, O(g(n)) is a set of functions. It is fine to say that f(n) belongs to O(g(n)).

Example

Consider an algorithm whose running time is given by f(n) = 3n^3 + 4n + 2. Let’s verify that this algorithm’s time complexity is in O(n^3). To do this, you need to find a positive constant c and an integer n_0 > 0 such that for all n ≥ n_0:

3n^3 + 4n + 2 ≤ cn^3

A helpful step is to bound the left-hand side by 3n^3 + 4n^3 + 2n^3. What you have just done is replace the variable part of every term with n^3, the variable part of the highest-order term. This gives us:

3n^3 + 4n + 2 ≤ 3n^3 + 4n^3 + 2n^3 = 9n^3

This does not violate the inequality because, for n ≥ 1, each term on the right-hand side is greater than or equal to the corresponding term on the left-hand side. Now, comparing with the definition of Big O, you can pick c = 9.

That leaves n_0. For what values of n is the inequality 3n^3 + 4n + 2 ≤ 9n^3 satisfied? The term-by-term comparison above holds for every n ≥ 1, so you can pick n_0 = 1.

The above solution (c = 9, n_0 = 1) is not unique. You could have picked any value for c that exceeds the coefficient of the highest power of n in f(n). Suppose you decided to pick c = 4. You can verify that the inequality 3n^3 + 4n + 2 ≤ 4n^3 still holds for n_0 = 3 or higher.
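Both witness pairs from this example can be spot-checked numerically over a finite range (a sanity check of the algebra above, not a proof for all n):

```python
# Spot-check the two (c, n0) witness pairs from the worked example:
# f(n) = 3n^3 + 4n + 2 should satisfy f(n) <= c * n^3 for all n >= n0.

def f(n):
    return 3 * n**3 + 4 * n + 2

assert all(f(n) <= 9 * n**3 for n in range(1, 10_000))   # c = 9, n0 = 1
assert all(f(n) <= 4 * n**3 for n in range(3, 10_000))   # c = 4, n0 = 3

# For c = 4 the inequality genuinely fails below n0 = 3:
assert f(2) > 4 * 2**3   # 34 > 32
```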

Note that it is not possible to find constants c and n_0 to show that f(n) = 3n^3 + 4n + 2 is O(n^2) or O(n). It is possible to show that f(n) is O(n^4), O(n^5), or any higher power of n. Mathematically, it is correct to say that 3n^3 + 4n + 2 is O(n^4); it is just a loose bound on the asymptotic running time of the algorithm. When dealing with time and space complexities, you should generally be interested in the tightest possible asymptotic bound.

Note: All challenges in this chapter should be answered with the tightest (lowest-order) Big O bound.

Suppose algorithms A and B have running times of O(n) and O(n^2), respectively. For sufficiently large input sizes, algorithm A will run faster than algorithm B. That does not mean that algorithm A will always run faster than algorithm B.
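To see why, consider hypothetical step counts (chosen here purely for illustration): suppose A takes 100n steps and B takes n^2 steps. B is actually faster for small inputs; A only wins beyond a crossover point:

```python
# Hypothetical running times chosen for illustration: algorithm A takes 100*n
# steps (O(n)) and algorithm B takes n*n steps (O(n^2)).

def time_a(n):
    return 100 * n

def time_b(n):
    return n * n

assert time_b(10) < time_a(10)       # small input: the O(n^2) algorithm is faster
assert time_a(1000) < time_b(1000)   # large input: the O(n) algorithm is faster

# Find the crossover: the smallest n where A is at least as fast as B.
crossover = next(n for n in range(1, 10_000) if time_a(n) <= time_b(n))
print(crossover)  # 100
```

Big O only guarantees the large-n behavior; below the crossover, the asymptotically worse algorithm can be the better choice.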

Now suppose both algorithms A and B have running time O(n). The execution times of these algorithms, for very large input sizes, will be within constant factors of each other. For all practical purposes, they are considered equally good.

Simplified asymptotic analysis

Once you have obtained the time complexity of an algorithm by counting the number of primitive operations, as discussed in the previous two lessons, you can arrive at the Big O notation for the algorithm by:

  • Dropping the multiplicative constants from all terms
  • Dropping all but the highest-order term

Thus, n^2 + 2n + 1 is O(n^2), while n^5 + 4n^3 + 2n + 43 is O(n^5).
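The two simplification rules are mechanical enough to sketch in code. The helper below is a toy illustration (not part of the lesson) that takes a polynomial as an exponent-to-coefficient mapping and returns its Big O class:

```python
# A toy helper that applies the two simplification rules to a polynomial
# given as {exponent: coefficient}: drop constants, keep the highest-order term.

def big_o(poly):
    # Ignore terms with zero coefficients, take the highest exponent,
    # and discard its multiplicative constant.
    k = max(e for e, coef in poly.items() if coef != 0)
    return "O(1)" if k == 0 else "O(n)" if k == 1 else f"O(n^{k})"

print(big_o({2: 1, 1: 2, 0: 1}))         # n^2 + 2n + 1        -> O(n^2)
print(big_o({5: 1, 3: 4, 1: 2, 0: 43}))  # n^5 + 4n^3 + 2n + 43 -> O(n^5)
```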

Notice how the constant coefficients have become insignificant in the Big O notation. Recall that these constants represent the number of primitive operations on a given line of code. This means that while analyzing code, counting a line of code as contributing four primitive operations is as good as counting it as one primitive operation. The only aspect that matters is correctly counting the number of times each line of code is repeated.

Moving forward, you will simplify the analysis by counting the number of executions of each line of code instead of counting the number of operations.

Sometimes the constant coefficients do become important. For example, consider algorithms A and B with worst-case running times of 100000n + 4 and 10n + 6, respectively. Asymptotically, both are O(n). However, the worst-case running time of algorithm B is numerically far better than that of A.

A comparison of some common functions

It is easy to work with simple polynomials in n, but when the time complexity involves other types of functions, like log(n), you may find it hard to identify the “highest-order term.” The following table lists some commonly encountered functions in ascending order of rate of growth (down the left column, then down the right). Any function in this table is Big O of every function that appears later in that order.

Function       Name            Function   Name
Any constant   Constant        n^2        Quadratic
log n          Logarithmic     n^3        Cubic
log^2 n        Log-square      n^4        Quartic
√n             Root-n          2^n        Exponential
n              Linear          e^n        Exponential
n log n        Linearithmic    n!         n-Factorial

The following graph shows some of the functions from the above table.
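You can also get a feel for the ordering by tabulating the functions at a fixed input size; even at a modest n the separation is dramatic (the choice n = 20 here is arbitrary):

```python
# Tabulate the growth of some functions from the table above at n = 20.
import math

n = 20
rows = [
    ("log n",   math.log2(n)),
    ("sqrt n",  math.sqrt(n)),
    ("n",       n),
    ("n log n", n * math.log2(n)),
    ("n^2",     n * n),
    ("n^3",     n ** 3),
    ("2^n",     2 ** n),
    ("n!",      math.factorial(n)),
]
for name, value in rows:
    print(f"{name:>8}  {value:,.0f}")

# Each row is larger than the one before it, matching the table's order.
values = [v for _, v in rows]
assert values == sorted(values)
```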

A quick quiz on Big O!

1. e^{3n} is in O(e^n).

A) True

B) False
