
Functions in Asymptotic Notation

Explore how asymptotic notation expresses the growth rates of algorithm running times in relation to input size. Understand constant, logarithmic, polynomial, and exponential time complexities, and learn why logarithm bases are interchangeable in this context. This lesson clarifies orders of growth so you can better analyze algorithm efficiency.


When we use asymptotic notation to express the rate of growth of an algorithm's running time in terms of the input size $n$, it's good to bear a few things in mind.

Let's start with something easy. Suppose that an algorithm took a constant amount of time, regardless of the input size. For example, if you were given an array that is already sorted into increasing order and you had to find the minimum element, it would take constant time, since the minimum element must be at index $0$. Since we like to use a function of $n$ in asymptotic notation, you could say that this algorithm runs in $\Theta(n^0)$ time. Why? Because $n^0 = 1$, and the algorithm's running time is within some constant factor of $1$. In practice, we don't write $\Theta(n^0)$, however; we write $\Theta(1)$.
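
As a minimal sketch of that constant-time lookup (the function name `min_of_sorted` is ours, not from the lesson):

```python
def min_of_sorted(arr):
    """Return the minimum of a non-empty list sorted in increasing order.

    Runs in Theta(1) time: a single index access, regardless of len(arr).
    """
    return arr[0]

print(min_of_sorted([3, 5, 8, 13]))           # 3
print(min_of_sorted(list(range(1_000_000))))  # 0 -- still one operation
```

No matter how large the array grows, the function performs the same single step, which is exactly what $\Theta(1)$ captures.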

Now suppose an algorithm took $\Theta(\log_{10} n)$ time. You could also say that it took $\Theta(\lg n)$ time (that is, $\Theta(\log_2 n)$ ...
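
The reason the base is interchangeable comes down to the change-of-base formula; as a quick worked step:

```latex
% Change of base: logarithms in any two bases differ by a constant factor.
\log_{10} n \;=\; \frac{\log_2 n}{\log_2 10}
            \;=\; \underbrace{\frac{1}{\log_2 10}}_{\approx\, 0.301} \cdot \log_2 n
```

Because $1/\log_2 10$ is just a constant, $\log_{10} n$ and $\log_2 n$ are always within a constant factor of each other, and asymptotic notation absorbs constant factors: $\Theta(\log_{10} n) = \Theta(\log_2 n)$.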