Functions in Asymptotic Notation

When we use asymptotic notation to express the rate of growth of an algorithm's running time in terms of the input size $n$, it's good to bear a few things in mind.

Let's start with something easy. Suppose that an algorithm took a constant amount of time, regardless of the input size. For example, if you were given an array that is already sorted into increasing order and you had to find the minimum element, it would take constant time, since the minimum element must be at index $0$. Since we like to use a function of $n$ in asymptotic notation, you could say that this algorithm runs in $\Theta(n^0)$ time. Why? Because $n^0 = 1$, and the algorithm's running time is within some constant factor of $1$. In practice, we don't write $\Theta(n^0)$, however; we write $\Theta(1)$.
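
As a concrete illustration, here is a minimal sketch of such a constant-time operation. The function name `min_of_sorted` is hypothetical, chosen just for this example:

```python
def min_of_sorted(arr):
    """Return the minimum of an array sorted into increasing order.

    Because the array is sorted, the minimum is always at index 0,
    so this takes constant time -- Theta(1) -- no matter how large
    the array is.
    """
    return arr[0]

print(min_of_sorted([3, 7, 9, 12]))  # prints 3
```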

Now suppose an algorithm took $\Theta(\log_{10} n)$ time. You could also say that it took $\Theta(\lg n)$ time (that is, $\Theta(\log_2 n)$ time). Whenever the base of the logarithm is a constant, it doesn't matter what base we use in asymptotic notation. Why not? Because of the change-of-base formula, which says that for positive constants $a$ and $b$, $\log_a n = \dfrac{\log_b n}{\log_b a}$. Since $\log_b a$ is a constant when $a$ and $b$ are constants, logarithms in any two constant bases differ by only a constant factor, and asymptotic notation absorbs constant factors.
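
To see that constant-factor relationship numerically, here is a small sketch (not from the original text) comparing base-10 and base-2 logarithms:

```python
import math

# For any n, log10(n) / log2(n) equals log10(2) -- a constant,
# about 0.30103 -- regardless of n. So Theta(log10 n) and
# Theta(log2 n) describe the same rate of growth.
for n in [10, 1_000, 1_000_000]:
    ratio = math.log10(n) / math.log2(n)
    print(n, ratio)  # the ratio is the same constant for every n
```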
