Time Complexity Order

Algorithms’ time complexity

The following table clarifies big-O representations of different time complexity functions.

| Name | Notation |
|---|---|
| Constant | O(1) |
| Logarithmic | O(log(n)) |
| Linear | O(n) |
| Quadratic | O(n^2) |
| Polynomial | O(n^c), where c is a constant greater than 1 |
| Exponential | O(c^n), where c is a constant greater than 1 |
| Factorial or N-power-N | O(n!) or O(n^n) |

The time taken by an algorithm to run can vary dramatically with the size of the input. Some algorithms finish in seconds or minutes on huge inputs, whereas others may take days to complete. To understand how the rate of growth differs among functions as the input size increases, the following table presents the approximate number of steps required to run an algorithm:

Growth rate of functions (approximate)

| N | O(1) | O(log n) | O(n) | O(n log n) | O(n^2) | O(n^3) | O(2^n) |
|---|---|---|---|---|---|---|---|
| 10 | 1 | 3 | 10 | 30 | 10^2 | 10^3 | 10^3 |
| 10^2 | 1 | 6 | 10^2 | 6×10^2 | 10^4 | 10^6 | 10^30 |
| 10^3 | 1 | 9 | 10^3 | 9×10^3 | 10^6 | 10^9 | 10^300 |
| 10^4 | 1 | 13 | 10^4 | 13×10^4 | 10^8 | 10^12 | 10^3000 |
| 10^5 | 1 | 16 | 10^5 | 16×10^5 | 10^10 | 10^15 | 10^30000 |
| 10^6 | 1 | 19 | 10^6 | 19×10^6 | 10^12 | 10^18 | 10^300000 |
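The entries above can be reproduced with a short sketch (an illustrative addition, not part of the lesson; the function name is ours). It computes the approximate step count for each complexity class at a given input size, using base-2 logarithms:

```python
import math

def growth_rates(n):
    """Approximate step counts for common complexity classes at input size n."""
    return {
        "O(1)": 1,
        "O(log n)": int(math.log2(n)),        # table rounds down
        "O(n)": n,
        "O(n log n)": int(n * math.log2(n)),
        "O(n^2)": n ** 2,
        "O(n^3)": n ** 3,
        "O(2^n)": 2 ** n,
    }

# For n = 100, this roughly reproduces the table row for N = 10^2:
# log n ~ 6, n log n ~ 6x10^2, n^2 = 10^4, n^3 = 10^6, 2^n ~ 10^30
print(growth_rates(100))
```

Note how the O(2^n) column explodes: doubling the exponent squares the count, which is why exponential algorithms become infeasible even for modest inputs.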

Growth of functions

Let’s look at these growth rates of functions in more detail to apply them easily in later lessons. The size of the input is represented as n.

Constant function O(1)

The number of steps required to execute an algorithm does not change with the change in the input size. The complexity of such algorithms is represented as O(1). For example:

  • Push and pop operations of a stack.
  • Returning the first element of a list.
  • Getting an element from a hash table.
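Each of these operations touches a fixed number of elements regardless of how large the input is. A minimal sketch (the helper name is illustrative):

```python
def get_first(items):
    """O(1): returns the first element, no matter how long the list is."""
    return items[0]

# A Python list used as a stack: push and pop both run in O(1).
stack = []
stack.append(42)   # push: O(1)
top = stack.pop()  # pop: O(1)
print(top, get_first([7, 8, 9]))
```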

Linear time O(n)

If the execution time of an algorithm is directly proportional to its input size, the algorithm runs in linear time. It is represented as O(n). Some examples of linear time are:

  • Searching for an element in an array.
  • Traversal of a linked list to find the maximum or minimum element.

Note: When we need to visit every node in a data structure to accomplish a task, the complexity is O(n).
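Both examples above visit each element once in the worst case. A sketch of each (plain lists stand in for the linked list here; the function names are illustrative):

```python
def linear_search(arr, target):
    """O(n): may inspect every element before finding target (or failing)."""
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

def find_max(values):
    """O(n): must visit every element to be certain of the maximum."""
    maximum = values[0]
    for value in values[1:]:
        if value > maximum:
            maximum = value
    return maximum
```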

Logarithmic time O(log(n))

If the execution time of an algorithm is proportional to the logarithm of the input size, it is said to run in logarithmic time. It is represented as O(log(n)). At each stage of such an algorithm, a large portion (e.g., half) of the input is pruned without being traversed. Binary search is a common example of logarithmic time.

Quadratic time O(n^2)

If the execution time of an algorithm is proportional to the square of the input size, it is said to run in quadratic time. It is represented as O(n^2). In such algorithms, each element is compared with every other element. For example, in two nested loops, if the outer loop runs n times and the inner loop also runs n times, then the body of the inner loop executes n^2 times.
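The nested-loop pattern can be sketched as follows (an illustrative example, not from the lesson): a naive duplicate check that compares every pair of elements.

```python
def has_duplicate(arr):
    """O(n^2): the inner loop runs n times for each of the outer loop's
    n iterations, so the comparison executes n^2 times in the worst case."""
    n = len(arr)
    for i in range(n):
        for j in range(n):
            if i != j and arr[i] == arr[j]:
                return True
    return False
```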

Exponential time O(2^n)

Exponential time is represented as O(2^n). Usually, in such algorithms, all possible subsets of the elements of the input data are generated. A common example is computing the power set.
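A power set sketch: an input of n elements has 2^n subsets, since each element is either in a given subset or not, so merely listing them all takes exponential time.

```python
def power_set(elements):
    """O(2^n): each element doubles the number of subsets generated so far."""
    subsets = [[]]
    for element in elements:
        subsets += [subset + [element] for subset in subsets]
    return subsets

# 3 elements -> 2^3 = 8 subsets
print(power_set([1, 2, 3]))
```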

Factorial time O(n!)

In such algorithms, all possible permutations of the elements of the input data are generated. Finding permutations is an example of a factorial time algorithm. It is represented as O(n!).
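A recursive permutation sketch: there are n choices for the first position, n-1 for the second, and so on, giving n! orderings in total.

```python
def permutations(elements):
    """O(n!): generates all n! orderings of the input list."""
    if len(elements) <= 1:
        return [elements]
    result = []
    for i, element in enumerate(elements):
        # Fix one element in front, then permute the rest recursively.
        rest = elements[:i] + elements[i + 1:]
        for perm in permutations(rest):
            result.append([element] + perm)
    return result

# 3 elements -> 3! = 6 permutations
print(permutations([1, 2, 3]))
```

Even for n = 20, n! exceeds 10^18, which is why factorial-time algorithms are only practical for very small inputs.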