# What are the differences between O(n), O(n^2), and O(log(n))?

When we analyze the efficiency of an algorithm, we often use the $O(n)$, $O(n^{2})$, and $O(log(n))$ notations.

The complexities $O(n)$ and $O(n^{2})$ are fairly simple to understand. Perhaps the easiest way to understand them is by considering the example of arrays.

### Examples

Consider a 1-dimensional array:

```python
arr = [1, 2, 3, 4, 5, 6]
```

#### $O(n)$

An example of an $O(n)$ algorithm is searching for an element $e$ in arr by iterating over all of its elements with a for loop. The worst case is that $e$ sits at the very end of arr (or is absent entirely), so we must examine all $n$ elements before we can say whether $e$ is present in arr.

### Code

The code below implements the $O(n)$ algorithm in Python.

```python
def orderOfN(array, target):
    # Scan every element until the target is found: O(n) in the worst case.
    for number in array:
        if number == target:
            return True
    return False

def main():
    arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]
    targetVal = 20
    present = orderOfN(arr, targetVal)

    if present:
        print(targetVal, " is present in ", arr)
    else:
        print(targetVal, " is not present in ", arr)

    targetVal = 100
    present = orderOfN(arr, targetVal)
    if present:
        print(targetVal, " is present in ", arr)
    else:
        print(targetVal, " is not present in ", arr)

main()
```
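To see the linear growth concretely, we can instrument the same search to count its comparisons. This is a small sketch; the function name `count_comparisons` is our own, not part of the code above.

```python
def count_comparisons(array, target):
    # Linear search that also reports how many comparisons it made.
    comparisons = 0
    for number in array:
        comparisons += 1
        if number == target:
            return comparisons
    return comparisons

# Worst case: the target is absent, so every element is checked.
print(count_comparisons(list(range(10)), -1))   # 10 comparisons
print(count_comparisons(list(range(20)), -1))   # 20 comparisons
```

Doubling the array size doubles the worst-case comparison count, which is exactly what $O(n)$ predicts.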

#### $O(n^{2})$

In an $O(n^{2})$ algorithm, the whole array arr is traversed once for each of its elements, so if the number of elements $n$ increases linearly, the running time increases quadratically. One example is bubble sort. In this technique, each pass over arr compares adjacent elements and swaps them if they are in the wrong (unsorted) order, and up to $n$ passes are needed to sort the whole array.

A rough rule of thumb for finding $c$ in $O(n^{c})$ algorithms is to count the loops in a nested block: if each loop runs on the order of $n$ times, the nesting depth equals the value of $c$.
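As a small illustration of this rule (the function name is our own), three nested loops that each run $n$ times perform $n^{3}$ steps in total, making the block $O(n^{3})$:

```python
def triple_loop(n):
    # Three nested loops, each running n times: n * n * n = n**3 steps.
    steps = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                steps += 1
    return steps

print(triple_loop(5))  # 125, i.e. 5**3
```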

### Code

The code below implements the bubble sort algorithm.

```python
def orderOfN2(array):
    # Bubble sort: two nested loops over the array give O(n^2) comparisons.
    for i in range(len(array) - 1):
        # After each pass, the largest remaining element has bubbled to the end.
        for j in range(len(array) - i - 1):
            if array[j] > array[j + 1]:
                # Swap adjacent elements that are out of order.
                array[j], array[j + 1] = array[j + 1], array[j]
    return array

array = [4, 1, 3, 10, -1, 0]
array = orderOfN2(array)
print(array)
```

#### $O(log(n))$

In the case of $O(log(n))$ algorithms, the time to perform the algorithm grows linearly while $n$ grows exponentially. An example on arrays is binary search, a divide-and-conquer technique for checking whether an element $e$ exists in a sorted array arr. In each iteration, binary search halves the search space, which makes it far more efficient than a linear scan.

### Code

The program below implements binary search, which has a time complexity of $O(log(n))$.

```python
def binary_search(arr, low, high, target):
    # Recursively search the sorted slice arr[low : high + 1].
    if high >= low:
        mid = (high + low) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] > target:
            return binary_search(arr, low, mid - 1, target)
        else:
            return binary_search(arr, mid + 1, high, target)
    else:
        return -1  # Sentinel: the target is not in the array.

def main():
    arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]
    target = 20
    b = binary_search(arr, 0, len(arr) - 1, target)

    if b != -1:
        print(target, ' is present in ', arr)
    else:
        print(target, ' is not present in ', arr)

    target = 100
    b = binary_search(arr, 0, len(arr) - 1, target)
    if b != -1:
        print(target, ' is present in ', arr)
    else:
        print(target, ' is not present in ', arr)

main()
```

Note that the function returns $-1$ (rather than False) when the target is absent: since a valid result can be index 0, which is falsy in Python, the caller must compare against the sentinel explicitly.
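We can also verify the logarithmic behavior empirically. The iterative sketch below (the function name `binary_search_steps` is our own) counts how many times the search space is halved for a target that is not in the array: doubling $n$ adds just one extra step.

```python
def binary_search_steps(arr, target):
    # Iterative binary search that counts the halving steps it performs.
    low, high, steps = 0, len(arr) - 1, 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if arr[mid] == target:
            return steps
        elif arr[mid] > target:
            high = mid - 1
        else:
            low = mid + 1
    return steps

# For a missing target, the step count grows by 1 each time n doubles.
for n in (16, 32, 64, 128):
    print(n, binary_search_steps(list(range(n)), -1))
```

This prints 4, 5, 6, and 7 steps for arrays of 16, 32, 64, and 128 elements: the step count tracks $log_2(n)$, not $n$ itself.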
