When we analyze the efficiency of an algorithm, we often use the $O(n)$, $O(n^{2})$, and $O(\log n)$ notations.

The complexities $O(n)$ and $O(n^{2})$ are fairly simple to understand. Perhaps the easiest way to understand them is by considering the example of arrays.

Consider a 1-dimensional array:

`arr = [1, 2, 3, 4, 5, 6]`

Now, an example of an $O(n)$ algorithm would be to search for an element $e$ in `arr` by iterating over all of its elements with a `for` loop. In this technique, the worst case is that $e$ is at the very end of `arr` or not in it at all, which means we have to examine all $n$ elements to be sure whether $e$ is present in `arr` or not.

The code below implements the $O(n)$ algorithm in Python.

```python
def orderOfN(array, target):
    # Linear search: check each element in turn until the target is found.
    for number in array:
        if number == target:
            return True
    return False

def main():
    arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
           11, 12, 13, 14, 15, 16, 17, 18, 19, 20]

    targetVal = 20
    present = orderOfN(arr, targetVal)
    if present:
        print(targetVal, " is present in ", arr)
    else:
        print(targetVal, " is not present in ", arr)

    targetVal = 100
    present = orderOfN(arr, targetVal)
    if present:
        print(targetVal, " is present in ", arr)
    else:
        print(targetVal, " is not present in ", arr)

main()
```

In an $O(n^{2})$ algorithm, for each element in `arr`, the whole array is traversed again for a particular goal. If the number of elements $n$ increases linearly, the time to run the algorithm increases quadratically. One example of this is *bubble sort*. In this technique, the array is scanned repeatedly; on each pass, every element is compared with its neighbor, and the two are swapped if they are in the wrong (unsorted) order.

The general rule for finding the value of $c$ in an $O(n^{c})$ algorithm is to count the depth of its nested loops over the input: a block of $c$ nested loops, each running up to $n$ times, gives $O(n^{c})$.
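As a quick illustration of this rule, consider a function with three nested loops over the same input (the function `count_triples` below is our own example, not from the text); by the loop-counting rule it runs in $O(n^{3})$:

```python
def count_triples(array):
    # Three nested loops over the same input of size n,
    # so the total work is n * n * n = O(n^3).
    count = 0
    for a in array:
        for b in array:
            for c in array:
                if a + b + c == 0:
                    count += 1
    return count

print(count_triples([-1, 0, 1]))
```

With two nested loops the same reasoning would give $c = 2$, which is exactly the case of bubble sort below.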

The code below implements the bubble sort algorithm.

```python
def orderOfN2(array):
    # Bubble sort: repeatedly swap adjacent out-of-order elements.
    for i in range(len(array) - 1):
        # After each pass, the largest remaining element has
        # "bubbled up" to the end, so the inner loop can stop earlier.
        for j in range(len(array) - i - 1):
            if array[j] > array[j + 1]:
                t = array[j]
                array[j] = array[j + 1]
                array[j + 1] = t
    return array

array = [4, 1, 3, 10, -1, 0]
array = orderOfN2(array)
print(array)
```

In the case of $O(\log n)$ algorithms, the time to run the algorithm grows only linearly when the value of $n$ grows exponentially. An example of $O(\log n)$ on arrays would be any divide-and-conquer approach, such as *binary search*, which checks whether an element $e$ exists in a sorted array `arr`. In a binary search, the search space halves in every iteration, which makes the process much more efficient.

The program below implements binary search, which has a time complexity of $O(\log n)$.

```python
def binary_search(arr, low, high, target):
    # Recursive binary search on the sorted slice arr[low..high].
    if high >= low:
        mid = (high + low) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] > target:
            # Target is smaller than the middle: search the left half.
            return binary_search(arr, low, mid - 1, target)
        else:
            # Target is larger than the middle: search the right half.
            return binary_search(arr, mid + 1, high, target)
    else:
        return -1  # target not found

def main():
    arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
           11, 12, 13, 14, 15, 16, 17, 18, 19, 20]

    target = 20
    b = binary_search(arr, 0, len(arr) - 1, target)
    if b != -1:
        print(target, ' is present in ', arr)
    else:
        print(target, ' is not present in ', arr)

    target = 100
    b = binary_search(arr, 0, len(arr) - 1, target)
    if b != -1:
        print(target, ' is present in ', arr)
    else:
        print(target, ' is not present in ', arr)

main()
```
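To see the logarithmic growth concretely, a small sketch (the helper name `halving_steps` is ours, not part of the original program) can count how many times the search space halves before it shrinks to a single element; this is the worst-case number of binary search iterations:

```python
def halving_steps(n):
    # Count how many times n can be halved before reaching 1.
    # This equals floor(log2(n)), the worst-case depth of binary search.
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

for n in [16, 1024, 1_048_576]:
    # Squaring n only doubles the step count: logarithmic growth.
    print(n, halving_steps(n))
```

Going from an array of 1,024 elements to one of 1,048,576 elements multiplies $n$ by 1,024 but only doubles the number of halving steps, which is exactly what $O(\log n)$ predicts.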
