
Other Common Asymptotic Notations and Why Big O Trumps Them

Learn to identify and differentiate various asymptotic notations such as Big Omega, Big Theta, little o, and little omega. Understand their mathematical definitions and use cases in algorithm analysis, and discover why Big O notation is most commonly applied to describe worst-case complexity in coding interviews.

Big ‘Omega’ - $\Omega(\cdot)$

Mathematically, a function $f(n)$ is in $\Omega(g(n))$ if there exist a real constant $c > 0$ and an $n_0 > 0$ such that $f(n) \geq c \cdot g(n)$ for all $n \geq n_0$. In other words, for sufficiently large values of $n$, $f(n)$ grows at least as fast as $g(n)$.
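As a quick worked example (the particular functions and witness constants below are an illustrative choice, not taken from this lesson), here is how the definition certifies that $4n^2 + n \in \Omega(n^2)$:

```latex
% Illustrative example of the Big Omega definition
\begin{aligned}
f(n) &= 4n^2 + n, \qquad g(n) = n^2 \\
\text{Pick } c &= 4 \text{ and } n_0 = 1. \\
\text{For all } n \geq n_0:\quad 4n^2 + n &\geq 4n^2 = c \cdot g(n) \\
\therefore\ 4n^2 + n &\in \Omega(n^2)
\end{aligned}
```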

It is a common misconception that Big O characterizes the worst-case running time, while Big Omega characterizes the best-case running time of an algorithm. In reality, there is no one-to-one relationship between the cases (best, average, worst) and the asymptotic notations: any of the notations can be used to bound the running time of any case.
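For example, consider a plain linear search (a minimal sketch; the function name and sample inputs below are illustrative, not from this lesson). Each of its cases can be described with Big O, Big Omega, or Big Theta:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Best case: target is at index 0, so only one comparison is made.
# This case is O(1), Omega(1), and Theta(1).
print(linear_search([7, 3, 5], 7))   # -> 0

# Worst case: target is absent, so all n elements are examined.
# This case is O(n), Omega(n), and Theta(n).
print(linear_search([7, 3, 5], 9))   # -> -1
```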

The following graph shows an example of functions $f(n)$ and $g(n)$ that have a Big Omega relationship.

A quick quiz

A quick quiz on Big Omega!

1. $n^3 \in \Omega(1)$

A. True

B. False

Did you know that if $f(n) \in O(g(n))$, then $g(n) \in \Omega(f(n))$? Try proving it!
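If you get stuck, here is one possible sketch of the argument using the definitions above (the witness constants shown are just one valid choice):

```latex
% Sketch: Big O and Big Omega are duals of each other.
% Assume f(n) \in O(g(n)), i.e. f(n) \leq c \cdot g(n) for all n \geq n_0, with c > 0.
\begin{aligned}
f(n) \leq c \cdot g(n)
  \;&\Longrightarrow\; g(n) \geq \tfrac{1}{c} \cdot f(n) \quad \text{for all } n \geq n_0 \\
  \;&\Longrightarrow\; g(n) \in \Omega(f(n)) \quad \text{with witnesses } c' = \tfrac{1}{c},\ n_0' = n_0
\end{aligned}
```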

Big ‘Theta’ - $\Theta(\cdot)$

If f ...