What are the common asymptotic notations?

Asymptotic notations describe how an algorithm’s running time grows as its input size, n, increases. There are three common asymptotic notations: Big O, Big Theta, and Big Omega.

Big O

Big O, commonly written as O(n), is an asymptotic notation for the worst case. It provides an upper bound on the runtime of an algorithm. O notation is useful when you only have an upper bound on an algorithm’s time complexity. Since an upper bound is usually easy to find just by inspecting the algorithm, Big O is the most commonly used of the three notations.
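
For instance, here is a minimal Python sketch of a linear search (the function name and sample values are chosen just for illustration). When the target is missing, every element is inspected, which is the worst case that the O(n) bound describes:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Worst case: the target is missing, so all n elements are inspected -> O(n).
print(linear_search([4, 8, 15, 16, 23, 42], 99))  # prints -1
```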

Big Theta

Big Theta, commonly written as Θ(n), is an asymptotic notation for the average case. It provides a tight bound on the runtime of an algorithm. Θ(n) bounds a function from both above and below, so it defines the exact growth rate of the runtime.
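
As a small illustrative sketch (names and values are invented for this example), summing a list always performs exactly n additions, with no early exit, so its runtime is Θ(n) in every case:

```python
def total(items):
    """Add up every element of items."""
    result = 0
    for value in items:  # no early exit: always n iterations
        result += value
    return result

# Best, average, and worst case all perform exactly n additions,
# so the runtime is bounded above and below by n -> Theta(n).
print(total([4, 8, 15, 16, 23, 42]))  # prints 108
```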

Big Omega

Big Omega, commonly written as Ω(n), is an asymptotic notation for the best case. It provides a lower bound on the runtime of an algorithm. Ω notation is useful when you only have a lower bound on an algorithm’s time complexity. Big Omega is the least used of the three notations, since an algorithm’s best-case performance is generally not very informative.
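
As one more illustrative sketch (again with invented names and values), a sortedness check can exit after a single comparison when the first two elements are already out of order, so its runtime is bounded below by Ω(1):

```python
def is_sorted(items):
    """Return True if items is in non-decreasing order."""
    for i in range(len(items) - 1):
        if items[i] > items[i + 1]:  # out-of-order pair found: stop early
            return False
    return True

# Best case: the first two elements are out of order, so only one
# comparison happens -> the runtime is bounded below by Omega(1).
print(is_sorted([9, 1, 2, 3]))  # prints False after a single comparison
```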
