Trusted answers to developer questions

Educative Answers Team

Asymptotic notations describe an algorithm's running time by expressing how its performance grows as the input size, *n*, increases. There are *three* common asymptotic notations: **Big O**, **Big Theta**, and **Big Omega**.

Big O, commonly written as $O(n)$, is the asymptotic notation for the *worst case*. It provides an **upper bound** on the runtime of an algorithm. $O(n)$ is useful when you only have an upper bound on an algorithm's time complexity. Since an upper bound is easy to find just by inspecting the algorithm, $O(n)$ is the **most** commonly used of the three notations.
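As a minimal sketch, consider linear search: in the worst case the target is absent, so the loop inspects all *n* elements, giving an $O(n)$ upper bound.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent.

    Worst case: target is not in items, so every one of the n
    elements is inspected -> the runtime is O(n).
    """
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1
```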

Big Theta, commonly written as $\Theta(n)$, is the asymptotic notation for the *average case*. It provides a **tight bound** on the runtime of an algorithm: $\Theta(n)$ bounds a function from both above and below, so it defines the **exact** runtime behavior.

Big Omega, commonly written as $\Omega(n)$, is the asymptotic notation for the *best case*. It provides a **lower bound** on the runtime of an algorithm.
$\Omega(n)$ notation is useful when you have a lower bound on an algorithm's time complexity. Omega is the **least** used of the three notations, since an algorithm's best-case performance is generally **not** useful.
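To sketch a best case, here is a linear search instrumented to count comparisons (the counting is an illustrative addition, not part of the algorithm): when the target happens to be the first element, only one comparison is made, so the best case is $\Omega(1)$.

```python
def linear_search_with_count(items, target):
    """Linear search that also reports how many comparisons were made.

    Best case: target is the first element -> 1 comparison, Omega(1).
    Returns (index, comparisons), with index -1 if target is absent.
    """
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons
```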

RELATED TAGS

big o

big theta

big omega

Copyright ©2022 Educative, Inc. All rights reserved
