Asymptotic notations are descriptions that allow us to analyze an algorithm’s running time by expressing how its performance grows as the input size, n, increases. There are three common asymptotic notations: Big O, Big Theta, and Big Omega.
Big O, commonly written as O(n), is an asymptotic notation for the worst case. It provides an upper bound for the runtime of an algorithm. O(n) is useful when you only have an upper bound on the time complexity of an algorithm. Since you can easily find an upper bound just by looking at an algorithm, Big O is the most commonly used of the three notations.
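As a rough illustration (a hypothetical sketch, not part of the original text), consider a linear search in Python: in the worst case the target is missing, so the loop inspects all n elements, and the running time is O(n).

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i        # may stop early when the target is found
    return -1               # worst case: every element was checked

# Worst case: the target is missing, so the loop runs n times -> O(n).
print(linear_search([4, 8, 15, 16, 23, 42], 99))  # -1 after scanning all 6 elements
```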
Big Theta, commonly written as Θ(n), is an asymptotic notation for the tight bound. It provides a tight bound for the runtime of an algorithm: Θ(n) bounds a function from above and below, so it defines the exact runtime behavior.
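A small sketch of a tight bound (again a hypothetical example): summing a list always visits every element exactly once, so its running time is at most and at least proportional to n, i.e. Θ(n).

```python
def total(items):
    """Sum all elements; every element is visited exactly once."""
    result = 0
    for value in items:
        result += value
    return result

# The loop always executes n times regardless of the values,
# so the running time is bounded above and below by c*n: Theta(n).
print(total([4, 8, 15, 16, 23, 42]))  # 108
```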
Big Omega, commonly written as Ω(n), is an asymptotic notation for the best case. It provides a lower bound for the runtime of an algorithm. Ω(n) can be useful when you have a lower bound on the time complexity of an algorithm. Omega notation is the least used of the three since the best-case performance of an algorithm is generally not useful.
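Continuing the hypothetical linear-search sketch from the Big O example above: if the target happens to be the very first element, the search finishes after a single comparison, so its best-case running time is Ω(1). As the text notes, this bound says little about typical performance.

```python
def linear_search(items, target):
    """Same illustrative linear search as in the Big O sketch above."""
    for i, value in enumerate(items):
        if value == target:
            return i        # best case: target is the first element
    return -1

# Best case: the target sits at index 0, so only one comparison happens.
# That gives a lower bound of Omega(1) -- true, but rarely informative.
print(linear_search([42, 8, 15], 42))  # 0, found immediately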