Introduction
Asymptotic analysis refers to computing the running time of an operation in mathematical units of computation. Asymptotic notations are the mathematical tools used to represent the time complexity of algorithms in this analysis. The following three notations are most commonly used.
The Theta (Θ) notation bounds a function from both above and below, so it defines exact asymptotic behavior. A simple way to get the Theta notation of an expression is to drop the low-order terms and ignore the leading constant: for example, $3n^2 + 6n + 100$ is Θ($n^2$).
The Big O notation defines an upper bound of an algorithm; it bounds a function only from above. If we use O notation to represent the time complexity of Insertion Sort, we need two statements, one for the worst case and one for the best case:
The worst case time complexity of Insertion Sort is O($n^2$).
The best case time complexity of Insertion Sort is O(n). Note that O($n^2$) also covers linear time.
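A minimal sketch of insertion sort illustrates both bounds; the comments note which input shapes produce the best and worst cases (the function name and structure here are illustrative, not taken from the text):

```python
def insertion_sort(arr):
    # Worst case (reverse-sorted input): the inner while loop runs i times
    # for each i, giving O(n^2) comparisons in total.
    # Best case (already-sorted input): the while condition fails
    # immediately each time, giving O(n) total work.
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]  # shift larger elements right
            j -= 1
        arr[j + 1] = key
    return arr
```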
Just as Big O notation provides an asymptotic upper bound on a function, Ω notation provides an asymptotic lower bound.
O(1): The time complexity of a function (or set of statements) is considered O(1) if it contains no loops, no recursion, and no calls to any non-constant-time function.
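As a hypothetical illustration, a function that performs a fixed number of operations regardless of input size is O(1):

```python
def swap(a, b):
    # No loops, no recursion, no calls to non-constant-time functions:
    # a fixed number of operations, hence O(1).
    return b, a
```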
O(n): The time complexity of a loop is considered O(n) if the loop variable is incremented or decremented by a constant amount.
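A sketch of such a loop, counting its iterations (the helper name and the step parameter `c` are assumptions for illustration):

```python
def count_up(n, c=1):
    # The loop variable i is incremented by a constant amount c,
    # so the loop runs about n / c times: O(n).
    steps = 0
    i = 0
    while i < n:
        steps += 1
        i += c
    return steps
```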
O($n^c$): The time complexity of nested loops is proportional to the number of times the innermost statement is executed. For example, two nested loops that each run n times execute the innermost statement $n^2$ times, giving O($n^2$).
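A minimal sketch of the nested-loop case, counting innermost executions:

```python
def nested_steps(n):
    # Two nested loops, each running n times: the innermost statement
    # executes n * n times, so the time complexity is O(n^2).
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps
```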
O(Log(n)): The time complexity of a loop is considered O(Log(n)) if the loop variable is divided or multiplied by a constant amount.
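A sketch in which the loop variable is halved each iteration, so the iteration count grows logarithmically in n:

```python
def halving_steps(n):
    # The loop variable is divided by a constant (2) each iteration,
    # so the loop runs about log2(n) times: O(log n).
    steps = 0
    i = n
    while i > 1:
        i //= 2
        steps += 1
    return steps
```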
O(Log(Log(n))): The time complexity of a loop is considered O(Log(Log(n))) if the loop variable is reduced or increased exponentially by a constant amount (for example, repeatedly taking the square root, or repeatedly squaring).
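A sketch using the square-root variant: the loop variable shrinks exponentially each iteration, so the iteration count is about log(log(n)):

```python
import math

def sqrt_steps(n):
    # Taking the square root halves the number of bits of i each
    # iteration, so the loop runs about log(log(n)) times: O(log log n).
    steps = 0
    i = n
    while i > 2:
        i = math.isqrt(i)
        steps += 1
    return steps
```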
O(m+n): When there are consecutive loops, the time complexity is the sum of the time complexities of the individual loops.
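A sketch of the consecutive-loop case, with two loops that run one after the other rather than nested:

```python
def two_loops(m, n):
    # Two consecutive (not nested) loops: total work is m + n steps,
    # so the time complexity is O(m + n).
    steps = 0
    for i in range(m):
        steps += 1
    for j in range(n):
        steps += 1
    return steps
```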