When analyzing algorithms, we analyze the order of growth of an algorithm in proportion to the size of its input data. Below is an example of three algorithms and their respective run times.
From the diagram we can see that logarithmic algorithms scale better with the number of items to process than algorithms with linear or exponential time complexity.
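The scaling difference can be sketched with simple step-count formulas. This is an illustration, not the diagram's data: the formulas and function names below are assumptions standing in for a typical logarithmic, linear, and exponential algorithm.

```python
import math

# Illustrative step counts for logarithmic, linear, and exponential
# algorithms (assumed formulas for the sketch, not measured run times).
def log_steps(n):
    return math.ceil(math.log2(n))   # e.g. binary search

def lin_steps(n):
    return n                         # e.g. a linear scan

def exp_steps(n):
    return 2 ** n                    # e.g. brute-forcing all subsets

for n in (8, 16, 32):
    print(f"n={n}: log={log_steps(n)}, linear={lin_steps(n)}, exp={exp_steps(n)}")
```

Doubling n adds one step to the logarithmic count, doubles the linear count, and squares the exponential count.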
The RAM (Random Access Machine) model is a common model for analyzing algorithms. The RAM model makes the following assumptions:
- The RAM model assumes that there is infinite memory (no memory constraints).
- Each operation takes a unit of time.
- All data is equally accessible, regardless of data set size.

(In reality, big data sets might not fit in RAM, and accessing the data would take more time.)
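Under the unit-cost assumption above, analyzing an algorithm reduces to counting operations. A minimal sketch, assuming a simple summation loop and charging one unit per assignment or addition:

```python
# Sketch of the RAM model's unit-cost assumption: every assignment
# and addition counts as one step, regardless of data size.
def sum_with_cost(values):
    cost = 1              # 'total = 0' is one unit of time
    total = 0
    for v in values:
        cost += 2         # per element: one addition, one assignment
        total += v
    return total, cost

# cost grows linearly with len(values): 1 + 2*n steps
```

The exact charging scheme is a modeling choice; what matters for the analysis is that the cost grows linearly with the input size.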
Time complexity of an algorithm is expressed in terms of O(...). Big O notation gives a mathematical upper bound on the performance of an algorithm. Take for example an algorithm that has time complexity T(n) = 2n^2 + 2n + 3.
The upper bound of this algorithm can be shown by cn^2, where c is a constant. For some value of c, and for all n at or beyond some value n0, we have T(n) <= cn^2; past that point the algorithm will never perform worse than cn^2. For example, with c = 3 the inequality 2n^2 + 2n + 3 <= 3n^2 holds for all n >= 3. Since T(n) is bounded above by cn^2, T(n) has time complexity of O(n^2).
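The bound can also be checked numerically. A minimal sketch, using c = 3 and n0 = 3 as one valid choice of constants (many other pairs work):

```python
# Checking the bound T(n) <= c * n^2 for T(n) = 2n^2 + 2n + 3,
# with c = 3 and n0 = 3 as one valid choice of constants.
def T(n):
    return 2 * n**2 + 2 * n + 3

c, n0 = 3, 3
# The bound holds for every n at or beyond n0 ...
assert all(T(n) <= c * n**2 for n in range(n0, 10_000))
# ... but fails just below it: T(2) = 15 > 3 * 2^2 = 12
assert T(2) > c * 2**2
```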
The definition of Big O, O(f(n)), is that there exist a positive constant C and a value n0 such that T(n) <= C(f(n)) for all n >= n0.

| O(…) | Definition of O(…) |
| ---- | ------------------ |
| O(n) | T(n) <= C(n) for all n >= n0 |
| O(n^2) | T(n) <= C(n^2) for all n >= n0 |
The table below shows three algorithms and whether each has a time complexity that can be described by O(n) or O(n^2).

| T(n) | c | n0 | cn | cn^2 | T(n) <= C(n^2)? | T(n) <= C(n)? |
| ---- | - | -- | -- | ---- | --------------- | ------------- |
Note: n0 is "n naught", the value of n at and beyond which the upper bound holds.
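The table's yes/no columns can be mirrored with a small helper that tests T(n) <= c * f(n) over a range of n. The name `bounded_by`, the constants, and the test range are assumptions for this sketch:

```python
# Helper mirroring the table's checks: does T(n) <= c * f(n) hold
# for all tested n at or beyond n0?
def bounded_by(T, f, c, n0, n_max=10_000):
    return all(T(n) <= c * f(n) for n in range(n0, n_max))

T = lambda n: 2 * n**2 + 2 * n + 3
assert bounded_by(T, lambda n: n**2, c=3, n0=3)    # so T(n) is O(n^2)
# No constant works for a linear bound; even a generous c = 100 fails
assert not bounded_by(T, lambda n: n, c=100, n0=1)
```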
log2(n) <= sqrt(n) <= n <= n log2(n) <= n^2 <= 2^n <= n! <= ...
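The ordering above can be spot-checked at a sample size. Here n = 64 is an arbitrary choice; the inequalities hold for all sufficiently large n, not just this sample:

```python
import math

# Spot-checking the growth-rate ordering at n = 64.
n = 64
values = [
    math.log2(n),       # logarithmic
    math.sqrt(n),       # square root
    n,                  # linear
    n * math.log2(n),   # linearithmic
    n**2,               # quadratic
    2**n,               # exponential
    math.factorial(n),  # factorial
]
assert values == sorted(values)
```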