Big-O Notation: Analysis of Algorithms
(how fast does an algorithm's running time grow with respect to n?)
(Note: Best recollection is that a good bit of this document comes from C++ For You++, by Litvin & Litvin)
The time efficiency of almost all of the algorithms we have discussed can be characterized by only a few growth rate functions:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(a^n)

Examples of algorithms that take O(n^2) (quadratic) time:
1. Some of the simpler sorting algorithms, for instance a selection sort of n elements;
2. Comparing two two-dimensional arrays of size n by n;
3. Finding duplicates in an unsorted list of n elements, implemented with two nested loops (see the sketch after this list).
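As a concrete sketch of the third example above (the function name hasDuplicates and its signature are illustrative choices, not from the original handout), two nested loops over the same list perform roughly n*(n-1)/2 comparisons in the worst case, which is O(n^2):

    #include <cstddef>
    #include <vector>

    // Returns true if any value appears more than once in the list.
    // The outer loop runs n times and the inner loop up to n-1 times,
    // so the worst case performs about n*(n-1)/2 comparisons: O(n^2).
    bool hasDuplicates(const std::vector<int>& a)
    {
        for (std::size_t i = 0; i < a.size(); ++i)
            for (std::size_t j = i + 1; j < a.size(); ++j)
                if (a[i] == a[j])
                    return true;
        return false;
    }

Sorting the list first and then scanning for adjacent equal values would reduce this to O(n log n), which is one reason the sort-based growth rates appear earlier in the ordering above.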
The base of the logarithm in O(log n) does not need to be specified. As we know from the Change of Base Theorem, for any a, b > 0 with a, b ≠ 1,

    log_b n = log_a n / log_a b

Therefore,

    log_a n = C log_b n

where C is a constant equal to log_a b. Since C does not depend on n, O(log_a n) and O(log_b n) describe the same order of growth, which we write simply as O(log n).
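A quick numeric sanity check of this (a sketch of my own, not part of the original text): with a = 2 and b = 10 the constant is C = log_2 10, roughly 3.32, and the ratio log_2 n / log_10 n equals that same constant for every n:

    #include <cmath>
    #include <cstdio>

    int main()
    {
        // For every n, log2(n) / log10(n) is the same constant (about 3.32),
        // which is why the base of the logarithm is irrelevant in big-O notation.
        for (double n = 10; n <= 1e6; n *= 10)
            std::printf("n = %8.0f   log2(n)/log10(n) = %.4f\n",
                        n, std::log2(n) / std::log10(n));
        return 0;
    }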
A similar simplification applies to polynomials: only the highest power of n matters. If p(n) is a polynomial of degree k,

    p(n) = a_k n^k + a_(k-1) n^(k-1) + … + a_1 n + a_0

then

    p(n) = O(n^k)
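To see why only the leading term survives, here is a small sketch (the polynomial 3n^2 + 5n + 7 is just an illustrative choice of mine): as n grows, p(n)/n^2 settles toward the leading coefficient 3, so p(n) is bounded by a constant multiple of n^2 and therefore p(n) = O(n^2).

    #include <cstdio>

    int main()
    {
        // p(n) = 3n^2 + 5n + 7.  The ratio p(n)/n^2 approaches 3 as n grows,
        // so the lower-order terms 5n and 7 become negligible.
        for (double n = 10; n <= 1e5; n *= 10)
        {
            double p = 3 * n * n + 5 * n + 7;
            std::printf("n = %7.0f   p(n)/n^2 = %.4f\n", n, p / (n * n));
        }
        return 0;
    }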
Example:
{perform any