Algorithm Analysis (Big O)
Complexity
In examining algorithm efficiency we must understand the idea of complexity:
Space complexity
Time complexity
Space Complexity
When memory was expensive, we focused on making programs as space efficient as possible and developed schemes to make memory appear larger than it really was (virtual memory and memory paging schemes).
Common growth rates
Big Oh - Formal Definition
Definition of "big oh":
f(n) = O(g(n)) iff there exist constants c and n0 such that f(n) <= c * g(n) for all n >= n0
Thus, g(n) is an upper bound on f(n)
Note:
f(n) = O(g(n))
is NOT the same as
O(g(n)) = f(n)
The '=' is not the usual mathematical "=" operator (the relation is not symmetric)
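As a worked instance of the definition (an example of my own, using the 3n + 3 instruction count derived for the iterative power function later in these notes):
f(n) = 3n + 3 and g(n) = n
3n + 3 <= 4n for all n >= 3
so f(n) = O(n) with c = 4 and n0 = 3 (one of many valid choices of constants)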
Big-O Notation: Comparing Algorithms and ADT Data Structures
Algorithm Efficiency
a measure of the amount of resources consumed in solving a problem of size n
time
space
benchmarking – code the algorithm, run it with some specific input and measure the time taken (a timing sketch follows below)
better for measuring and comparing the performance of processors than for measuring and comparing the performance of algorithms
Big Oh (asymptotic analysis) provides a formula that associates n, the problem size, with t, the processing time required to solve the problem
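As a minimal sketch of what benchmarking might look like in practice (the placeholder algorithm, the function name SumToN, and the use of the standard C clock() routine are my own illustration, not part of the original notes):

#include <stdio.h>
#include <time.h>

/* Placeholder algorithm to benchmark: a simple linear-time sum. */
long SumToN (long n) {
    long total = 0;
    for (long i = 1; i <= n; i++)
        total += i;
    return total;
}

int main (void) {
    long n = 100000000L;         /* one specific input size */
    clock_t start = clock();     /* CPU time before the run */
    long result = SumToN(n);
    clock_t stop = clock();      /* CPU time after the run  */

    double seconds = (double)(stop - start) / CLOCKS_PER_SEC;
    printf("n = %ld, result = %ld, time = %f s\n", n, result, seconds);
    return 0;
}

The time reported depends on the processor, compiler, and system load, which is why the notes say benchmarking is better for comparing processors than for comparing algorithms.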
big Oh
measures an algorithm’s growth rate
how fast does the time required for an algorithm to execute increase as the size of the problem increases?
is an intrinsic property of the algorithm
independent of particular machine or code
based on number of instructions executed
for some algorithms is data-dependent
meaningful for “large” problem sizes
Computing x^n for n >= 0
iterative definition
x^n = x * x * x * ... * x (n times)
recursive definition
x^0 = 1
x^n = x * x^(n-1) (for n > 0)
another recursive definition
x^0 = 1
x^n = (x^(n/2))^2 (for n > 0 and n is even)
x^n = x * (x^(n/2))^2 (for n > 0 and n is odd, with integer division n/2)
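A minimal sketch of the first recursive definition as code (the name RecPow is my own, chosen to mirror the IterPow function on the next slide):

/* Recursive power: x^0 = 1, x^n = x * x^(n-1)  --  n multiplications, O(n) */
double RecPow (double X, int N) {
    if (N == 0)
        return 1;
    return X * RecPow(X, N - 1);
}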
Iterative Power function
double IterPow (double X, int N) {
    double Result = 1;      // 1
    while (N > 0) {         // n+1
        Result *= X;        // n   (critical region)
        N--;                // n
    }
    return Result;          // 1
}
Total instruction count: 3n+3
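For contrast with the 3n + 3 count above, here is a sketch of the halving recursive definition from the previous slide, whose running time grows as O(log2 N) because N is cut in half on every call (the name FastPow is my own):

/* Halving recursion: x^n = (x^(n/2))^2, times an extra x when n is odd -- O(log2 N) */
double FastPow (double X, int N) {
    if (N == 0)
        return 1;                      /* x^0 = 1 */
    double Half = FastPow(X, N / 2);   /* integer division: floor(N/2) */
    if (N % 2 == 0)
        return Half * Half;            /* n even: (x^(n/2))^2 */
    else
        return X * Half * Half;        /* n odd:  x * (x^(n/2))^2 */
}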
Common big Ohs
constant O(1)
logarithmic O(log2 N)
linear O(N)
n log n O(N log2 N)
quadratic O(N^2)
cubic O(N^3)
exponential O(2^N)
Comparing Growth Rates
[Figure: T(n) versus problem size for the growth rates 2^n, n^2, n log2 n, and log2 n]
An Experiment
[Table: execution times in seconds for 1,000,000 repetitions; data not reproduced]
Uses of big Oh
compare algorithms which perform the same function
search algorithms
sorting algorithms
comparing data structures for an ADT
each operation is an algorithm and has a big Oh
the data structure chosen affects the big Oh of the ADT's operations
Comparing algorithms
Sequential search: growth rate is O(n); average number of comparisons done is n/2
Binary search: growth rate is O(log2 n); average number of comparisons done is 2((log2 n) - 1)

n        n/2      2((log2 n) - 1)
100      50       12
500      250      16
1000     500      18
5000     2500     24
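A minimal sketch of the two searches with a comparison counter, so counts like those in the table can be reproduced approximately (function names and the counting scheme are my own; the array must already be sorted for the binary search):

/* Sequential search: O(n); on average about n/2 comparisons for a successful search. */
int SeqSearch (const int A[], int n, int key, long *comparisons) {
    for (int i = 0; i < n; i++) {
        (*comparisons)++;
        if (A[i] == key)
            return i;
    }
    return -1;                        /* not found */
}

/* Binary search on a sorted array: O(log2 n) comparisons. */
int BinSearch (const int A[], int n, int key, long *comparisons) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2; /* midpoint without overflow */
        (*comparisons)++;
        if (A[mid] == key)
            return mid;
        else if (A[mid] < key)
            lo = mid + 1;             /* key is in the upper half */
        else
            hi = mid - 1;             /* key is in the lower half */
    }
    return -1;                        /* not found */
}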