The code snippets above are meant to solve the same problem (computing the square of a number), yet they end up taking different amounts of time.
It is difficult to use the empirical approach as a consistent measure of an algorithm's efficiency, since the results depend on:
• Specific processor speed
• Current processor load
• Specific data for a particular run of the program
• Input Size
• Input Properties
• Operating Environment
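For instance, a minimal timing harness like the sketch below (the square function and its input are hypothetical, chosen only for illustration) will report different figures across machines, loads, and runs:

    #include <chrono>
    #include <iostream>

    // Hypothetical function under test.
    long long square(long long n) { return n * n; }

    int main() {
        auto start = std::chrono::steady_clock::now();
        volatile long long result = square(12345);  // volatile keeps the call from being optimized away
        auto stop = std::chrono::steady_clock::now();
        // The measured duration varies with processor speed, current load, and environment.
        std::cout << "elapsed ns: "
                  << std::chrono::duration_cast<std::chrono::nanoseconds>(stop - start).count()
                  << '\n';
        return 0;
    }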
The better way is the theoretical approach: instead of examining and comparing the algorithm on different instances (inputs) and platforms, we mathematically count the operations executed or the storage space used.
Rules for Time Complexity Analysis
1. We assume an arbitrary time unit.
2. Execution of the following operations takes 1 time unit:
• Assignment Operation
• Single Input/Output Operation
• Single Boolean Operation
• Single Arithmetic Operation
• Function Return
3. Running time of a selection statement (if, switch) is the time for the condition evaluation +
the maximum of the running times for the individual clauses in the selection.
4. Running time for a loop is equal to the running time for the statements inside the loop *
number of iterations. The total running time of a statement inside a group of nested loops is
the running time of the statements multiplied by the product of the sizes of all the loops.
For nested loops, analyze inside out.
• Always assume that the loop executes the maximum number of iterations possible.
5. Running time of a function call is 1 for setup + the time for any parameter calculations +
the time required for the execution of the function body.
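To make rule 3 concrete, here is a small illustrative fragment (a sketch, not taken from the slides; x and y are assumed to be declared ints), counted with the unit costs above:

    if (x > 0)          // condition evaluation: 1 unit
        y = x + 1;      // 1 arithmetic + 1 assignment = 2 units
    else
        y = 0;          // 1 assignment = 1 unit
    // By rule 3, the total is 1 + max(2, 1) = 3 time units.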
Examples
1. int total(int n)
   {
       int sum = 0;
       for (int i = 1; i <= n; i++)
           sum = sum + 1;
       return sum;
   }

2. int count()
   {
       int k = 0;
       int n;                       // declared so the snippet compiles
       cout << "Enter an integer";
       cin >> n;
       for (int i = 0; i < n; i++)
           k = k + 1;
       return 0;
   }

3. void func()
   {
       int x = 0;
       int i = 0;
       int j = 1;
       int n;                       // declared so the snippet compiles
       cout << "Enter an Integer value";
       cin >> n;
       while (i < n) {
           x++;
           i++;
       }
       while (j < n) {
           j++;
       }
   }

4. int sum(int n)
   {
       int partial_sum = 0;
       for (int i = 1; i <= n; i++)
           partial_sum = partial_sum + (i * i * i);
       return partial_sum;
   }
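As a worked illustration, applying the rules to snippet 1 gives the following tally (one possible count; the exact constants depend on how the loop header is charged):

    int sum = 0;        → 1 unit (assignment)
    i = 1               → 1 unit (loop initialization)
    i <= n              → n + 1 units (n true tests, 1 false test)
    i++                 → 2n units (arithmetic + assignment, n times)
    sum = sum + 1       → 2n units (arithmetic + assignment, n times)
    return sum          → 1 unit

Total: T(n) = 1 + 1 + (n + 1) + 2n + 2n + 1 = 5n + 4, which grows linearly in n.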
Formal Approaches to Analysis
1. Loops: In general, a loop translates to a summation. The index and bounds of the
summation are the same as the index and bounds of the for loop.
2. Conditions: Compute the time for the condition evaluation plus the maximum of the running times of the branches.
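For example (an illustrative sketch, with k and n assumed declared), a nested loop translates directly into a double summation:

    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            k = k + 1;             // innermost statement

The innermost statement executes Σ(i=1..n) Σ(j=1..n) 1 = n·n = n² times, so the fragment's running time grows proportionally to n².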
Asymptotic Analysis
An asymptote is a line that a curve approaches ever more closely but may never actually touch; it keeps the curve within a bound.
Asymptotic notation represents the "fastest possible" and "slowest possible" running times of an algorithm using upper and lower bounds on its growth.
Asymptotic analysis distinguishes three cases, depending on the input:
1. Worst Case / Upper Bound: the amount of time the algorithm takes on the worst possible set of inputs.
   Big-Oh (O) and Little-oh (o). Our interest is usually in the worst case. Why?
2. Average Case: the amount of time the algorithm takes on an "average" set of inputs.
   Theta (θ)
3. Best Case / Lower Bound: the amount of time the algorithm takes on the best possible set of inputs.
   Big-Omega (Ω) and Little-omega (ω)
Big-Oh (O)
Big-Oh gives an asymptotic upper bound on a function; hence it is used to express the worst-case complexity of an algorithm.
Formally:
f(n) = O(g(n)) if there exist c, n0 ∊ ℛ+ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 ≥ 1.
O(g(n)) is thus the set of functions that, as n gets large, grow no faster than a constant times g(n).
Examples:
1. f(n) = 10n + 5 and g(n) = n. Show that f(n) is O(g(n)).
2. f(n) = 3n² + 4n + 1. Show that f(n) = O(n²).
Home Practice:
3. f(n) = 2ⁿ + n². Show that f(n) = O(2ⁿ).
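A sketch of example 1, with one possible choice of constants: for all n ≥ 1, 10n + 5 ≤ 10n + 5n = 15n, so taking c = 15 and n0 = 1 gives 0 ≤ f(n) ≤ c·g(n); hence f(n) = O(n). (Example 2 is similar: 3n² + 4n + 1 ≤ 3n² + 4n² + n² = 8n² for n ≥ 1, so c = 8 works.)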
Big-Omega (Ω)
Big-Omega gives an asymptotic lower bound on a function; hence it is used to express the best-case complexity of an algorithm.
Formally:
f(n) = Ω(g(n)) if there exist c, n0 ∊ ℛ+ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 ≥ 1.
Keeping the 0 in the inequality is important. Why?
Examples:
1. f(n) = 10n + 5 and g(n) = n. Show that f(n) is Ω(g(n)).
2. f(n) = 3n² + 4n + 1. Show that f(n) = Ω(n²).
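A sketch of example 1: for all n ≥ 1, 10n + 5 ≥ 10n, so taking c = 10 and n0 = 1 gives 0 ≤ c·g(n) ≤ f(n); hence f(n) = Ω(n). (For example 2, 3n² + 4n + 1 ≥ 3n² for n ≥ 1, so c = 3 works.)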
Theta (θ)
Theta gives an asymptotically tight bound: the function is sandwiched between two constant multiples of g(n). It is used here to denote the average-case complexity of an algorithm.
Formally:
f(n) = θ(g(n)) if there exist c1, c2, n0 ∊ ℛ+ such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 ≥ 1.
Examples:
1. f(n) = 10n + 5 and g(n) = n. Show that f(n) is θ(g(n)).
2. f(n) = 3n² + 4n + 1. Show that f(n) = θ(n²).
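A sketch of example 1, combining the two previous bounds: for all n ≥ 1, 10n ≤ 10n + 5 ≤ 15n, so c1 = 10, c2 = 15 and n0 = 1 satisfy the definition; hence f(n) = θ(n).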
Little-oh (o) and Little-omega (ω)
Little-oh (o) and little-omega (ω) also represent upper and lower bounds, but strictly.
The main difference from Big-Oh is that Big-Oh requires the bound to hold for some constant c, while little-oh requires it to hold for every constant c > 0.
Likewise, Big-Omega requires the bound to hold for some constant c, while little-omega requires it to hold for every constant c > 0.
Why? (Compare with θ.)
Formally:
1. Little-oh (o):
   f(n) = o(g(n)) means that for every c > 0 there exists n0 ≥ 1 such that f(n) < c·g(n) for all n ≥ n0.
2. Little-omega (ω):
   f(n) = ω(g(n)) means that for every c > 0 there exists n0 ≥ 1 such that f(n) > c·g(n) for all n ≥ n0.
Examples:
1. f(n) = 10n + 5 and g(n) = n. Show that f(n) is not o(g(n)).
2. f(n) = 3n² + 4n + 1. Show that f(n) is not ω(n²).
3. f(n) = 10n + 5 and g(n) = n². Show that f(n) is o(g(n)).
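A sketch of example 3, for one way of picking n0: given any c > 0, let n0 = max(2, 15/c). For n ≥ n0 we have c·n² ≥ 15n > 10n + 5, so f(n) < c·g(n); hence f(n) = o(n²). For example 1, no choice of n0 works for every c: with c = 1, the inequality 10n + 5 < n fails for all n ≥ 1, so f(n) is not o(n).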
For a divide-and-conquer recurrence of the form T(n) = a·T(n/b) + O(n^k):
T(n) = O(n^(log_b a)) if a > b^k