02 Asymptotes


Lecture

Asymptotic Notation
Dr. Ashish Sharma
Math Background: Review & Beyond
1. Asymptotic notation

2. Math used in asymptotics

3. Recurrences

4. Probabilistic analysis

Asymptotes: Why?
How to describe an algorithm’s running time?
(or space, …)

How does the running time depend on the input?


T(x) = running time for instance x

Problem: Impractical to use, e.g.,

“15 steps to sort [3 9 1 7], 13 steps to sort [1 2 0 3 9], …”

Need to abstract away from the individual instances.

Asymptotes: Why?
Standard solution: Abstract based on size of input.

How does the running time depend on the input?


T(n) = running time for instances of size n

Problem: Time also depends on other factors.


E.g., on the sortedness of the array.

Asymptotes: Why?
Solution: Provide a bound over these instances.

Worst case T(n) = max{ T(x) | x is an instance of size n }  (most common; the default)

Best case T(n) = min{ T(x) | x is an instance of size n }
Average case T(n) = Σ|x|=n Pr{x} · T(x)

Determining the input probability distribution can be difficult.
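As an illustration (a brute-force sketch, not from the lecture), all three measures can be computed exactly for small n: run insertion sort on every permutation of size n, count key comparisons, and take the max, min, and uniform average. The helper function below is hypothetical.

```python
# Brute-force worst/best/average case over ALL instances of size n,
# assuming a uniform distribution over permutations (Pr{x} = 1/n!).
from itertools import permutations

def insertion_sort_comparisons(a):
    """Sort a copy of `a`, returning the number of key comparisons made."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key vs a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 4
counts = [insertion_sort_comparisons(p) for p in permutations(range(n))]
worst = max(counts)                   # max over all instances of size n
best = min(counts)                    # min over all instances of size n
average = sum(counts) / len(counts)   # Σ Pr{x} · T(x) with uniform Pr{x}
print(worst, best, average)
```

For n = 4 the worst case is 6 = n(n−1)/2 comparisons (reversed input) and the best case is 3 = n−1 (already-sorted input), with the average strictly in between.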

Asymptotes: Why?
What’s confusing about this notation?

Worst case T(n) = max{ T(x) | x is an instance of size n }

Best case T(n) = min{ T(x) | x is an instance of size n }
Average case T(n) = Σ|x|=n Pr{x} · T(x)

Two different kinds of functions:

T(instance)  vs.  T(size of instance)

We won't use the T(instance) notation again, so it can be ignored.


Asymptotes: Why?
Problem: T(n) = 3n² + 14n + 17

Too much detail: the constants may reflect implementation details, and the lower-order terms are insignificant.

Solution: Ignore the constants & low-order terms.

    n         3n²            14n+17
    1         3              31
    10        300            157
    100       30,000         1,417
    1,000     3,000,000      14,017
    10,000    300,000,000    140,017

3n² > 14n+17 for all “large enough” n.
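How large is “large enough” here? A quick numeric check (an illustrative sketch, not part of the slides) finds the crossover point:

```python
# Find the smallest n at which 3n^2 exceeds 14n + 17.
# The quadratic 3n^2 - 14n - 17 has its positive root near n ≈ 5.67,
# so the inequality holds for every n >= 6.
crossover = min(n for n in range(1, 100) if 3 * n**2 > 14 * n + 17)
print(crossover)  # -> 6
```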
Upper Bounds
Creating an algorithm proves we can solve the
problem within a given bound.

But another algorithm might be faster.

E.g., sorting an array.


Insertion sort ⇒ O(n²)
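For concreteness, here is a minimal insertion sort (a sketch, not the lecture's own code); its worst case performs about n²/2 comparisons and shifts, which gives the O(n²) upper bound:

```python
def insertion_sort(a):
    """Sort list `a` in place and return it."""
    for i in range(1, len(a)):        # invariant: a[:i] is already sorted
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop key into its sorted position
    return a

print(insertion_sort([3, 9, 1, 7]))   # -> [1, 3, 7, 9]
```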

Lower Bounds
Sometimes we can prove that something cannot be computed without a sufficient amount of time.

That doesn't necessarily mean we know how to compute it within this lower bound.

E.g., sorting an array.


# comparisons needed in worst case = Ω(n log n)
Will prove this soon…

Upper & Lower Bounds: Summary
Upper bounds:
≤ O(),  < o()

Lower bounds:
≥ Ω(),  > ω()

Upper & lower (“tight”) bounds:
= Θ()

O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:

O(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
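The definition can be checked numerically for the earlier example f(n) = 3n² + 14n + 17 (a finite sanity check over a test range, not a proof; the witnesses c = 4 and n0 = 16 are my choice, not from the slides):

```python
# Witnesses for f(n) = 3n^2 + 14n + 17 ∈ O(n^2): c = 4, n0 = 16.
f = lambda n: 3 * n**2 + 14 * n + 17
g = lambda n: n**2
c, n0 = 4, 16

# 0 <= f(n) <= c*g(n) for every n >= n0 in the test range.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))

# n0 = 16 is tight for this c: at n = 15, f(15) = 902 > 4 * 225 = 900.
assert f(15) > c * g(15)
```

Any larger c (or larger n0) would also serve as a witness; the definition only asks that *some* pair exists.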

Ω-notation

For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:

Ω(g(n)) = { f(n) : ∃ positive constants c and n0 such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }

Θ-notation

For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }

Relationship Between Θ, O, Ω

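The standard relationship is: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). This can be probed numerically for the running example (a finite sanity check, not a proof; the constants are my choice):

```python
# f(n) = Θ(n^2): constants c1 = 3, c2 = 4 sandwich f between c1*n^2
# and c2*n^2 from n0 = 16 on — i.e. f is both Ω(n^2) and O(n^2).
f = lambda n: 3 * n**2 + 14 * n + 17
c1, c2, n0 = 3, 4, 16

# Lower sandwich (the Ω part) and upper sandwich (the O part) together.
assert all(0 <= c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```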
o-notation

For a given function g(n), we define the set little-o:

o(g(n)) = { f(n) : ∀ constants c > 0, ∃ a constant n0 > 0 such that ∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n) }

Intuitively: the set of all functions whose rate of growth is lower than that of g(n).

g(n) is an upper bound for f(n) that is not asymptotically tight.
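The “for every c” quantifier is what distinguishes little-o from big-O, and it can be probed numerically (an illustrative sketch): for 14n + 17 ∈ o(n²), no matter how small c is, some threshold n0 makes f(n) < c·g(n) from then on.

```python
f = lambda n: 14 * n + 17
g = lambda n: n**2

for c in (1.0, 0.1, 0.01):
    # Smallest n at which f(n) < c*g(n); the inequality then holds for
    # all larger n too, since c*n^2 - 14n - 17 keeps growing past its
    # positive root.
    n0 = next(n for n in range(1, 10**6) if f(n) < c * g(n))
    assert all(f(n) < c * g(n) for n in range(n0, n0 + 1000))
    print(c, n0)
```

Smaller c simply forces a larger threshold n0; it never breaks the membership.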

ω-notation

For a given function g(n), we define the set little-omega:

ω(g(n)) = { f(n) : ∀ constants c > 0, ∃ a constant n0 > 0 such that ∀ n ≥ n0, we have 0 ≤ c·g(n) < f(n) }
Intuitively: the set of all functions whose rate of growth is higher than that of g(n).

g(n) is a lower bound for f(n) that is not asymptotically tight.
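The mirror-image check (again a sketch, not a proof): n² ∈ ω(n) because c·n < n² from some threshold on, for every constant c.

```python
f = lambda n: n**2
g = lambda n: n

for c in (1, 100, 10_000):
    n0 = next(n for n in range(1, 10**6) if c * g(n) < f(n))
    # c*n < n^2 exactly when n > c, so the threshold is n0 = c + 1.
    assert n0 == c + 1
    assert all(c * g(n) < f(n) for n in range(n0, n0 + 1000))
```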
