Mathematical Foundation: CSE 6331 Algorithms, Steve Lai
• The running time of an algorithm is the number of elementary operations it performs.
• It depends on both the size of the input and the input data themselves.
• The worst-case time complexity of an algorithm is a function of the input size n:
  T(n) = the worst-case running time over all instances of size n.
• The worst-case asymptotic time complexity is the worst-case time complexity expressed in O, Ω, or Θ.
• The word "asymptotic" is often omitted.
O-Notation
Note: Unless otherwise stated, all functions considered in this class are assumed to be nonnegative.
• Conventional Definition: We say f(n) = O(g(n)), or f(n) is O(g(n)), if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
• More Abstract Definition:
  O(g(n)) = { f(n) : f(n) = O(g(n)) in the conventional meaning },
  i.e., the set of all functions that are O(g(n)) in the conventional meaning.
• These expressions all mean the same thing:
  4n^2 + 3n = O(n^2)
  4n^2 + 3n is O(n^2)
  4n^2 + 3n ∈ O(n^2)
  4n^2 + 3n is in O(n^2).
• Sometimes O(g(n)) is used to mean some function in the set O(g(n)) which we don't care to specify.
• Example: we may write: Let f(n) = 3n^2 + O(log n).
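A quick numerical sanity check of the conventional definition; the witness constants c = 5 and n0 = 3 below are our own illustrative choice, not from the slides:

    # Check that 4n^2 + 3n <= c * n^2 for all sampled n >= n0.
    # c = 5, n0 = 3 is one valid witness pair, since 3n <= n^2 once n >= 3.
    c, n0 = 5, 3
    assert all(4*n*n + 3*n <= c*n*n for n in range(n0, 100_000))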
Theorem 1
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then
1. f1(n) + f2(n) = O(g1(n) + g2(n))
2. f1(n) + f2(n) = O(max(g1(n), g2(n)))
3. f1(n) · f2(n) = O(g1(n) · g2(n))
(Rules 1 and 2 are interchangeable, since max(g1(n), g2(n)) ≤ g1(n) + g2(n) ≤ 2·max(g1(n), g2(n)).)
• Theorem 2: If f1(n) = Ω(g1(n)) and f2(n) = Ω(g2(n)), then
1. f1(n) + f2(n) = Ω(g1(n) + g2(n))
2. f1(n) + f2(n) = Ω(max(g1(n), g2(n)))
3. f1(n) · f2(n) = Ω(g1(n) · g2(n))
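As a sanity check of Theorem 1, rule 2, here is a small numeric test; the example functions f1(n) = 3n^2 and f2(n) = 50n and the witnesses c = 4, n0 = 50 are our own choices:

    # f1(n) = 3n^2 is O(n^2) and f2(n) = 50n is O(n),
    # so f1(n) + f2(n) should be O(max(n^2, n)) = O(n^2).
    c, n0 = 4, 50
    assert all(3*n*n + 50*n <= c * max(n*n, n) for n in range(n0, 100_000))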
Θ-Notation
• Conventional Definition: We say f(n) = Θ(g(n)) if there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. That is,
  f(n) = Θ(g(n)) ≡ [f(n) = O(g(n)) and f(n) = Ω(g(n))].
• In terms of sets: Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
• Theorem 3: If f1(n) = Θ(g1(n)) and f2(n) = Θ(g2(n)), then
1. f1(n) + f2(n) = Θ(g1(n) + g2(n))
2. f1(n) + f2(n) = Θ(max(g1(n), g2(n)))
3. f1(n) · f2(n) = Θ(g1(n) · g2(n))
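A numeric illustration of Theorem 3, rule 3, with example functions of our own choosing:

    # f1(n) = 3n^2 + n is Theta(n^2) and f2(n) = 2n is Theta(n), so their
    # product should be Theta(n^2 * n) = Theta(n^3): the ratio below stays
    # within constant bounds (it tends to 6).
    f1 = lambda n: 3*n*n + n
    f2 = lambda n: 2*n
    for n in (10, 1_000, 100_000):
        print(n, f1(n) * f2(n) / n**3)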
o-Notation, ω-Notation
• Definition of o:
  f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0.
• Definition of ω:
  f(n) = ω(g(n)) iff lim_{n→∞} f(n)/g(n) = ∞.
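A numeric illustration of the o definition; the example pair f(n) = n, g(n) = n^2 is our own:

    # n = o(n^2), since n / n^2 = 1/n -> 0 as n -> infinity.
    for n in (10, 1_000, 100_000):
        print(n, n / n**2)   # ratio shrinks toward 0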
Some Properties of Asymptotic Notation
• Transitive property:
  If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
  (If f(n) ≤ c1·g(n) and g(n) ≤ c2·h(n) for all large n, then f(n) ≤ c1c2·h(n).)
  The same transitivity holds for Ω, Θ, o, and ω.
Asymptotic Notation with Multiple Parameters
• Definition: We say that f(m, n) = O(g(m, n)) iff there exist positive constants c and n0 such that f(m, n) ≤ c·g(m, n) for all m, n ≥ n0.
Conditional Asymptotic Notation
• Let P(n) be a predicate. We write T(n) = O(f(n) | P(n)) iff there exist positive constants c, n0 such that T(n) ≤ c·f(n) for all n ≥ n0 for which P(n) is true.
• Example:
  T(n) = 4n^2 + 2n  if n is even
  T(n) = 3n         if n is odd
  Here T(n) = O(n^2 | n is even) and T(n) = O(n | n is odd).
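A quick check of the conditional bound T(n) = O(n^2 | n is even); the witnesses c = 5, n0 = 2 are our own:

    # T(n) <= c * n^2 must hold only at the n where the predicate is true.
    def T(n):
        return 4*n*n + 2*n if n % 2 == 0 else 3*n
    c, n0 = 5, 2
    assert all(T(n) <= c*n*n for n in range(n0, 100_000) if n % 2 == 0)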
Smooth Functions
• A function f(n) is smooth iff f(n) is asymptotically nondecreasing and f(2n) = O(f(n)).
• Example: n^2 is smooth, since (2n)^2 = 4n^2 = O(n^2).
• What about 2^n? It is not smooth: 2^(2n) = 4^n, and 4^n ≠ O(2^n) since 4^n / 2^n = 2^n → ∞.
• Theorem 4. If f(n) is smooth, then f(bn) = O(f(n)) for any fixed positive integer b.
Proof. By induction on b.
Induction base: For b = 1, 2, obviously f(bn) = O(f(n)).
Induction hypothesis: Assume f((b−1)n) = O(f(n)), where b > 2.
Induction step: Need to show f(bn) = O(f(n)). Since bn ≤ 2(b−1)n for b ≥ 2 and f is asymptotically nondecreasing, we have:
  f(bn) ≤ f(2(b−1)n) = O(f((b−1)n)) ⊆ O(f(n))
(i.e., f(bn) ≤ f(2(b−1)n) ≤ c1·f((b−1)n) ≤ c1c2·f(n) for some constants c1, c2 and sufficiently large n).
The theorem is proved.
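An empirical look at Theorem 4, using the smooth function f(n) = n·log2(n) and b = 3 (both our own choices):

    # f(3n) / f(n) should stay bounded by a constant; here it decreases
    # toward 3 as n grows, consistent with f(3n) = O(f(n)).
    from math import log2
    f = lambda n: n * log2(n)
    for n in (10, 1_000, 100_000):
        print(n, f(3*n) / f(n))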
• Theorem 5. If T(n) = O(f(n) | n a power of b), where b ≥ 2 is a constant, T(n) is asymptotically nondecreasing, and f(n) is smooth, then T(n) = O(f(n)).
Proof. From the given conditions, we know:
1. T(n) is asymptotically nondecreasing.
2. T(n) ≤ c1·f(n) for n sufficiently large and a power of b.
3. f(bn) ≤ c2·f(n) for sufficiently large n.
For any n, there is a k such that b^k ≤ n < b^(k+1). When n is sufficiently large, we have
  T(n) ≤ T(b^(k+1)) ≤ c1·f(b^(k+1)) ≤ c1c2·f(b^k) ≤ c1c2·f(n).
By definition, T(n) = O(f(n)).
Some Notations, Functions, Formulas
• ⌊x⌋ = the floor of x.
• ⌈x⌉ = the ceiling of x.
• log n = log_2 n (also written lg n).
• 1 + 2 + · · · + n = n(n + 1)/2 = Θ(n^2).
• For constant k > 0, 1^k + 2^k + 3^k + · · · + n^k = Θ(n^(k+1)).
• If a ≠ 1, then 1 + a + a^2 + · · · + a^n = (1 − a^(n+1))/(1 − a) = (a^(n+1) − 1)/(a − 1).
• If a > 1, then f(n) = 1 + a + a^2 + · · · + a^n = Θ(a^n).
• If a < 1, then f(n) = 1 + a + a^2 + · · · + a^n = Θ(1).
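The closed forms above are easy to spot-check numerically; the values a = 3 and n = 100 are arbitrary:

    # Exact integer checks of the arithmetic and geometric sum formulas.
    n = 100
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
    a = 3
    assert sum(a**k for k in range(n + 1)) == (a**(n + 1) - 1) // (a - 1)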
Analysis of Algorithm: Example
Procedure BinarySearch(A, x, i, j)
  if i > j then return(0)
  m ← ⌊(i + j)/2⌋
  case
    A[m] = x: return(m)
    A[m] < x: return(BinarySearch(A, x, m + 1, j))
    A[m] > x: return(BinarySearch(A, x, i, m − 1))
  end
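A runnable Python transcription of the pseudocode, keeping the slide's 1-indexed convention (returning 0 signals "not found"):

    def binary_search(A, x, i, j):
        # Search the sorted subarray A[i..j] (1-indexed) for x;
        # return its index, or 0 if x is not present.
        if i > j:
            return 0
        m = (i + j) // 2                  # floor((i + j) / 2)
        if A[m] == x:
            return m
        elif A[m] < x:
            return binary_search(A, x, m + 1, j)
        else:
            return binary_search(A, x, i, m - 1)

    A = [None, 2, 3, 5, 7, 11, 13]        # dummy at index 0 keeps A 1-indexed
    print(binary_search(A, 7, 1, 6))      # -> 4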
Analysis of Binary Search
Let T(n) denote the worst-case running time. T(n) satisfies the recurrence:
  T(n) = T(⌊n/2⌋) + c.
Solving the recurrence (each call discards half of the range, so roughly log n calls are needed to empty it) yields:
  T(n) = Θ(log n).
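The Θ(log n) bound can also be seen empirically; the simulation below (our own illustration) counts the probes made on a worst-case input, where the search keeps narrowing to a half and never finds x:

    from math import log2

    def worst_case_probes(n):
        # Probes made when every comparison sends the search into a half.
        i, j, count = 1, n, 0
        while i <= j:
            count += 1
            i = (i + j) // 2 + 1   # recurse into the upper half
        return count

    for n in (15, 1_023, 1_000_000):
        print(n, worst_case_probes(n), round(log2(n), 1))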
Euclid's Algorithm
• Find gcd(a, b) for integers a, b ≥ 0, not both zero.
• Theorem:
  If b = 0, gcd(a, b) = a.
  If b > 0, gcd(a, b) = gcd(b, a mod b).
• function Euclid(a, b)
    if b = 0
      then return(a)
      else return(Euclid(b, a mod b))
• Let (a_k, b_k) denote the arguments of the k-th recursive call and c_k = a_k mod b_k. Observe that a_k = b_(k−1) = c_(k−2).
• W.l.o.g., assume a0 ≥ b0. The values a0, a2, a4, … decrease by at least one half from each term to the next, since a_(k+2) = c_k < a_k / 2.
• Reason: if c = a mod b with b ≤ a, then c < a/2. (If b ≤ a/2, then c < b ≤ a/2; if b > a/2, then c = a − b < a/2.)
• So, there are at most O(log a0) recursive calls.
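A runnable version of Euclid's algorithm, instrumented (our own addition) to count the recursive calls, which the argument above bounds by O(log a0):

    def euclid(a, b, calls=0):
        # Returns (gcd(a, b), number of recursive calls made).
        if b == 0:
            return a, calls
        return euclid(b, a % b, calls + 1)

    print(euclid(1_234_567, 89_012))   # gcd and the call count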
Solution to Q4 of example analysis

∑ i^2  over  i = 2^0, 2^1, 2^2, …, 2^(log n^2)
  = ∑_(k=0)^(log n^2) (2^k)^2
  = ∑_(k=0)^(log n^2) 4^k
  = Θ(4^(log n^2))          (geometric series with ratio 4 > 1)
  = Θ((2^(log n^2))^2)
  = Θ((n^2)^2)
  = Θ(n^4)
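A numeric check (our own) that the sum really is Θ(n^4): the ratio of the sum to n^4 stays within constant bounds as n grows.

    from math import floor, log2

    def S(n):
        # Sum of i^2 over i = 2^0, 2^1, ..., 2^floor(log2(n^2)).
        return sum(4**k for k in range(floor(log2(n**2)) + 1))

    for n in (10, 100, 1_000):
        print(n, S(n) / n**4)   # stays between constants, roughly 1/3 and 4/3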