Asymptotic Notation
Asymptotic Growth of Functions
Greek: asymptotos, "not touching" (a- "not", symptotos "intersecting").
Mathematics: curves that do not intersect, but come infinitely close.
Why asymptotic notation?
The main idea of asymptotic analysis is to measure the efficiency of algorithms in a way that does not depend on machine-specific constants and does not require the algorithms to be implemented and the times taken by the programs to be compared.
Asymptotic analysis evaluates the performance of an algorithm in terms of just the input size N, where N is very large.
It describes the limiting behavior of an algorithm, and is therefore central to measuring the performance of your code.
We calculate how the time (or space) taken by an algorithm increases with the input size.
What about space?
Asymptotic notation
Asymptotic notations are methods used to estimate and represent the efficiency of an algorithm with a simple formula.
This is useful for separating algorithms that lead to different amounts of work for large inputs.
The notations we use to describe the asymptotic running time
of an algorithm are defined in terms of functions whose
domains are the set of natural numbers N = {0, 1, 2, …}
Such notations are convenient for describing the worst-case
running-time function T(n), which is usually defined only on
integer input sizes.
Asymptotic notations give a bound on the growth of a function; this bound can be an upper or a lower bound.
A bound is asymptotically tight if it matches the function's growth to within a constant factor.
Asymptotic notation
Asymptotic notations are mathematical notations used to describe the running time of an algorithm as the input tends towards a particular (limiting) value.
We use three asymptotic notations to represent the growth of an algorithm as the input increases: Big Theta (Θ), Big Oh (O), and Big Omega (Ω).
Big-O notation represents an upper bound on the running time of an algorithm; thus, it gives the worst-case complexity of an algorithm.
Omega notation represents a lower bound on the running time of an algorithm; thus, it gives the best-case complexity of an algorithm.
Theta notation encloses the function from above and below. Since it represents both an upper and a lower bound on the running time, it is often used when analyzing the average-case complexity of an algorithm.
Properties of Asymptotic Notations
1. General Property:
If f(n) is O(g(n)), then a·f(n) is also O(g(n)), where a is a constant.
Example: f(n) = 2n²+5 is O(n²); then 7·f(n) = 7(2n²+5) = 14n²+35 is also O(n²).
The same property holds for Θ and Ω:
If f(n) is Θ(g(n)), then a·f(n) is also Θ(g(n)), where a is a constant.
If f(n) is Ω(g(n)), then a·f(n) is also Ω(g(n)), where a is a constant.
2. Transitive Property:
If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) = O(h(n)).
Example: if f(n) = n, g(n) = n² and h(n) = n³,
then n is O(n²) and n² is O(n³), so n is O(n³).
The same property holds for Θ and Ω:
If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) = Θ(h(n)).
If f(n) is Ω(g(n)) and g(n) is Ω(h(n)), then f(n) = Ω(h(n)).
3. Reflexive Property:
The reflexive property is easy to understand after transitivity.
For any f(n), f(n) is O(f(n)), since the maximum value of f(n) is f(n) itself; f(n) and O(f(n)) are therefore always in a reflexive relation.
Example: f(n) = n² is O(n²), i.e. O(f(n)).
The same property holds for Θ and Ω:
For any f(n), f(n) is Θ(f(n)).
For any f(n), f(n) is Ω(f(n)).
4. Symmetric Property:
If f(n) is Θ(g(n)), then g(n) is Θ(f(n)). (This symmetry holds only for Θ.)
5. Some More Properties:
1) If f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n)).
2) If f(n) = O(g(n)) and d(n) = O(e(n)),
then f(n) + d(n) = O(max(g(n), e(n))).
Example: f(n) = n, i.e. O(n); d(n) = n², i.e. O(n²);
then f(n) + d(n) = n + n², i.e. O(n²).
Order of growth
Lower-order terms are ignored; keep only the highest-order term.
Practical Complexity
Analysing the running time of algorithms
Suppose that we have two algorithms A1 and A2, both with running times in the order of n log n. Which one should we prefer?
Technically, since they have the same time complexity, we say that
they have the same running time within a multiplicative constant, that
is, the ratio between the two running times is constant.
In some cases, the constant may be important and more detailed
analysis of the algorithm or conducting some experiments on the
behavior of the algorithm may be helpful.
It may be necessary to investigate other factors, e.g. space
requirements and input distribution.
The latter is helpful in analyzing the behavior of an algorithm on the
average.
Growth of some typical functions that represent running times.
Observation
Exponential and hyper-exponential functions grow much faster than the ones shown in the figure (previous slide); even for moderate values of n, higher-order, exponential and hyper-exponential functions are too large to be shown.
Functions of the form log^k n, cn, cn², cn³ are called, respectively, logarithmic, linear, quadratic and cubic.
Functions of the form n^c or n^c log^k n, with 0 < c < 1, are called sublinear.
Functions that lie between linear and quadratic, e.g. n log n and n^1.5, are called subquadratic.
Running times for different sizes of input.
Note: "nsec" stands for nanoseconds, "μsec" for microseconds, and "cent" stands for centuries.
Note the explosive running time (measured in centuries) when it is of the order 2ⁿ.
Functions in order of growth rate
Recall the Hierarchy of Growth Rates
Easy → Hard:
c, log N, N, N log N, N², N³, Nᵏ, 2ᴺ, 3ᴺ, N!, Nᴺ
Elementary operation
Definition: we call an "elementary operation" any computational step whose cost is always upper-bounded by a constant amount of time, regardless of the input data or the algorithm used.
The following operations on fixed-size operands are examples of elementary operations:
Arithmetic operations: addition, subtraction, multiplication and
division.
Comparisons and logical operations.
Assignments, including assignments of pointers when, say, traversing
a list or a tree.
In order to formalize the notions of order of growth and time
complexity, special mathematical notations have been widely used.
These notations make it convenient to compare and analyze running
times with minimal use of mathematics and cumbersome
calculations.
An elementary operation in an algorithm is called a basic
operation if it is of highest frequency to within a constant factor
among all other elementary operations.
When analyzing searching and sorting algorithms, we may
choose the element comparison operation if it is an
elementary operation.
In matrix multiplication algorithms, we select the operation of
scalar multiplication.
In traversing a linked list, we may select the “operation" of
setting or updating a pointer.
In graph traversals, we may choose the “action" of visiting a
node, and count the number of nodes visited.
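The idea of counting a basic operation can be sketched in C. The helper below (our own illustration, not from the slides; the comparison-counter interface is a hypothetical addition) performs a linear search while counting element comparisons, the basic operation for searching:

```c
#include <stddef.h>

/* Linear search that counts element comparisons, the "basic
   operation" for searching algorithms.  Illustrative sketch only;
   the comparison-counter interface is our own addition. */
int linear_search(const int *a, size_t n, int key, size_t *comparisons)
{
    *comparisons = 0;
    for (size_t i = 0; i < n; i++) {
        (*comparisons)++;              /* one element comparison */
        if (a[i] == key)
            return (int)i;             /* found at index i */
    }
    return -1;                         /* not found: n comparisons made */
}
```

In the worst case (key absent) the counter reaches n, matching the linear-time analysis of this search.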
BIG-OH COMPLEXITY
Big O notation
Big O notation (or Big Oh notation, also called Landau notation or asymptotic notation) is a mathematical notation used to describe the asymptotic behavior of functions.
Its purpose is to characterize a function's behavior for very
large (or very small) inputs in a simple but rigorous way
that enables comparison to other functions.
More precisely, the symbol O is used to describe an
asymptotic upper bound for the magnitude of a function in
terms of another, usually simpler, function.
There are also other symbols o, Ω, ω, and Θ for various
other upper, lower, and tight bounds.
Two main areas of application: in mathematics, it is usually
used to characterize the residual term of a truncated
infinite series, especially an asymptotic series, and in
computer science, it is useful in the analysis of the
complexity of algorithms.
Informally, the O notation is commonly employed to
describe an asymptotic tight bound, but tight bounds are
more formally and precisely denoted by the Θ (capital
theta) symbol as described below.
BIG Oh notation
Let f and g be functions from positive integers to positive integers. We say f is O(g(n)) (read: "f is order g") if g is an upper bound on f: there exist a fixed constant c and a fixed n₀ such that for all n ≥ n₀,
f(n) ≤ c·g(n).
Equivalently, f is O(g(n)) if the function f(n)/g(n) is bounded above by some constant.
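The definition can be spot-checked numerically. The sketch below (helper names are our own) tests whether f(n) ≤ c·g(n) holds over a finite range [n0, limit]; a finite check like this builds intuition but is of course not a proof:

```c
#include <stdbool.h>

/* Finite spot-check of the Big-O condition f(n) <= c*g(n)
   for all n in [n0, limit].  Intuition only, not a proof. */
bool big_o_holds_upto(long (*f)(long), long (*g)(long),
                      long c, long n0, long limit)
{
    for (long n = n0; n <= limit; n++)
        if (f(n) > c * g(n))
            return false;
    return true;
}

long f_example(long n) { return 3 * n + 2; }  /* f(n) = 3n + 2 */
long g_linear(long n)  { return n; }          /* g(n) = n      */
```

With c = 4 and n0 = 2 the condition holds, witnessing 3n+2 = O(n); with c = 3 it fails for every n, since 3n+2 > 3n.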
Big oh notation (O)
O(g(n)) = {f(n): there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}
An expression like O(n²) really denotes a set of functions: all the functions that are O(n²).
When we say that f(n) is O(n²), we mean that f(n) is a member of this set.
It is also common to write f(n) = O(g(n)), although this is not really an equality.
Examples (here '=' is read as 'is', not 'equals'):
3n+2 = O(n), as 3n+2 ≤ 4n for all n ≥ 2.
3n+3 = O(n), as 3n+3 ≤ 4n for all n ≥ 3.
100n+6 = O(n), as 100n+6 ≤ 101n for all n ≥ 6.
10n²+4n+2 = O(n²), as 10n²+4n+2 ≤ 11n² for all n ≥ 5.
1000n²+100n-6 = O(n²), as 1000n²+100n-6 ≤ 1001n² for all n ≥ 100.
6·2ⁿ+n² = O(2ⁿ), as 6·2ⁿ+n² ≤ 7·2ⁿ for all n ≥ 4.
3n+3 = O(n²), as 3n+3 ≤ 3n² for all n ≥ 2.
10n²+4n+2 = O(n⁴), as 10n²+4n+2 ≤ 10n⁴ for all n ≥ 2.
3n+2 ≠ O(1), as 3n+2 is not ≤ c for any constant c and all n ≥ n₀.
10n²+4n+2 ≠ O(n).
Consider job offers from two companies. The first company offers a contract that doubles your salary every year. The second company offers a contract that gives you a raise of Rs. 1000 per year.
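The two offers can be sketched in C (function names and starting salaries are made up for illustration): the doubling offer grows exponentially in the number of years, the flat raise grows linearly, so the doubling offer eventually dominates no matter how the constants are chosen.

```c
/* Salary after `years` years under each offer.  Figures are
   illustrative only. */
long doubling_salary(long start, int years)
{
    long s = start;
    for (int i = 0; i < years; i++)
        s *= 2;                        /* doubles every year */
    return s;
}

long raise_salary(long start, int years)
{
    return start + 1000L * years;      /* Rs. 1000 raise per year */
}
```

Even starting from Rs. 1, the doubling offer overtakes a Rs. 100000 base with flat raises within 20 years, since 2²⁰ = 1048576 > 100000 + 20·1000.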
f(n) is a polynomial in n
Proving asymptotic bounds
Lower Bounds: Omega (Ω)
Big Omega notation is used to define a lower bound for an algorithm, i.e. its best case.
It indicates the minimum time required by an algorithm across all input values, and therefore describes its best case.
In simple words, when we express the time complexity of an algorithm in big-Ω form, we mean that the algorithm will take at least this much time to complete its execution; it can certainly take more time than this.
The notation Ω(n) is the formal way to express a lower bound on an algorithm's running time: it measures the best-case time complexity, i.e. the minimum amount of time the algorithm can take to complete.
BIG OMEGA notation (Ω)
We say that f is Ω(g(n)) (read: "f is omega of g") if g is a lower bound on f for large n. Formally, f is Ω(g) if there exist a fixed constant c and a fixed n₀ such that for all n ≥ n₀,
c·g(n) ≤ f(n).
For example, any polynomial whose highest-order term is nᵏ is Ω(nᵏ).
If f(n) is Ω(g(n)), then g(n) is O(f(n)). If f(n) is o(g(n)), then f(n) is not Ω(g(n)).
This notation describes the best-case running time of an algorithm and is concerned with very large values of N.
Definition: Big omega (Ω)
Ω(g(n)) = {f(n): there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀}
Big omega (Ω)
3n+2 = Ω(n), as 3n+2 ≥ 3n for all n ≥ 1.
3n+3 = Ω(n), as 3n+3 ≥ 3n for all n ≥ 1.
100n+6 = Ω(n), as 100n+6 ≥ 100n for all n ≥ 1.
10n²+4n+2 = Ω(n²), as 10n²+4n+2 ≥ n² for all n ≥ 1.
6·2ⁿ+n² = Ω(2ⁿ), as 6·2ⁿ+n² ≥ 2ⁿ for all n ≥ 1.
Observation:
3n+3 = Ω(1).
10n²+4n+2 = Ω(n).
10n²+4n+2 = Ω(1).
BIG THETA Notation
This notation is used to describe the average-case running time of an algorithm and is concerned with very large values of n.
The notation Θ(n) is the formal way to express both a lower bound and an upper bound on an algorithm's running time.
Big theta notation (Θ)
We say that f is Θ(g(n)) (read: "f is theta of g") if g is an accurate characterization of f for large n: it can be scaled so that it is both an upper and a lower bound on f. That is, f is both O(g(n)) and Ω(g(n)).
Expanding the definitions of Ω and O: f is Θ(g(n)) if there are fixed constants c₁ and c₂ and a fixed n₀ such that for all n ≥ n₀,
c₁·g(n) ≤ f(n) ≤ c₂·g(n).
For example, any polynomial whose highest-order term is nᵏ is Θ(nᵏ). If f is Θ(g), then it is O(g) but not o(g).
Sometimes O(g(n)) is used a bit informally to mean the stronger property Θ(g(n)); however, the two are different.
Definition: Big theta (Θ)
Θ(g(n)) = {f(n): there exist positive constants c₁, c₂ and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀}
Big theta (Θ)
3n+2 = Θ(n), as 3n+2 ≥ 3n for all n ≥ 2 and 3n+2 ≤ 4n for all n ≥ 2; so c₁ = 3, c₂ = 4 and n₀ = 2.
3n+3 = Θ(n).
10n²+4n+2 = Θ(n²).
6·2ⁿ+n² = Θ(2ⁿ).
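The first example can be spot-checked with a small helper (our own, hypothetical): it verifies the sandwich 3n ≤ 3n+2 ≤ 4n over a finite range starting at n₀, matching the constants c₁ = 3, c₂ = 4 above.

```c
#include <stdbool.h>

/* Finite spot-check of c1*g(n) <= f(n) <= c2*g(n) for
   f(n) = 3n+2, g(n) = n, c1 = 3, c2 = 4.  Intuition, not a proof. */
bool theta_sandwich_holds(long n0, long limit)
{
    for (long n = n0; n <= limit; n++) {
        long f = 3 * n + 2;
        if (!(3 * n <= f && f <= 4 * n))
            return false;
    }
    return true;
}
```

With n₀ = 2 the sandwich holds; starting at n = 1 it fails, since 3·1+2 = 5 > 4·1.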
Calculating time complexity
A simple program fragment:

int sum(int n)
{
    int i, partialsum;
    partialsum = 0;
    for (i = 1; i <= n; i++)
        partialsum += i * i * i;
    return partialsum;
}
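The loop body executes once for each i from 1 to n, so the fragment runs in time linear in n. As a self-contained check, the fragment is repeated below next to the known closed form (n(n+1)/2)² for 1³ + 2³ + … + n³ (the helper name is our own):

```c
/* sum() as in the fragment above: adds i^3 for i = 1..n,
   doing work proportional to n. */
int sum(int n)
{
    int i, partialsum = 0;
    for (i = 1; i <= n; i++)
        partialsum += i * i * i;
    return partialsum;
}

/* Closed form: 1^3 + ... + n^3 = (n(n+1)/2)^2, constant time. */
int sum_closed_form(int n)
{
    int t = n * (n + 1) / 2;
    return t * t;
}
```

The same value can thus be computed in constant time; the asymptotic cost belongs to the algorithm, not to the problem.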
Little o asymptotic notation
Big-O is used as a tight upper bound on the growth of an algorithm's effort (this effort is described by the function f(n)), even though, as written, it can also be a loose upper bound.
"Little-o" (o()) notation is used to describe an upper bound that cannot be tight.
Definition: let f(n) and g(n) be functions that map positive integers to positive real numbers. We say that f(n) is o(g(n)) (or f(n) ∈ o(g(n))) if for any real constant c > 0, there exists an integer constant n₀ ≥ 1 such that 0 ≤ f(n) < c·g(n) for every integer n ≥ n₀.
Thus, little o() gives a loose upper bound on f(n): it is a rough overestimate of the order of growth, whereas Big-O may be the actual order of growth.
In mathematical terms, f(n) = o(g(n)) means
lim_{n→∞} f(n)/g(n) = 0.
Is 7n+8 = o(n²)?
For this to be true, for any c we must be able to find an n₀ that makes f(n) < c·g(n) hold asymptotically.
Let us try some values. If c = 100, the inequality is clearly true. If c = 1/100, we need a little more imagination, but we can still find an n₀ (try n₀ = 1000). From these examples, the conjecture appears to be correct.
Then check the limit:
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (7n+8)/n² = lim_{n→∞} 7/(2n) = 0 (by l'Hôpital's rule).
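The limit can also be observed numerically: the ratio (7n+8)/n² shrinks toward 0 as n grows. A small sketch (the helper name is our own):

```c
/* Ratio f(n)/g(n) for f(n) = 7n+8 and g(n) = n^2.  For the
   little-o claim this ratio must tend to 0 as n grows. */
double little_o_ratio(long n)
{
    return (7.0 * (double)n + 8.0) / ((double)n * (double)n);
}
```

For n = 10, 100, 1000 the ratio is roughly 0.78, 0.071, 0.007, steadily approaching 0 as the limit predicts.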
Little ω asymptotic notation
Definition: let f(n) and g(n) be functions that map positive integers to positive real numbers. We say that f(n) is ω(g(n)) (or f(n) ∈ ω(g(n))) if for any real constant c > 0, there exists an integer constant n₀ ≥ 1 such that f(n) > c·g(n) ≥ 0 for every integer n ≥ n₀.
Here f(n) has a strictly higher growth rate than g(n). The main difference between Big Omega (Ω) and little omega (ω) lies in their definitions: for Big Omega, f(n) = Ω(g(n)) with the bound 0 ≤ c·g(n) ≤ f(n) for some constant c, while for little omega the strict bound 0 ≤ c·g(n) < f(n) must hold for every constant c.
The relationship between Big Omega (Ω) and little omega (ω) is similar to that of Big-O and little-o, except that now we are looking at lower bounds.
Little omega (ω) is a rough underestimate of the order of growth, whereas Big Omega (Ω) may represent the exact order of growth. We use ω notation to denote a lower bound that is not asymptotically tight. Also, f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).
Prove that 4n+6 = ω(1).
The little-omega (ω) claim can be proven by applying the limit formula below:
if lim_{n→∞} f(n)/g(n) = ∞, then f(n) is ω(g(n)).
Here we have f(n) = 4n+6 and g(n) = 1:
lim_{n→∞} (4n+6)/1 = ∞.
Also, for any c we can find an n₀ satisfying the inequality 0 ≤ c·g(n) < f(n), i.e. 0 ≤ c·1 < 4n+6.
Hence proved.
Performance Measurement
Performance measurement is generally defined as the regular measurement of outcomes and results, which generates reliable data on the effectiveness and efficiency of algorithms/programs.
It is concerned with obtaining the actual space and time requirements of a particular algorithm.
These quantities depend on the compiler and options used, as well as on the computer on which the algorithm is run.
Input Size and Problem Instance
The size of the instance of a problem is the size of the
representation of the input.
The input size, as a quantity, is not a precise measure of the input, and its interpretation depends on the problem for which the algorithm is, or is to be, designed.
Some of the commonly used measures of input size are the following:
In sorting and searching problems, we use the number of
entries in the array or list as the input size.
In graph algorithms, the input size usually refers to the number of
vertices or edges in the graph, or both.
Complexity of some graph algorithms
A graph (sometimes called undirected graph for distinguishing from
a directed graph, or simple graph for distinguishing from a multigraph) is
a pair G = (V, E), where V is a set whose elements are
called vertices (singular: vertex), and E is a set of paired vertices, whose
elements are called edges (sometimes links or lines).
More measures of input size:
In computational geometry, the size of input to an algorithm is
usually expressed in terms of the number of points, vertices, edges,
line segments, polygons, etc.
In matrix operations, the input size is commonly taken to be the
dimensions of the input matrices.
In number theory algorithms and cryptography, the number
of bits in the input is usually chosen to denote its length. The number
of words used to represent a single number may also be chosen as
well, as each word consists of a fixed number of bits.
Comparison of two algorithms for computing the sum
Example: Time complexity
Planning the experiment: estimating the running time of an algorithm
1. What is the accuracy of the clock? How accurate do our results have to
be? Once the desired accuracy is known, we can determine the length of
the shortest event that should be timed.
2. For each instance size, a repetition factor needs to be determined. This is
to be chosen such that the event time is at least the minimum time that
can be clocked with the desired accuracy.
3. Are we measuring worst-case or average performance? Suitable test data
need to be generated.
4. What is the purpose of the experiment? Are the times being obtained for
comparative purposes, or are they to be used to predict run times?
5. If the times are to be used to predict run times, then we need to fit a curve through the points. For this, the asymptotic complexity should be known. If the asymptotic complexity is linear, a least-squares straight line can be fitted; if it is quadratic, a parabola can be used.
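Steps 1 and 2 above can be sketched with the C standard `clock()` function: repeat the event enough times that the measured interval is well above the clock's resolution, then divide by the repetition factor. The helper names are our own:

```c
#include <time.h>

/* Average seconds per call of run(), measured by repeating it
   `reps` times so the interval dwarfs the clock resolution. */
double seconds_per_call(void (*run)(void), long reps)
{
    clock_t start = clock();
    for (long i = 0; i < reps; i++)
        run();
    clock_t stop = clock();
    return (double)(stop - start) / CLOCKS_PER_SEC / (double)reps;
}

void noop(void) { }   /* stand-in for the event being timed */
```

The repetition factor should be tuned per instance size so that the total measured interval comfortably exceeds the clock's granularity.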
Shortcomings of asymptotic analysis
Let algorithm A be asymptotically better than algorithm B. Even so, there are common reasons why B may be preferable in practice:
Implementation complexity
Algorithms with better complexity are often (much) more complicated.
This can increase coding time and the constants.
Small input sizes
Asymptotic analysis ignores small input sizes. At small input sizes, constant
factors or low order terms could dominate running time, causing B to
outperform A.
Worst case versus average performance
If A has better worst case performance than B, but the average performance
of B given the expected input is better, then B could be a better choice
than A. Conversely, if the worst case performance of B is unacceptable (say
for life-threatening or mission-critical reasons), A must still be used.
Testing an algorithm
[1] Is it correct? "Does the algorithm do what it is supposed to do?"
If the algorithm produces the correct output, then most likely the algorithm is correct.
This can be checked with test cases consisting of several input samples, verifying that we get the predicted output.
[2] Is it time-efficient?
Does the algorithm work and produce the correct result within an acceptable time?
[3] Is it reliable?
Reliability of an algorithm is its ability to run fault-free for a certain period of time in a given environment.
If your algorithm crashes when overwhelmed with a lot of input data at once, you had better modify or improve it as fast as you can.
Generating Test Data
Generating a data set that results in the worst-case performance
of an algorithm:
Generating a data set that results in the average-case
performance of an algorithm:
Whether we are estimating worst-case or average-time using
random data, the number of instances that we can try is
generally much smaller than the total number of such instances.
Hence, it is desirable to analyze the algorithm being tested to
determine classes of data that should be generated for the
experiment.
Thanks for Your Attention!
Exercises
Problems
1. Suppose a certain algorithm has been empirically shown to sort 100,000
integers or 100,000 floating-point numbers in two seconds. Would we
expect the same algorithm to sort 100,000 strings in two seconds?
Explain.
2. Suppose you need to search a given ordered array of n integers for a
specific value. The entries are sorted in non-decreasing order.
Give a simple algorithm that has a worst-case complexity of Θ(n)
We can do better by using the “binary search” strategy. Explain what
it is and give an algorithm based on it. What is the worst-case
complexity of this algorithm?
3. Explain the meaning of asymptotic bounds and the importance of asymptotic notation. Determine whether an assertion involving asymptotic bounds is valid, e.g.: Is 3n²+3n = O(n)?
4. Compare the growth rates of the two expressions 2^{lg n} and (lg n)².
Problems
Explain why the statement "The running time of algorithm A is at least O(n²)" is meaningless.
Questions
1. How does one calculate the running time of an algorithm?
2. How can we compare two different algorithms?
3. How do we know if an algorithm is `optimal'?
4. When we say an algorithm is O(1) what do we mean?
5. You are given a list of numbers, obtained by rotating a sorted list an unknown number of times. Write a function to determine the minimum number of times the original sorted list was rotated to obtain the given list. Your function should have worst-case complexity O(log N), where N is the length of the list. You can assume that all the numbers in the list are unique.
Example: The list [5, 6, 9, 0, 2, 3, 4] was obtained by rotating the sorted list [0, 2, 3, 4, 5, 6, 9] three times.
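One possible approach to question 5, sketched below (names are our own): in a rotated sorted list of distinct values, the rotation count equals the index of the minimum element, which binary search locates in O(log N) time.

```c
/* Rotation count of a sorted-then-rotated array of distinct ints:
   equals the index of the minimum element, found by binary search. */
int rotation_count(const int *a, int n)
{
    int lo = 0, hi = n - 1;
    while (lo < hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] > a[hi])
            lo = mid + 1;   /* minimum lies strictly right of mid */
        else
            hi = mid;       /* minimum is at mid or to its left  */
    }
    return lo;
}
```

Each iteration halves the search interval, giving the required O(log N) worst case.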
Problem Solving & Algorithm Development, Prof. Bibhudatta Sahoo, 2022 (1/25/2022)
Questions
6. Suppose that the actual time taken to execute an O(N²) sorting algorithm on a very fast machine is 2.2×10⁻⁸ N² seconds; now suppose that the actual time taken to
Properties
Transitivity
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))
Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))
Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
Complementarity
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
[Figure: f(n) plotted against c·g(n) beyond a threshold n₀, illustrating f(n) = Ω(g(n)), f(n) = O(g(n)) and f(n) = Θ(g(n)).]
f(n) = O(g(n)): there exist c > 0 and n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for each n ≥ n₀.
Θ, O, Ω, o, ω
Defined for functions over the natural numbers.
Example: f(n) = Θ(n²).
Describes how f(n) grows in comparison to n².
Each defines a set of functions; in practice they are used to compare the sizes of two functions.
The notations describe different rate-of-growth relations between the defining function and the defined set of functions.
Asymptotic notations
Big oh (O), big omega (Ω), big theta (Θ)
Example complexities
Searching an unordered array, or finding the maximum element in an array: Θ(n), where n is the size of the array.
The matrix multiplication algorithm above: Θ(n³), where each matrix is n×n.