Asymptotic Notation
The main idea of asymptotic analysis is to measure the efficiency of
algorithms in a way that does not depend on machine-specific constants,
and that does not require the algorithms to be implemented or the
running times of programs to be compared.
It is often used to describe how the size of the input data affects an
algorithm’s usage of computational resources. The running time of an
algorithm is described as a function of the input size n, for large n.
• Whenever we want to analyze an algorithm,
we need to calculate the complexity of that algorithm.
• But the calculated complexity of an algorithm does
not give the exact amount of resource required.
• So instead of taking the exact amount of resource, we
represent that complexity in a general form (notation)
which captures the basic nature of that algorithm.
• We use that general form (notation) in the analysis process.
• The asymptotic notation of an algorithm is a
mathematical representation of its complexity.
• Note - In asymptotic notation, when we want
to represent the complexity of an algorithm,
we use only the most significant terms in the
complexity of that algorithm and ignore the least
significant terms (here, complexity can be space
complexity or time complexity).
• For example, consider the following time complexities of two
algorithms...
• Algorithm 1 : 5n² + 2n + 1
• Algorithm 2 : 10n² + 8n + 3
• Generally, when we analyze an algorithm, we consider the time
complexity for larger values of the input size (i.e. the value of 'n').
• In the above two time complexities, for larger values of 'n' the term
'2n + 1' in Algorithm 1 is less significant than the term '5n²', and the
term '8n + 3' in Algorithm 2 is less significant than the term '10n²'.
• Here, for larger values of 'n', the values of the most significant terms
( 5n² and 10n² ) are much larger than the values of the least significant
terms ( 2n + 1 and 8n + 3 ).
• So for larger values of 'n' we ignore the least significant terms when
representing the overall time required by an algorithm. In asymptotic
notation, we use only the most significant term to represent the
time complexity of an algorithm.
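The point above can be checked numerically. The sketch below (my own illustration, not from the slides) compares the full complexity of Algorithm 1, 5n² + 2n + 1, with its most significant term 5n², showing that their ratio approaches 1 as n grows:

```python
# Compare the full time complexity 5n^2 + 2n + 1 with its most
# significant term 5n^2, to show why lower-order terms are ignored
# for large n.

def f(n):
    return 5 * n**2 + 2 * n + 1   # full complexity of Algorithm 1

def dominant(n):
    return 5 * n**2               # most significant term only

for n in [10, 100, 1000, 10000]:
    print(n, f(n) / dominant(n))  # ratio tends to 1 as n grows
```

For n = 10 the ratio is about 1.042, and by n = 10000 it is below 1.0001, so the lower-order terms contribute a vanishing fraction of the total.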
Asymptotic analysis of an algorithm refers to defining a
mathematical framing of its run-time performance.
Usually, the time required by an algorithm falls under three types:
Best Case − the minimum time required for program execution.
Average Case − the average time required for program execution.
Worst Case − the maximum time required for program execution.
Asymptotic analysis is input-bound, i.e., if there is no input to the
algorithm, it is concluded to run in constant time. Other than the
input, all other factors are considered constant.
Asymptotic analysis refers to computing the running time of any
operation in mathematical units of computation.
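The three cases can be made concrete with a small example of my own (not from the slides): linear search, where the number of comparisons depends on where the target sits in the array:

```python
# Linear search, instrumented to count comparisons. The position of
# the target determines which case we observe: first element (best),
# middle (a typical/average case), last or absent (worst).

def linear_search(arr, target):
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

arr = list(range(1, 11))          # [1, 2, ..., 10]
print(linear_search(arr, 1))      # best case: 1 comparison
print(linear_search(arr, 5))      # typical case: 5 comparisons
print(linear_search(arr, 10))     # worst case: 10 comparisons
```

Here the best case costs a single comparison, the worst case costs n comparisons, and the average over all positions is about n/2.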
The Big-O
Big oh (O): Definition: f(n) = O(g(n)) (read as "f of n is big oh of g of n") if
there exist a positive integer n0 and a positive constant c such that
|f(n)| ≤ c|g(n)| for all n ≥ n0. Here g(n) gives an upper bound on the
growth of the function f(n).
Big “oh”: O
The asymptotic running time of an algorithm is defined in terms of
functions.
An upper bound for the function ‘f’ is provided by the big oh notation
(O).
Considering ‘g’ to be a function from the non-negative integers to the
positive real numbers, we define O(g) as the set of functions f such that,
for some real constant c > 0 and some non-negative integer constant n0,
f(n) ≤ c·g(n) for all n ≥ n0.
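As a worked example of the Big-O definition (the function f(n) = 3n + 2 and the witnesses c = 4, n0 = 2 are my own standard choices, not taken from the slides), the sketch below checks the condition f(n) ≤ c·g(n) over a range of n ≥ n0:

```python
# Check the Big-O definition numerically: f(n) = 3n + 2 is O(n)
# because 3n + 2 <= 4n for every n >= 2 (take c = 4 and n0 = 2).

def f(n):
    return 3 * n + 2

def g(n):
    return n

c, n0 = 4, 2
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) = 3n + 2 is O(n) with c =", c, "and n0 =", n0)
```

Note the witnesses are not unique: any larger c (or any c > 3 with a suitably larger n0) would also satisfy the definition.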
Omega: Ω
A lower bound for the function ‘f’ is provided by the
big omega notation (Ω).
Considering ‘g’ to be a function from the non-negative
integers to the positive real numbers, we define Ω(g) as
the set of functions f such that, for some real constant c > 0
and some non-negative integer constant n0,
f(n) = Ω(g(n)) iff there exist positive constants c and n0
such that f(n) ≥ c·g(n) for all n ≥ n0.
• Big - Omega Notation (Ω)
• Big - Omega notation is used to define the lower bound of an
algorithm in terms of time complexity.
• That means Big-Omega notation always indicates the minimum
time required by an algorithm for all input values; it describes
the best case of an algorithm's time complexity.
• Big - Omega notation can be defined as follows...
• Consider a function f(n) as the time complexity of an algorithm and
g(n) as its most significant term. If f(n) >= C·g(n) for all n >= n0, with
C > 0 and n0 >= 1, then we can represent f(n) as Ω(g(n)).
• f(n) = Ω(g(n))
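The Big-Omega condition can be checked the same way (again, f(n) = 3n + 2 and the witnesses c = 3, n0 = 1 are illustrative choices of mine, not from the slides):

```python
# Check the Big-Omega definition numerically: f(n) = 3n + 2 is
# Omega(n) because 3n + 2 >= 3n for every n >= 1 (take c = 3, n0 = 1).

def f(n):
    return 3 * n + 2

c, n0 = 3, 1
assert all(f(n) >= c * n for n in range(n0, 10_000))
print("f(n) = 3n + 2 is Omega(n) with c =", c, "and n0 =", n0)
```

Together with the earlier Big-O check, this shows f(n) = 3n + 2 is bounded both above and below by constant multiples of n for large n.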
• Consider the following graph, drawn for the
values of f(n) and C·g(n), with the input size (n) on the
X-axis and the time required on the Y-axis.