
Big-O and Little-o

The Landau symbols Big-O and little-o make up one of the most useful notations in
analysis. Learn it well and you become more flexible, quicker, and more sure of yourself in a wide
variety of analytic settings.
One concept associated with limits that we weave through this is the notion of eventually.
To understand this, let $P(x)$ be a statement depending on the variable $x$ that can be either true or
false. The statement "as $x \to 0$, eventually $P(x)$" means that $\exists\, \delta > 0$ such that for $|x| < \delta$, $P(x)$
is true. The statement "as $n \to \infty$, eventually $P(n)$" means that $\exists\, N$ such that for $n > N$, $P(n)$
is true. Similar statements can be made for all limit situations. As an example: $\lim_{x \to a} f(x) = L$ iff
$\forall\, \epsilon > 0$, eventually $|f(x) - L| < \epsilon$, where in this case, eventually means $\exists\, \delta > 0$ such that for
$|x - a| < \delta$.
With the concept of eventually understood, we define Big-O and little-o. We can define
these concepts in any limit situation. In what follows, the function $g(x)$ that we are comparing
against ought to be eventually monotone, nonzero, and should be reasonably simple, a concept that we
won't define.
Definition: As $x \to \,?$, $f(x) = o(g(x))$ iff $\lim_{x \to \,?} \dfrac{f(x)}{g(x)} = 0$.

Definition: As $x \to \,?$, $f(x) = O(g(x))$ iff $\exists\, C$ such that eventually $|f(x)| \le C\,|g(x)|$.


The second definition could be rephrased as saying that the fraction $\dfrac{f(x)}{g(x)}$ is eventually
bounded.
Some examples of this notation:
If $m < n$, as $x \to \infty$, $x^m = o(x^n)$.
If $m < n$, as $x \to 0$, $x^n = o(x^m)$.
If $n > 0$, as $x \to \infty$, $\ln(x) = o(x^n)$.
If $n < 0$, as $x \to 0^+$, $\ln(x) = o(x^n)$.
As $x \to \infty$, $\sin x = O(1)$.
As $x \to 0$, $\sin x = O(x)$.
As $x \to 0$, $\sin x = x + O(x^3)$.
As $x \to 0$, $\sin x = x - \dfrac{x^3}{6} + O(x^5)$.
As $n \to \infty$, $\sqrt{n^2 + 1} = O(n)$.
As $n \to \infty$, $\sqrt{n^2 + 1} = n + O\!\left(\dfrac{1}{n}\right)$.
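Several of these expansions can be confirmed with a computer algebra system. A quick sketch, assuming sympy is available:

```python
import sympy as sp

x, n = sp.symbols('x n', positive=True)

# Taylor expansion of sin about 0: matches sin x = x - x**3/6 + O(x**5)
print(sp.series(sp.sin(x), x, 0, 5))

# Asymptotic expansion of sqrt(n**2 + 1) as n -> oo: n + 1/(2n) + ...,
# consistent with sqrt(n**2 + 1) = n + O(1/n)
print(sp.series(sp.sqrt(n**2 + 1), n, sp.oo, 3))
```

The second expansion even identifies the constant hiding inside the $O(1/n)$: it is $\frac{1}{2}$.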
This notation is very closely tied up with differentiation and with Taylor polynomials.
For instance, we can prove that the function $f$ is differentiable at $x$ iff $\exists$ a number, which we call
$f'(x)$, such that as $h \to 0$, $f(x + h) = f(x) + f'(x)h + o(h)$. That is the definition of the derivative
right there. The thing we're calling $o(h)$ is $f(x + h) - f(x) - f'(x)h$, and calling it $o(h)$ is saying
that

$$\lim_{h \to 0} \frac{f(x + h) - f(x) - f'(x)h}{h} = \lim_{h \to 0} \left( \frac{f(x + h) - f(x)}{h} - f'(x) \right) = 0.$$

That's what we get if $f$ is differentiable at $x$. If we have something stronger, in particular that
$f$ has a bounded second derivative in a neighborhood of $x$, then we can say something stronger:
$f(x + h) = f(x) + f'(x)h + O(h^2)$.
In general, if $f$ has $n$ bounded derivatives in a neighborhood of $x$, Taylor's Theorem says

$$f(x + h) = \sum_{k=0}^{n-1} \frac{f^{[k]}(x)}{k!}\, h^k + O(h^n).$$
You'll notice that this last statement is slightly weaker than the full statement of Taylor's
Theorem. We have merely said that the error is no worse than some constant times $h^n$. The full
statement of the theorem gives us some tools for estimating how big that constant is; using big-O
notation essentially announces that we don't care how big the constant is, as long as it's constant.
The reason big-O and little-o make a powerful and flexible notation is that more often than not, we
don't care what the constant is.
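To see that hidden constant concretely, here is a small numerical illustration (my own example, not from the text): for $f(x) = e^x$, the degree-2 Taylor polynomial at $0$ has error $O(h^3)$, and the constant buried in the $O$ turns out to be near $|f'''(0)|/3! = 1/6$.

```python
import math

def taylor2_error_ratio(h):
    # error of the degree-2 Taylor polynomial of e^h, divided by h**3
    poly = 1 + h + h**2 / 2
    return abs(math.exp(h) - poly) / h**3

for h in [1e-1, 1e-2, 1e-3]:
    print(h, taylor2_error_ratio(h))  # ratios stay bounded, near 1/6
```

Big-O only promises that these ratios stay bounded; that they converge to $1/6$ is the extra information the full Taylor remainder supplies.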

Manipulating this notation:


Most of what we have here is common sense. Some useful ideas:

$O(g_1(x)) + O(g_2(x)) = O(\max(g_1(x), g_2(x)))$
$g_1(x) \cdot O(g_2(x)) = O(g_1(x)\, g_2(x))$
$O(g_1(x)) \cdot O(g_2(x)) = O(g_1(x)\, g_2(x))$

We'd like to be able to say that for reasonable functions $w$, $w(O(g(x))) = O(w(g(x)))$, but we
can't always. There's no problem with saying $(O(h))^2 = O(h^2)$, but $e^{O(\ln x)}$ isn't well defined. You
have to be careful there.
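Sympy's Order objects implement this arithmetic directly (with $x \to 0$ as the default limit point, so the smaller power dominates a sum); a sketch, assuming sympy is available:

```python
import sympy as sp

x = sp.symbols('x')

# sympy's O(...) is Big-O as x -> 0 by default
print(sp.O(x) + sp.O(x**2))   # O(x): near 0, the x**2 term is absorbed
print(x * sp.O(x))            # O(x**2)
print(sp.O(x) * sp.O(x**2))   # O(x**3)
```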
You should avoid dividing by big-O or little-o. However, one can make sense of something like
$\dfrac{1}{2 + O(x)}$ by long division: $\dfrac{1}{2 + O(x)} = \dfrac{1}{2} + O(x)$ as $x \to 0$.

Examples:
Example 1: Compute $\lim_{x \to 0} \dfrac{1 - \cos(x^3)}{\sin^2 x \,\sin(x^4)}$.

Technically speaking, you could use L'Hopital's Rule. But I'm not willing to demonstrate that,
nor am I willing to help you out if you're foolish enough to try. But this is in fact very quick, using
the Taylor series of the functions in question: $\cos u = 1 - \dfrac{u^2}{2} + O(u^4)$ and $\sin u = u + O(u^3)$. Hence,

$$\frac{1 - \cos(x^3)}{\sin^2 x \,\sin(x^4)} = \frac{1 - \left(1 - \frac{x^6}{2} + O(x^{12})\right)}{(x + O(x^3))^2\,(x^4 + O(x^{12}))} = \frac{\frac{x^6}{2} + O(x^{12})}{(x^2 + O(x^4))(x^4 + O(x^{12}))} = \frac{\frac{x^6}{2} + O(x^{12})}{x^6 + O(x^8)} = \frac{\frac{1}{2} + O(x^6)}{1 + O(x^2)},$$

which clearly has the limit $\dfrac{1}{2}$. To be honest, we spent more effort keeping track of exponents there
than we needed to. This is a problem that a reasonably experienced person can work in his or her
head.
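As a sanity check (my own addition, not part of the argument), the limit can be confirmed numerically by evaluating the ratio at small $x$:

```python
import math

def f(x):
    # the ratio from Example 1
    return (1 - math.cos(x**3)) / (math.sin(x)**2 * math.sin(x**4))

for x in [0.2, 0.1, 0.05]:
    print(x, f(x))   # values approach 1/2
```

(Pushing $x$ much smaller than this runs into floating-point cancellation in $1 - \cos(x^3)$, which is itself a good advertisement for doing the problem symbolically.)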
Example 2: If $f(x) = \sqrt{\cos(x)}$, find $f''(0)$ and $f^{[4]}(0)$.

That sounds simple enough, but it can get messy. Let's find the fourth degree Taylor polynomial
instead, using the cosine series and the binomial series. The binomial series tells us that

$$(1 + u)^\alpha = 1 + \alpha u + \frac{\alpha(\alpha - 1)}{2}\, u^2 + O(u^3)$$

as $u \to 0$. (Note: if we need more terms than this, we'll find out, and can return to this point to
get more terms.)

$$\sqrt{\cos x} = \left(1 - \frac{x^2}{2} + \frac{x^4}{24} + O(x^6)\right)^{1/2}$$
$$= 1 + \frac{1}{2}\left(-\frac{x^2}{2} + \frac{x^4}{24} + O(x^6)\right) + \frac{\frac{1}{2}\left(\frac{1}{2} - 1\right)}{2}\left(-\frac{x^2}{2} + O(x^4)\right)^2 + O(x^6)$$
$$= 1 - \frac{x^2}{4} + \frac{x^4}{48} + O(x^6) - \frac{1}{8}\cdot\frac{x^4}{4} + O(x^6) + O(x^6)$$
$$= 1 - \frac{x^2}{4} + \left(\frac{1}{48} - \frac{1}{32}\right)x^4 + O(x^6)$$
$$= 1 - \frac{x^2}{4} - \frac{x^4}{96} + O(x^6)$$

We can now read off that $f''(0) = 2!\left(-\dfrac{1}{4}\right) = -\dfrac{1}{2}$ and $f^{[4]}(0) = 4!\left(-\dfrac{1}{96}\right) = -\dfrac{1}{4}$.
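These values can be double-checked symbolically; a sketch, assuming sympy is available:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sqrt(sp.cos(x))

# Series should match 1 - x**2/4 - x**4/96 + O(x**6)
print(sp.series(f, x, 0, 6))

# Read the derivatives off directly
print(sp.diff(f, x, 2).subs(x, 0))   # -1/2
print(sp.diff(f, x, 4).subs(x, 0))   # -1/4
```

Of course, differentiating $\sqrt{\cos x}$ four times by hand is exactly the mess the Taylor-series approach avoids.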

" Z 
#n

1
cos x 1
Example 3: Find lim
1+
dx
n 0
n
This will provide a good workout for our calculus methods. This is a 1 indeterminate case,
so we start by taking the logarithm.
For small $u$, $(1 + u)^{-1} = 1 - u + u^2 + O(u^3)$ (geometric series).

Thus, $\left(1 + \dfrac{\cos 2\pi x}{\sqrt{n}}\right)^{-1} = 1 - \dfrac{\cos 2\pi x}{\sqrt{n}} + \dfrac{\cos^2 2\pi x}{n} + O(n^{-3/2})$.
$$\int_0^1 \left(1 + \frac{\cos 2\pi x}{\sqrt{n}}\right)^{-1} dx = \int_0^1 \left(1 - \frac{\cos 2\pi x}{\sqrt{n}} + \frac{\cos^2 2\pi x}{n} + O(n^{-3/2})\right) dx = 1 + 0 + \frac{1}{2n} + O(n^{-3/2}).$$

For small $u$, $\ln(1 + u) = u + O(u^2)$. Thus,

$$n \ln\left[\int_0^1 \left(1 + \frac{\cos 2\pi x}{\sqrt{n}}\right)^{-1} dx\right] = n \ln\left(1 + \frac{1}{2n} + O(n^{-3/2})\right) = n\left(\frac{1}{2n} + O(n^{-3/2}) + O(n^{-2})\right) = \frac{1}{2} + O(n^{-1/2}).$$

This goes to $\dfrac{1}{2}$, so our original limit is $e^{1/2} = \sqrt{e}$.
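A numerical sanity check (my own addition): taking the integrand to be $\left(1 + \cos(2\pi x)/\sqrt{n}\right)^{-1}$, approximate the inner integral with a midpoint rule at a large fixed $n$ and raise it to the $n$-th power; the result should sit near $\sqrt{e} \approx 1.6487$.

```python
import math

def bracket(n, m=4096):
    # midpoint-rule approximation of the inner integral over [0, 1]
    total = 0.0
    for k in range(m):
        x = (k + 0.5) / m
        total += 1.0 / (1.0 + math.cos(2 * math.pi * x) / math.sqrt(n))
    return total / m

n = 10_000
value = bracket(n) ** n
print(value, math.sqrt(math.e))   # both near 1.6487
```

The integrand is smooth and periodic on $[0,1]$, so the midpoint rule is extremely accurate here; the dominant error comes from $n$ being finite, and it shrinks like the $O(n^{-1/2})$ term in the derivation.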

Exercises:

1. Does the series $\sum_{n=1}^{\infty} n^3\left[\sin(\arctan(n^{-1})) - \arctan(\sin(n^{-1}))\right]$ converge or diverge?

2. Suppose that $f$ is defined on an interval, and $\forall\, y$ in that interval, as $x \to y$,
$f(x) - f(y) = o(|x - y|)$. Prove that $f$ is constant.
3. In all parts of this problem, assume that $f \in C^4$ (that is, has four continuous derivatives).

(a) Show that $\dfrac{f(x + h) - f(x)}{h} = f'(x) + O(h)$ as $h \to 0$.

(b) Show that $\dfrac{f(x + h) - f(x - h)}{2h} = f'(x) + O(h^2)$ as $h \to 0$.

(c) Show that $\dfrac{f(x + h) - 2f(x) + f(x - h)}{h^2} = f''(x) + O(h^2)$ as $h \to 0$.
 

4. True or false: if $a_n = o\!\left(\dfrac{1}{n}\right)$ as $n \to \infty$, then $\sum_{n=1}^{\infty} a_n$ converges. If true, prove. If false, give a
counterexample.
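The error rates claimed in Exercise 3 can be watched numerically (this is an illustration, not a proof; the choice $f(x) = e^x$ at $x = 1$ is my own):

```python
import math

x0, exact = 1.0, math.e   # f(x) = e^x, so f'(1) = e

def forward(h):
    # one-sided quotient from 3(a): error O(h)
    return (math.exp(x0 + h) - math.exp(x0)) / h

def central(h):
    # symmetric quotient from 3(b): error O(h**2)
    return (math.exp(x0 + h) - math.exp(x0 - h)) / (2 * h)

for h in [1e-2, 1e-3]:
    # forward error shrinks by ~10x per step, central by ~100x
    print(h, forward(h) - exact, central(h) - exact)
```

Shrinking $h$ by a factor of 10 cuts the forward-difference error by about 10 and the central-difference error by about 100, exactly the $O(h)$ versus $O(h^2)$ behavior the exercise asks you to prove.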
