Unit 3 - Bounds and Inequalities
Markov Inequality
Theorem
If X is any nonnegative random variable, then
P(X ≥ c) ≤ E(X)/c, for any c > 0.    (1)
Proof (X is discrete):
E(X) = Σ_x x P(X = x)
     = Σ_{x≥c} x P(X = x) + Σ_{x<c} x P(X = x)
     ≥ Σ_{x≥c} x P(X = x)        (the second sum is non-negative, since X ≥ 0)
     ≥ Σ_{x≥c} c P(X = x)        (since x ≥ c over the first sum)
     = c Σ_{x≥c} P(X = x)
     = c P(X ≥ c).
Hence,
P(X ≥ c) ≤ E(X)/c.
Proof (X is continuous):
E(X) = ∫_{−∞}^{∞} x f(x) dx
     = ∫_0^∞ x f(x) dx          (since X is non-negative)
     ≥ ∫_c^∞ x f(x) dx          (dropping the part over [0, c), for any c > 0)
     ≥ ∫_c^∞ c f(x) dx          (since x ≥ c over the region of integration)
     = c ∫_c^∞ f(x) dx
     = c P(X ≥ c).
Hence,
P(X ≥ c) ≤ E(X)/c.
Other form of Markov’s inequality
Taking c = k E(X) in (1) bounds the probability that X is at least k times its mean:
P(X ≥ k E(X)) ≤ 1/k, for any k > 0.    (2)
Example 1
Suppose that the average grade on the upcoming math exam is 70%.
Give an upper bound of the proportion of the students who score at
least 90%.
P(X ≥ 90) ≤ E(X)/90 = 70/90 = 7/9.
Example 2
Suppose X ∼ Binomial(100, 1/2). Find an upper bound on P(X ≥ 75).
P(X ≥ 75) ≤ E(X)/75 = 50/75 = 2/3.
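The bound in Example 2 can be checked by simulation. A minimal sketch (the Monte Carlo setup below is our illustration, not part of the notes):

```python
import random

# X ~ Binomial(100, 1/2); Markov's inequality gives P(X >= 75) <= E(X)/75 = 2/3.
random.seed(0)
trials = 20_000
hits = sum(
    sum(random.random() < 0.5 for _ in range(100)) >= 75
    for _ in range(trials)
)
empirical = hits / trials
markov_bound = 50 / 75
print(f"empirical P(X >= 75) = {empirical:.5f}, Markov bound = {markov_bound:.4f}")
```

The bound holds, but it is extremely loose here: the true tail probability is many orders of magnitude below 2/3, because Markov's inequality uses only the mean.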
General form
If U (X) is a function of the random variable X, then
P(U(X) ≥ c) ≤ E(U(X))/c, for any c > 0.    (3)
Chebyshev’s Inequality
Theorem
If X is a random variable with E(X) = µ and Var(X) = σ², then
P{|X − µ| ≥ c} ≤ σ²/c², for any c > 0,    (4)
or,
P{|X − µ| < c} ≥ 1 − σ²/c², for any c > 0.    (5)
Alternative forms
Putting c = kσ, where k > 0, in (4) and (5) gives
P{|X − µ| ≥ kσ} ≤ 1/k²,    (6)
or,
P{|X − µ| < kσ} ≥ 1 − 1/k².    (7)
Example 3
If X is a RV with E(X) = 3 and E(X²) = 13, find a lower bound for P(−2 < X < 8).
σ² = E(X²) − {E(X)}² = 13 − 9 = 4
P(−2 < X < 8) = P(|X − 3| < 5) ≥ 1 − σ²/5² = 1 − 4/25 = 21/25.
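Chebyshev's inequality uses only the first two moments, so any distribution with E(X) = 3 and Var(X) = 4 obeys the bound. As an illustration (the choice of Normal(3, 2) is ours, not in the notes), the exact probability can be compared with 21/25:

```python
from math import erf, sqrt

# For X ~ Normal(3, 2): P(-2 < X < 8) = P(|Z| < 5/2) = erf(2.5 / sqrt(2)).
exact = erf((5 / 2) / sqrt(2))
bound = 1 - 4 / 25          # Chebyshev lower bound 21/25 = 0.84
print(f"exact = {exact:.4f} >= bound = {bound:.2f}")
```

For the normal case the exact probability is about 0.988, comfortably above the distribution-free bound 0.84.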
Example 4
Two dice are thrown once. If X is the sum of the numbers showing up, prove
that P(|X − 7| ≥ 3) ≤ 35/54, and compare this bound with the exact
probability.
X    : 2     3     4     5     6     7     8     9     10    11    12
P(X) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
E(X) = Σ_x x P(X = x) = 7
E(X²) = Σ_x x² P(X = x) = 1974/36
Var(X) = σ² = 1974/36 − 7² = 35/6
By Chebyshev’s inequality,
P(|X − 7| ≥ 3) ≤ (35/6)/3² = 35/54 ≈ 0.6481.
Exact probability:
P(|X − 7| ≥ 3) = P(X ≤ 4) + P(X ≥ 10) = 6/36 + 6/36 = 1/3 ≈ 0.3333,
well below the Chebyshev bound 35/54.
For the sample mean X̄ = (X1 + X2 + ··· + Xn)/n, Chebyshev's inequality reads
P(|X̄ − µ| < c) ≥ 1 − σ²/(nc²), i.e., P(µ − c < X̄ < µ + c) ≥ 1 − σ²/(nc²).
If X ∼ Exponential(1), so that µ = 1 and σ² = 1, then
P(|X − µ| > c) ≤ σ²/c² ⟹ P(|X − 1| > 2) ≤ 1/4.
Actual probability: P(|X − 1| > 2) = P(X > 3) = e^{−3} ≈ 0.0498.
Weak Law of Large Numbers
Convergence in Probability
If P{|Xn − X| > ε} → 0 as n → ∞ for every ε > 0, then the sequence of RVs
{Xn} is said to converge to X in probability.
Applying Chebyshev's inequality to the sample mean X̄ = (X1 + ··· + Xn)/n,
P(|X̄ − µ| > ε) ≤ σ²/(nε²)
⟹ P(|X̄ − µ| > ε) → 0 as n → ∞,
i.e., X̄ converges to µ in probability.
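The weak law can be seen numerically. A small simulation sketch (the Uniform(0,1) setup is our illustration): for µ = 1/2, the fraction of runs in which |X̄ − µ| exceeds ε shrinks as n grows.

```python
import random

# For each n, estimate P(|Xbar - 0.5| > eps) over 500 independent runs.
random.seed(1)
eps, runs = 0.05, 500
freqs = []
for n in (10, 100, 1000):
    bad = sum(
        abs(sum(random.random() for _ in range(n)) / n - 0.5) > eps
        for _ in range(runs)
    )
    freqs.append(bad / runs)
print(freqs)   # deviation frequencies decreasing toward 0
```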
Central Limit Theorem
Let X1, X2, X3, …, Xn, … be a sequence of independent and identically
distributed random variables with E(Xi) = µ < ∞ and 0 < Var(Xi) = σ² < ∞.
Then, for large n, Sn = X1 + X2 + ··· + Xn is approximately normally
distributed with mean nµ and variance nσ², i.e., Sn ≈ N(nµ, nσ²).
Corollary: the sample mean X̄ = Sn/n is approximately N(µ, σ²/n).
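A quick numerical sketch of the theorem (our illustration, using Uniform(0,1) summands): Sn has mean n/2 and variance n/12, so the simulated probability P(Sn ≤ n/2 + √(n/12)) should be close to Φ(1).

```python
import random
from math import erf, sqrt

# Simulate sums of n Uniform(0,1) draws and compare a tail probability
# against the standard normal CDF value Phi(1).
random.seed(2)
n, trials = 48, 5000
thresh = n / 2 + sqrt(n / 12)        # one standard deviation above the mean
frac = sum(
    sum(random.random() for _ in range(n)) <= thresh for _ in range(trials)
) / trials
phi1 = 0.5 * (1 + erf(1 / sqrt(2)))  # Phi(1) = 0.8413
print(f"simulated {frac:.4f} vs Phi(1) = {phi1:.4f}")
```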
Example 7
Let X1, X2, …, X10 be independent Poisson random variables, each with
mean 1. Using the CLT, find P(X1 + X2 + ··· + X10 ≥ 15).
Here S10 = X1 + ··· + X10 has mean 10 and variance 10. By the CLT,
P(S10 ≥ 15) ≈ P(Z ≥ (15 − 10)/√10) = P(Z ≥ 1.58) = 1 − 0.9429 = 0.0571.
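Example 7 can be worked both ways: since a sum of independent Poisson variables is exactly Poisson(10), the exact tail is available for comparison with the CLT approximation (the numerical comparison below is our addition):

```python
from math import erf, exp, factorial, sqrt

# CLT approximation: P(S >= 15) ~ P(Z >= (15 - 10)/sqrt(10)).
z = (15 - 10) / sqrt(10)
clt = 1 - 0.5 * (1 + erf(z / sqrt(2)))
# Exact tail of Poisson(10): 1 - P(S <= 14).
exact = 1 - sum(exp(-10) * 10 ** k / factorial(k) for k in range(15))
print(f"CLT approx = {clt:.4f}, exact Poisson tail = {exact:.4f}")
```

The plain CLT value undershoots the exact tail; a continuity correction (using 14.5 instead of 15) would bring the approximation closer.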
Example 9
The lifetime of a certain brand of electric bulb may be considered a random
variable with mean 1200 hours and standard deviation 250 hours. Using the
central limit theorem, find the probability that the average lifetime of 60
bulbs exceeds 1250 hours.
Here n = 60, E(Xi) = 1200 and SD(Xi) = 250 (i = 1, 2, …, 60). Let
X̄ = (X1 + X2 + ··· + X60)/60. By the above corollary, X̄ ∼ N(1200, 250²/60).
Now
P(X̄ > 1250) = P( (X̄ − 1200)/(250/√60) > (1250 − 1200)/(250/√60) )
            = P(Z > 1.55)
            = 0.0606.
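The arithmetic of Example 9 can be reproduced with the standard normal CDF expressed via the error function:

```python
from math import erf, sqrt

# Xbar ~ N(1200, 250^2/60); standardise and evaluate the upper tail.
z = (1250 - 1200) / (250 / sqrt(60))
p = 1 - 0.5 * (1 + erf(z / sqrt(2)))   # P(Z > z)
print(f"z = {z:.2f}, P(Xbar > 1250) = {p:.4f}")
```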
Example 10
A random sample of size 100 is taken from a population whose mean is 60 and
variance is 400. Using the central limit theorem, find the probability that
the sample mean will not differ from the population mean by more than 4.
Here n = 100, E(Xi) = 60 and Var(Xi) = 400 (i = 1, 2, …, 100). Let
X̄ = (X1 + X2 + ··· + X100)/100. By the above corollary, X̄ ∼ N(60, 4). Now
P(|X̄ − 60| ≤ 4) = P(|Z| ≤ 4/2) = P(|Z| ≤ 2) = 0.9544.
Strong Law of Large Numbers
With probability 1,
(X1 + X2 + ··· + Xn)/n → µ as n → ∞,
i.e., P( lim_{n→∞} (X1 + X2 + ··· + Xn)/n = µ ) = 1.
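In contrast to the weak-law simulation, the almost-sure statement concerns a single trajectory. A one-run sketch (our illustration with Uniform(0,1) draws, µ = 1/2):

```python
import random

# Track the running mean of one long stream of Uniform(0,1) draws.
random.seed(4)
total, snapshots = 0.0, {}
for n in range(1, 100_001):
    total += random.random()
    if n in (100, 10_000, 100_000):
        snapshots[n] = total / n
print(snapshots)   # running mean settling near 0.5
```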
One Sided Chebyshev’s Inequality
P(X ≥ µ + c) ≤ σ²/(σ² + c²), for any c > 0,
P(X ≤ µ − c) ≤ σ²/(σ² + c²), for any c > 0.
Example 11 (Markov inequality weaker than one-sided Chebyshev’s
inequality)
If the number of items produced in a factory during a week is a
random variable with mean 100 and variance 400, compute an upper
bound on the probability that this week’s production will be at least
120.
From the one-sided Chebyshev inequality:
P(X ≥ 120) = P(X − 100 ≥ 20) ≤ 400/(400 + 20²) = 1/2.
From Markov's inequality:
P(X ≥ 120) ≤ E(X)/120 = 100/120 = 5/6,
which is far weaker than the one-sided Chebyshev bound.
Cauchy-Schwarz Inequality
For any two random variables X and Y, we have
|E(XY)| ≤ √(E(X²) E(Y²)).
Example 12
Using the Cauchy-Schwartz inequality, show that for any two random
variables X and Y
|ρ(X, Y )|≤ 1.
Let U = (X − E(X))/σ_X and V = (Y − E(Y))/σ_Y.
Then E(U) = E(V) = 0 and Var(U) = Var(V) = 1, so E(U²) = E(V²) = 1. Since
ρ(X, Y) = E{(X − E(X))(Y − E(Y))}/(σ_X σ_Y) = E(UV),
applying the Cauchy-Schwarz inequality to U and V gives
|E(UV)| ≤ √(E(U²) E(V²)) = 1 ⟹ |ρ(X, Y)| ≤ 1.
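An empirical companion to Example 12 (the toy data below is our assumption): whatever paired sample is used, the sample correlation coefficient lands in [−1, 1].

```python
import random
from math import sqrt

# Generate linearly related noisy data and compute the sample correlation.
random.seed(3)
x = [random.gauss(0, 1) for _ in range(2000)]
y = [2 * a + random.gauss(0, 1) for a in x]
mx, my = sum(x) / len(x), sum(y) / len(y)
sx = sqrt(sum((a - mx) ** 2 for a in x) / len(x))
sy = sqrt(sum((b - my) ** 2 for b in y) / len(y))
rho = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) * sx * sy)
print(f"rho = {rho:.3f}")
```

For this construction the population correlation is 2/√5 ≈ 0.894, and the sample value stays within the Cauchy-Schwarz range.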
Chernoff Bounds
If M(t) = E(e^{tX}) is the moment generating function of the random variable
X, then
P(X ≥ a) ≤ e^{−ta} M(t) for all t > 0,
P(X ≤ a) ≤ e^{−ta} M(t) for all t < 0,
and the tightest bound is obtained by minimising e^{−ta} M(t) over t.
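Revisiting the tail from Example 2 shows how much sharper Chernoff bounds are (the numerical minimisation below is our worked illustration): X ∼ Binomial(100, 1/2) has M(t) = (1/2 + e^t/2)^100.

```python
from math import exp

# Minimise e^{-75 t} M(t) over a grid of t > 0 for X ~ Binomial(100, 1/2).
def bound(t):
    return exp(-75 * t) * (0.5 + 0.5 * exp(t)) ** 100

best = min(bound(k / 1000) for k in range(1, 3001))
print(f"optimised Chernoff bound on P(X >= 75): {best:.2e}")
```

Compare with the Markov bound 2/3 from Example 2: because the Chernoff bound exploits the entire moment generating function rather than just the mean, it is smaller by several orders of magnitude.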
Jensen’s Inequality
If f(x) is a convex function, then
E(f(X)) ≥ f(E(X)),
provided the expectations exist.
Example 14
Show that the variance of a random variable X is non-negative.
We know that f (x) = x2 is a convex function. Now from Jensen’s
inequality we get
E(X 2 ) ≥ {E(X)}2
=⇒ Var(X) = E(X 2 ) − {E(X)}2 ≥ 0
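A quick discrete check of Example 14 with f(x) = x² (the toy distribution is our assumption): the Jensen gap E(X²) − {E(X)}² is exactly the variance.

```python
from fractions import Fraction

# A small discrete distribution on {1, 2, 3, 4} with exact arithmetic.
xs = [1, 2, 3, 4]
ps = [Fraction(1, 10), Fraction(2, 10), Fraction(3, 10), Fraction(4, 10)]
mean = sum(p * x for p, x in zip(ps, xs))            # E(X)
second = sum(p * x * x for p, x in zip(ps, xs))      # E(X^2)
print(mean, second, second - mean ** 2)   # 3 10 1
```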