Title : Probability Theory and Random Processes
Course Code : 07B41MA106 (3-1-0), 4 Credits
Pre-requisite : Nil
Objectives :
To study
● Probability: its applications in studying the outcomes of random experiments
● Random variables: types, characteristics, modeling random data
● Stochastic systems: their reliability
● Random Processes: types, properties and characteristics with special
reference to signal processing and trunking theory.
Evaluation Components (weightage in percent):
Teacher Assessment (based on assignments, quizzes, attendance etc.): 25
T1 (1 hour)
Reference Material :
1. T. Veerarajan. Probability, Statistics and Random Processes. Tata McGraw-Hill.
2. J. I. Aunon & V. Chandrasekhar. Introduction to Probability and Random Processes. McGraw-Hill International Ed.
3. A. Papoulis & S. U. Pillai. Probability, Random Variables and Stochastic Processes. Tata McGraw-Hill.
4. Stark, H. and Woods, J. M. Probability and Random Processes with Applications to Signal Processing.
Origins of Probability
When two dice are rolled, the sample space is
S = {(i, j) | i, j = 1, 2, 3, 4, 5, 6}
More Probability Definitions
• A subset (say E) of the sample space is called an event. In other words, events are sets of outcomes. For example, "the dice total 7" is the event
E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}
• Union: E ∪ F
• Classical definition: p(E) = n(E)/n(S)
Probability as a Frequency
Pr(heads) ≈ lim (T → ∞) (number of heads)/T
• Consider a random experiment with possible outcomes
w1, w2, …,wn. For example, we roll a die and the possible
outcomes are 1,2,3,4,5,6 corresponding to the side that
turns up. Or we toss a coin with possible outcomes H
(heads) or T (tails).
• We assign a probability p(wj) to each possible outcome wj
in such a way that:
p(w1) + p(w2) +… + p(wn) = 1
• For the die, each outcome has probability 1/6. For the coin, each outcome has probability ½.
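As a quick numerical illustration of the frequency interpretation, a minimal Python sketch (the trial count is arbitrary; not part of the original notes):

```python
import random

# Estimate Pr(heads) for a fair coin as a long-run relative frequency.
T = 100_000                                   # number of tosses (assumed large)
heads = sum(random.random() < 0.5 for _ in range(T))
print(heads / T)                              # approaches 0.5 as T grows
```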
Example
To find the probability that a spare part produced
by a machine is defective.
Example
• Rolling a die is a random experiment.
• The outcomes are: 1, 2, 3, 4, 5, and 6. Suppose the die is
“loaded” so that 3 appears twice as often as every other number.
All other numbers are equally likely. Then to figure out the
probabilities, we need to solve:
p(1) + p(2) + p(3) + p(4) + p(5) + p(6) = 1 and p(3) = 2*p(1)
and p(1) = p(2) = p(4) = p(5) = p(6).
Solving, we get p(1) = p(2) = p(4) = p(5) = p(6) = 1/7 and p(3) = 2/7.
• One event is “odd numbers”, which consists of outcomes 1, 3,
and 5. The probability of this event is:
p(odd) = p(1) + p(3) + p(5) = 1/7 + 2/7 + 1/7 = 4/7.
Theorem 1
The probability of the impossible event is zero,
i.e. , if Φ is the subset (event) containing no sample
point, P( Φ ) = 0.
Proof
The certain event S and the impossible event
Φ are mutually exclusive.
Hence P(S ∪ Φ ) = P(S) + P( Φ )
But S ∪ Φ = S.
∴ P(S) = P(S) + P( Φ )
∴ P( Φ ) = 0
Theorem
If A and B are any 2 events,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B)
Proof:
A is the union of the mutually exclusive events AB̄ and AB, and B is the union of the mutually exclusive events AB and ĀB.
∴ P(A) = P(AB̄) + P(AB)
& P(B) = P(AB) + P(ĀB)
∴ P(A) + P(B) = [P(AB̄) + P(AB) + P(ĀB)] + P(AB)
= P(A ∪ B) + P(A ∩ B)
Hence proved.
[Venn diagram: S split into the disjoint regions AB̄, AB, ĀB across the events A and B]
Conditional Probability
p( E1 E2 E3 ...En )
= p( E1 ) p( E2 | E1 ) p( E3 | E1 E2 ) ...p( En | E1 E2 ...En−1 )
Example.
Deal a 52-card deck into four 13-card hands and define the events Ei = "hand i contains exactly one ace" (i = 1, 2, 3, 4).
p(E1) = C(4,1)C(48,12)/C(52,13)
p(E2|E1) = C(3,1)C(36,12)/C(39,13)
p(E3|E1E2) = C(2,1)C(24,12)/C(26,13)
p(E4|E1E2E3) = 1
p(E1E2E3E4) = p(E1) p(E2|E1) p(E3|E1E2) p(E4|E1E2E3) ≈ 0.1055
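A minimal Python check of this chain-rule product, using only the combination counts above:

```python
from math import comb

# P(E1): the first 13-card hand holds exactly one of the 4 aces.
p1 = comb(4, 1) * comb(48, 12) / comb(52, 13)
# P(E2 | E1): the next hand takes one of the remaining 3 aces from 39 cards.
p2 = comb(3, 1) * comb(36, 12) / comb(39, 13)
# P(E3 | E1 E2): one of the last 2 aces from 26 cards.
p3 = comb(2, 1) * comb(24, 12) / comb(26, 13)
# P(E4 | E1 E2 E3) = 1: the last ace must land in the last hand.
print(p1 * p2 * p3)                           # ≈ 0.1055
```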
Let E and F be events. We can express E as
E = EF ∪ EF^c
Therefore, we have
p(E) = p(EF) + p(EF^c).
Independence: E and F are independent if
p(EF) = p(E)p(F)
E, F and G are mutually independent if
p(EFG) = p(E)p(F)p(G)
p(EF) = p(E)p(F)
p(EG) = p(E)p(G)
p(FG) = p(F)p(G)
Example.
p(F) = 1/6
p(G) = 1/6
p(EF) = 1/36
p(FG) = 1/36
p(EFG) = 1/36
Theorem of Total Probability
Proof
Let B1, B2, ...., Bk be a partition of the sample space S, i.e.,
(i) Bi ∩ Bj = φ ∀ i ≠ j,
(ii) ∪ (i = 1 to k) Bi = S,
(iii) P(Bi) > 0 ∀ i.
[Venn diagram: S partitioned into B1, B2, B3, B4, with the event A overlapping the Bi]
Let A be the event associated with S. Then
(A ∩ B1) ∪ (A ∩ B2) ∪ ....... ∪ (A ∩ Bk)
= A ∩ (B1 ∪ B2 ∪ .... ∪ Bk)   (by the distributive law)
= A ∩ S = A
Also all the events (A ∩ B1), (A ∩ B2), ......., (A ∩ Bk) are pairwise mutually exclusive. For,
(A ∩ Bi) ∩ (A ∩ Bj) = A ∩ (Bi ∩ Bj) = A ∩ φ = φ, i ≠ j.
Then P(A) = P(A ∩ B1) + P(A ∩ B2) + ... + P(A ∩ Bk)
However each term P(A ∩ Bj) may be expressed as P(A/Bj) P(Bj) & hence we obtain the theorem on total probability:
P(A) = P(A/B1) P(B1) + P(A/B2) P(B2) + ... + P(A/Bk) P(Bk)
= Σ (j = 1 to k) P(A/Bj) P(Bj)
Bayes' Theorem
P(Bi ∩ A) = P(Bi) × P(A/Bi) = P(A) × P(Bi/A)
∴ P(Bi/A) = P(Bi) × P(A/Bi) / P(A)
= P(Bi) × P(A/Bi) / Σ (j = 1 to k) P(Bj) × P(A/Bj)
Example.
A student answers a multiple-choice question with m choices. Let K = "the student knows the answer", with p(K) = p, and C = "the answer is correct"; a student who does not know guesses, so p(C|~K) = 1/m. Then
p(K|C) = p(C|K) p(K) / [p(C|K) p(K) + p(C|~K) p(~K)]
= p / [p + (1 − p)(1/m)]
= mp / [1 + (m − 1)p]
Example.
One of two coins is chosen at random and flipped twice; coin B shows heads with probability 3/4 and coin A with probability 1/4. Let
C = coin B is chosen
H = both flips show heads
p(C|H) = p(H|C) p(C) / [p(H|C) p(C) + p(H|~C) p(~C)]
= (3/4 × 3/4 × 1/2) / (3/4 × 3/4 × 1/2 + 1/4 × 1/4 × 1/2)
= 9/10 = 0.9
Example.
A laboratory test is 95 percent correct in detecting a certain disease when the disease is actually present. However, the test also yields a false positive result for 1 percent of the healthy people tested. If 0.5 percent of the population has the disease, what is the probability that a person has the disease given that his test result is positive?
Solution:
With D = "has the disease" and E = "tests positive",
p(D|E) = p(E|D) p(D) / [p(E|D) p(D) + p(E|~D) p(~D)]
= (0.95)(0.005) / [(0.95)(0.005) + (0.01)(0.995)]
= 95/294 ≈ 0.323
Example.
A suspect is believed 60 percent guilty. Suppose now a new piece of evidence shows the criminal is left-handed. If 20 percent of the population is left-handed, and it turns out that the suspect is also left-handed, then does this change the guilty probability of the suspect? By how much?
Solution:
With G = "guilty" and LH = "left-handed",
p(G|LH) = p(LH|G) p(G) / [p(LH|G) p(G) + p(LH|~G) p(~G)]
= (1.0)(0.6) / [(1.0)(0.6) + (0.2)(0.4)]
= 60/68 ≈ 0.88
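Both posteriors follow the same two-hypothesis Bayes formula, so a minimal Python sketch covers both worked examples (function name is illustrative only):

```python
def posterior(prior, like_h, like_not_h):
    """P(H | E) by Bayes' rule for one hypothesis H and evidence E."""
    num = like_h * prior
    return num / (num + like_not_h * (1 - prior))

# Disease example: P(D)=0.005, P(E|D)=0.95, P(E|~D)=0.01.
print(posterior(0.005, 0.95, 0.01))   # ≈ 0.323
# Suspect example: P(G)=0.6, P(LH|G)=1.0, P(LH|~G)=0.2.
print(posterior(0.6, 1.0, 0.2))       # ≈ 0.88
```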
Random Variable
Definition:
A random variable (RV) is a function X : S → R
that assigns a real number X(s) to every element
s ∈ S,where S is the sample space corresponding to
a random experiment E.
A discrete RV X taking values x_i with probabilities p_i = P(X = x_i) must satisfy
(i) p_i ≥ 0 ∀ i, and
(ii) Σ_i p_i = 1
Example of a Discrete PMF
• Suppose that 10% of all households have
no children, 30% have one child, 40%
have two children, and 20% have three
children.
• Select a household at random and let X =
number of children.
• What is the pmf of X?
Example of a Discrete PMF
• We may list each value.
– P(X = 0) = 0.10
– P(X = 1) = 0.30
– P(X = 2) = 0.40
– P(X = 3) = 0.20
Example of a Discrete PMF
• Or we may present it as a chart.
x P(X = x)
0 0.10
1 0.30
2 0.40
3 0.20
Example of a Discrete PMF
• Or we may present it as a stick graph.
[stick graph of P(X = x) against x = 0, 1, 2, 3]
Example of a Discrete PMF
• Or we may present it as a histogram.
[histogram of P(X = x) over x = 0, 1, 2, 3]
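A minimal matplotlib sketch (library assumed available; values taken from the children example above) that draws both presentations:

```python
import matplotlib.pyplot as plt

x = [0, 1, 2, 3]
p = [0.10, 0.30, 0.40, 0.20]                  # pmf from the example

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.stem(x, p)                                # stick graph of the pmf
ax1.set(title="Stick graph", xlabel="x", ylabel="P(X = x)")
ax2.bar(x, p, width=1.0, edgecolor="black")   # histogram-style bars
ax2.set(title="Histogram", xlabel="x")
plt.show()
```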
Example
The pmf of a random variable X is given by
p(i) = c λ^i / i!, i = 0, 1, 2, ....., where λ is some positive value.
Find (i) P(X = 0), (ii) P(X > 2).
Solution.
Since Σ (i = 0 to ∞) p(i) = 1, we have c Σ (i = 0 to ∞) λ^i/i! = 1.
As e^λ = Σ (i = 0 to ∞) λ^i/i!, we have c e^λ = 1, i.e. c = e^(−λ).
Hence P(X = 0) = e^(−λ) λ^0 / 0! = e^(−λ)
P(X > 2) = 1 − P(X ≤ 2)
= 1 − [e^(−λ) + λ e^(−λ) + (λ²/2) e^(−λ)]
In general, P(X = x) = e^(−λ) λ^x / x!
If X represents the total number of heads obtained when a fair coin is tossed 5 times, find the probability distribution of X.
X:   0     1     2     3     4     5
P: 1/32  5/32 10/32 10/32  5/32  1/32
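The same table follows from P(X = x) = C(5, x)/2^5; a minimal Python check:

```python
from math import comb

# pmf of X = number of heads in 5 tosses of a fair coin.
for x in range(6):
    print(x, comb(5, x) / 2**5)   # 1/32, 5/32, 10/32, 10/32, 5/32, 1/32
```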
Continuous Random Variable
If X is an RV which can take all values (i.e., infinite
number of values ) in an interval, then X is called
a continuous RV.
X(x) = 0,  x < 0,
     = x,  0 ≤ x < 1,
     = 1,  x ≥ 1
Probability Density Function
If X is a continuous RV, then f is said to be the probability density function (pdf) of X, if it satisfies the following conditions:
(i) f(x) ≥ 0 ∀ x ∈ R_x, and (ii) ∫ (−∞ to ∞) f(x) dx = 1
Note that P(X = a) = P(a ≤ X ≤ a) = ∫ (a to a) f(x) dx = 0
A random variable X has the density function
f(x) = 1/4, −2 < x < 2; 0 elsewhere.
Obtain (i) P(X < 1), (ii) P(|X| > 1), (iii) P(2X + 3 > 5).
(Ans. 3/4, 1/2, 1/4)
Find the formula for the probability distribution
of the number of heads when a fair coin is
tossed 4 times.
Cumulative Distribution Function (cdf)
If X is an RV, discrete or continuous, then P(X ≤ x) is called the cumulative distribution function of X, or distribution function of X, and denoted as F(x).
If X is discrete, F(x) = Σ (j : x_j ≤ x) p_j
If X is continuous, F(x) = P(−∞ < X ≤ x) = ∫ (−∞ to x) f(t) dt
Probability Density Function
f_X(x) = 0 if x < 0; (1/α) e^(−x/α) if 0 ≤ x
[plot of the exponential pdf f_X(x) against x]
Cumulative Distribution Function
F_X(x) = 0 if x < 0; 1 − e^(−x/α) if 0 ≤ x
[plot of the exponential cdf F_X(x) against x]
Probability Density Function
f_X(x) = 0 if x < 0; 1/u if 0 ≤ x ≤ u; 0 if u < x
[plot of the uniform pdf f_X(x) against x]
Cumulative Distribution Function
F_X(x) = 0 if x < 0; x/u if 0 ≤ x ≤ u; 1 if u < x
[plot of the uniform cdf F_X(x) against x]
If the probability density of a random variable is given by
f(x) = K(1 − x²), 0 < x < 1; 0 otherwise,
find (i) K, (ii) the cumulative distribution function of the random variable.
(Ans. K = 3/2; F(x) = (3/2)[x − x³/3] for 0 < x < 1, 0 otherwise.)
4. If X is a continuous RV, then (d/dx) F(x) = f(x) at all points where F(x) is differentiable.
Definitions
If X is a discrete RV, then the expected value or the mean value of g(X) is defined as
E{g(X)} = Σ_i g(x_i) p_i, where p_i = P(X = x_i)
µ_X = E(X) = Σ_i x_i p_i, if X is discrete
= ∫ (over R_x) x f(x) dx, if X is continuous
Var(X) = σ_X² = E{(X − µ_X)²}
= Σ_i (x_i − µ_X)² p_i, if X is discrete
= ∫ (over R_x) (x − µ_X)² f(x) dx, if X is continuous
The square root of the variance is called the standard deviation.
Example
Find the expected value of the number on a die when thrown. (Ans. 7/2)
Var(X) = E{(X − µ_X)²} = E{X² − 2µ_X X + µ_X²}
= E(X²) − 2µ_X E(X) + µ_X²   (since µ_X is a constant)
= E(X²) − µ_X²   (since µ_X = E(X))
E{|X|^n} & E{|X − µ_X|^n} are called absolute moments of X.
E{(X − a)^n} & E{|X − a|^n} are called generalised moments of X.
Joint Cumulative Distribution Function
F(x, y) = Σ (j : y_j ≤ y) Σ (i : x_i ≤ x) p_ij   (discrete case)
= ∫ (−∞ to x) ∫ (−∞ to y) f(u, v) dv du   (continuous case)
• Properties of the joint cdf
F(x, y) = Pr(X ≤ x, Y ≤ y)
0 ≤ F(x, y) ≤ 1, −∞ < x < ∞, −∞ < y < ∞
F(−∞, y) = F(x, −∞) = F(−∞, −∞) = 0
F(∞, ∞) = 1
F(x, y) is a nondecreasing function as either x or y, or both, increase.
F(∞, y) = F_Y(y), F(x, ∞) = F_X(x)
f(x, y) = ∂²F(x, y)/∂x∂y
Examples
Tossing two coins:
X ... random variable associated with the first coin
Y ... random variable associated with the second coin
sample space {(0,0), (0,1), (1,0), (1,1)}, 0 for tail, 1 for head.
F(0,0) = 1/4
F(0,1) = 2/4
F(1,0) = 2/4
F(1,1) = 1
[staircase plot of F(x, y) over the (x, y) plane]
Example
Three balls are drawn at random without replacement from a box containing 2 white, 3 red and 4 black balls. If X denotes the number of white balls drawn and Y denotes the number of red balls drawn, find the joint probability distribution of (X, Y).
Solution
As there are only 2 white balls in the box, X can take the values 0, 1, 2 and Y can take the values 0, 1, 2, 3.
P(X = 0, Y = 0) = P(drawing 3 balls none of which is white or red) = P(all the 3 balls drawn are black)
= ⁴C₃/⁹C₃ = 1/21
P(X = 0, Y = 1) = 3/14, P(X = 0, Y = 2) = 1/7, ………
e.g. the row X = 2 of the joint probability table is
X \ Y:   0     1    2   3
2:     1/21  1/28   0   0
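A minimal Python sketch that fills in the whole table by the same counting argument (multivariate hypergeometric counts over C(9, 3) = 84 draws):

```python
from math import comb

# Joint pmf of (X, Y) = (# white, # red) in 3 draws without replacement
# from 2 white, 3 red, 4 black balls.
total = comb(9, 3)
for x in range(3):                       # at most 2 white
    for y in range(4):                   # at most 3 red
        b = 3 - x - y                    # black balls needed to total 3
        if 0 <= b <= 4:
            print(x, y, comb(2, x) * comb(3, y) * comb(4, b) / total)
```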
For the bivariate probability distribution of (X, Y) given below, find P(X ≤ 1), P(X ≤ 1, Y ≤ 3), P(X ≤ 1 / Y ≤ 3), P(Y ≤ 3 / X ≤ 1) & P(X + Y ≤ 4).
X \ Y:  1   2    3     4     5     6
0:      0   0  1/32  2/32  2/32  3/32
(ans. 7/8, 9/32, 18/32, 9/28, 13/32)
Example
The joint density function of (X, Y) is given by
f(x, y) = 2 e^(−x) e^(−2y), 0 < x < ∞, 0 < y < ∞; 0 otherwise.
Compute (i) P(X > 1, Y < 1), (ii) P(X < Y), (iii) P(X < a).
(Ans. e^(−1)(1 − e^(−2)), 1/3, 1 − e^(−a))
Marginal probability density function
f(x, y) = (6/5)(1 − x²y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; = 0 elsewhere.
Integrating this wrt y alone and wrt x alone gives the two marginal pdfs:
f_X(x) = (6/5)(1 − x²/2), 0 ≤ x ≤ 1
f_Y(y) = (6/5)(1 − y/3), 0 ≤ y ≤ 1
Independent Random Variables
f(x, y) = ∂²F(x, y)/∂x∂y = e^(−(x + y)) for x, y ≥ 0; 0 otherwise.
f₁(x) = e^(−x), x ≥ 0; 0 otherwise.   f₂(y) = e^(−y), y ≥ 0; 0 otherwise.
F₁(x) F₂(y) = (1 − e^(−x))(1 − e^(−y)) = F(x, y),
so X and Y are independent.
Example
Let X and Y be the lifetimes of two electronic devices. Suppose that their joint pdf is given by f(x, y) = e^(−(x + y)), x ≥ 0, y ≥ 0; then X and Y are independent.
Example
Suppose that f(x, y) = 8xy, 0 ≤ x ≤ y ≤ 1. Check the independence of X and Y.
Expectation of Sums and Products of random variables
E(X) + E(Y) = Σ (j,k) x_j p(x_j, y_k) + Σ (j,k) y_k p(x_j, y_k)
= Σ (j,k) (x_j + y_k) p(x_j, y_k)
= E(X + Y)
Example
What is the mathematical expectation of the sum of
points on n dice?
Ans. (7/2)n
A box contains 2^n tickets among which ⁿC_r tickets bear the number r (r = 0, 1, 2, …, n). A group of m tickets is drawn. Let S denote the sum of their numbers. Find E(S) and Var(S).
Ans. E(S) = (n/2)m
If X and Y are independent with marginal pmfs f and g,
E(XY) = Σ (j,k) x_j y_k p(x_j, y_k)
= Σ (j,k) x_j y_k f(x_j) g(y_k)
= Σ_j x_j f(x_j) Σ_k y_k g(y_k)
= E(X) E(Y)
Example
If the joint pdf of (X, Y) is given by f(x, y) = 24y(1 − x), 0 ≤ y ≤ x ≤ 1, find E(XY).
E(XY) = ∫ (y = 0 to 1) ∫ (x = y to 1) x y f(x, y) dx dy
[region of integration: the triangle 0 ≤ y ≤ x ≤ 1 bounded above by the line x = y]
Ans. 4/15
Binomial Distribution (re-visit)
E(X_k) = 0 · q + 1 · p = p
Since S_n = X₁ + X₂ + … + X_n, we have E(S_n) = np.
F(x | M) = Pr[X ≤ x | M] = Pr{X ≤ x, M} / Pr(M),   Pr(M) > 0
f(y | x) = f(x, y)/f_X(x)   or   f(x | y) = f(x, y)/f_Y(y)
- the continuous version of Bayes' theorem
f(y | x) = f(x | y) f_Y(y) / f_X(x)
- another expression of the marginal pdf
f_X(x) = ∫ (−∞ to ∞) f(x, y) dy = ∫ (−∞ to ∞) f(x | y) f_Y(y) dy
f_Y(y) = ∫ (−∞ to ∞) f(x, y) dx = ∫ (−∞ to ∞) f(y | x) f_X(x) dx
Suppose that p(x, y), the joint probability mass function of X and Y, is given by p(0,0) = .4, p(0,1) = .2, p(1,0) = .1, p(1,1) = .3. Calculate the conditional probability mass function of X given that Y = 1.
Ans. 2/5, 3/5
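A minimal Python sketch of the computation, dividing the joint pmf by the marginal P(Y = 1):

```python
# Conditional pmf of X given Y = 1 from the joint pmf p(x, y).
p = {(0, 0): .4, (0, 1): .2, (1, 0): .1, (1, 1): .3}
pY1 = sum(v for (x, y), v in p.items() if y == 1)   # P(Y = 1) = 0.5
for x in (0, 1):
    print(x, p[(x, 1)] / pY1)                       # 2/5, 3/5
```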
Example
Suppose that 15 percent of the families in a certain
community have no children, 20% have 1, 35% have
2, & 30% have 3 children; suppose further that each
child is equally likely (and independently) to be a
boy or a girl. If a family is chosen at random from this
community, then B, the number of boys, and G , the
number of girls, in this family will have the joint
probability mass function .
i \ j:    0      1      2      3
0:      .15    .10   .0875  .0375
1:      .10   .175   .1125    0
2:     .0875  .1125    0      0
3:     .0375    0      0      0
For example,
P(B = 1, G = 1) = P(BG or GB) = P(BG) + P(GB) = .35 × (1/2) × (1/2) × 2 = .175
P(B = 2, G = 1) = P(BBG or BGB or GBB) = P(BBG) + P(BGB) + P(GBB)
= .30 × (1/2) × (1/2) × (1/2) × 3 = .9/8 = .1125
If the family chosen has one girl, compute the
conditional probability mass function of the
number of boys in the family
Ans.8/31,14/31,9/31,0
Example
The joint density of X and Y is given by
f(x, y) = (12/5) x (2 − x − y), 0 < x < 1, 0 < y < 1; 0 otherwise.
Compute the conditional density of X, given that Y = y, where 0 < y < 1.
f(x | y) = f(x, y)/f_Y(y) = 6x(2 − x − y)/(4 − 3y)
Example
The joint pdf of a two-dimensional RV is given by
f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.
Compute P(X > 1), P(Y < 1/2), P(X > 1 / Y < 1/2), P(Y < 1/2 / X > 1), P(X < Y) & P(X + Y ≤ 1).
(Ans. 19/24, 1/4, 5/6, 5/19, 53/480, 13/480)
Variance of a Sum of random variables
If X and Y are random variables, then the variance of their sum is
Var(X + Y) = E{((X − µ_X) + (Y − µ_Y))²}
= E(X − µ_X)² + E(Y − µ_Y)² + 2E((X − µ_X)(Y − µ_Y))
= Var(X) + Var(Y) + 2(E(XY) − µ_X µ_Y)
Cov(X, Y) = E((X − µ_X)(Y − µ_Y)) = E(XY) − µ_X µ_Y
• If X and Y are mutually independent, then Cov(X, Y) = 0.
For the binomial sum S_n this gives Var(S_n) = np(1 − p).
Example
Compute Var(X) when X represents the outcome when we roll a fair die.
Solution
Since P(X = i) = 1/6, i = 1, 2, 3, 4, 5, 6, we obtain
E(X²) = Σ (i = 1 to 6) i² P[X = i]
= 1²(1/6) + 2²(1/6) + 3²(1/6) + 4²(1/6) + 5²(1/6) + 6²(1/6)
= 91/6
Var(X) = E(X²) − E(X)²
= 91/6 − (7/2)² = 35/12
Ans 175/6
|ρ_xy| ≤ 1 or |C_xy| ≤ σ_x σ_y
Example
Calculate the correlation coefficient for the following heights (in inches) of fathers (X) & their sons (Y):
X: 65 66 67 67 68 69 70 72
Y: 67 68 65 68 72 72 69 71
(Ans. 0.603)
ρ_xy = C_xy / (σ_x σ_y)
= [Σxy/n − (Σx/n)(Σy/n)] / √[(Σx²/n − (Σx/n)²)(Σy²/n − (Σy/n)²)]
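A minimal numpy sketch applying this formula to the heights data above:

```python
import numpy as np

# Heights of fathers (x) and their sons (y) from the example.
x = np.array([65, 66, 67, 67, 68, 69, 70, 72])
y = np.array([67, 68, 65, 68, 72, 72, 69, 71])
cxy = (x * y).mean() - x.mean() * y.mean()        # Σxy/n − (Σx/n)(Σy/n)
sx = np.sqrt((x**2).mean() - x.mean()**2)
sy = np.sqrt((y**2).mean() - y.mean()**2)
print(cxy / (sx * sy))                            # ≈ 0.603
```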
Example
Let the random variables X and Y have the joint pdf
f(x, y) = x + y, 0 < x < 1, 0 < y < 1; 0 elsewhere.
Find the correlation coefficient between X and Y.
(Ans. −1/11)
Example
If X,Y and Z are uncorrelated RVs with zero means
and standard deviations 5, 12 and 9 respectively
and if U=X+Y and V=Y+Z find the correlation
coefficient between U and V.
(ans 48/65)
Conditional Expected Values
For a discrete (X, Y) with joint pmf p_ij,
E{g(X, Y) / Y = y_j} = Σ_i g(x_i, y_j) P(X = x_i / Y = y_j)
= Σ_i g(x_i, y_j) P{X = x_i, Y = y_j}/P{Y = y_j} = Σ_i g(x_i, y_j) p_ij/p_*j
For a continuous (X, Y),
E{g(X, Y) / Y} = ∫ (−∞ to ∞) g(x, y) f(x / y) dx
E{g(X, Y) / X} = ∫ (−∞ to ∞) g(x, y) f(y / x) dy
Conditional means
µ_y/x = E(Y / X) = ∫ (−∞ to ∞) y f(y / x) dy
Example
The joint pdf of (X, Y) is given by f(x, y) = 24xy, x > 0, y > 0, x + y ≤ 1, and 0 elsewhere. Find the conditional mean and variance of Y, given X.
f(y / x) = 2y/(1 − x)², E(Y / X) = (2/3)(1 − x), Var = (1/18)(1 − x)²
Note also E(X²Y²) = E[X² E(Y² / X)].
Example
Three coins are tossed. Let X denote the number of heads on the first two coins, Y denote the number of tails on the last two, and Z denote the number of heads on the last two. Find
(a) the joint distribution of (i) X and Y, (ii) X and Z
(b) the conditional distribution of Y given X = 1
(c) the covariance of X, Y and of X, Z
(d) E(Z / X = 1)
(e) a joint distribution that is not the joint distribution of X and Z in (a), but has the same marginals as in (a)
RVs (X₁, X₂, ..., X_n) are independent if
f(x₁, x₂, ..., x_n) = f(x₁) × f(x₂) × ... × f(x_n)
Conditional densities:
f(x₁, x₂ / x₃) = f(x₁, x₂, x₃)/f(x₃)
f(x₁ / x₂, x₃) = f(x₁, x₂, x₃)/f(x₂, x₃)
Definition
Let X denote a random variable with probability density function f(x) if continuous (probability mass function p(x) if discrete). Then
M(t) = the moment generating function of X = E(e^{tX})
= ∫ (−∞ to ∞) e^{tx} f(x) dx if X is continuous
= Σ_x e^{tx} p(x) if X is discrete
Example
If f(x, y) = x e^(−x(y+1)), x > 0, y > 0; 0 otherwise, find the moment-generating function of XY.
Solution: M_XY(t) = E(e^{tXY}) = ∫ (0 to ∞) ∫ (0 to ∞) e^{xyt} x e^(−x(y+1)) dy dx
= ∫ (0 to ∞) x e^(−x) { ∫ (0 to ∞) e^{xyt} e^(−xy) dy } dx
= ∫ (0 to ∞) x e^(−x) { ∫ (0 to ∞) e^(−xy(1−t)) dy } dx = 1/(1 − t)
Properties
M_X(t) = Σ e^{tx} f(x) = Σ (1 + tx + (tx)²/2! + .....) f(x)
M_X(0) = 1
M′_X(t) = Σ (x + tx² + t²x³/2! + ...) f(x), so M′_X(t)|_{t=0} = Σ x f(x) = E(X) = µ₁
M″_X(t) = Σ (x² + tx³ + ...) f(x), so M″_X(t)|_{t=0} = Σ x² f(x) = µ₂
In general M_X^(k)(0) = k-th derivative of M_X(t) at t = 0, and
M_X(t) = 1 + µ₁ t + (µ₂/2!) t² + (µ₃/3!) t³ + … + (µ_k/k!) t^k + … ,
where µ_k = E(X^k) = ∫ x^k f(x) dx (X continuous), Σ x^k p(x) (X discrete).
Let X be a random variable with moment generating function M_X(t). Let Y = bX + a.
Then M_Y(t) = M_{bX+a}(t) = E(e^{(bX+a)t}) = e^{at} M_X(bt)
Given
M(t) = (1/10) e^t + (2/10) e^{2t} + (3/10) e^{3t} + (4/10) e^{4t},
find the p.m.f.
Since M(t) = Σ_x e^{tx} f(x) = f(a) e^{at} + f(b) e^{bt} + ..., matching coefficients gives
f(x) = x/10, x = 1, 2, 3, 4; 0, otherwise
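A minimal sympy sketch (library assumed available) checking that this M(t) indeed generates the moments of f(x) = x/10:

```python
import sympy as sp

t = sp.symbols('t')
# M(t) for the pmf f(x) = x/10, x = 1, 2, 3, 4 (from the example above).
M = sum(sp.Rational(x, 10) * sp.exp(x * t) for x in range(1, 5))
print(M.subs(t, 0))                    # M(0)  = 1
print(sp.diff(M, t).subs(t, 0))        # M'(0) = E(X)   = 3
print(sp.diff(M, t, 2).subs(t, 0))     # M''(0) = E(X^2) = 10
```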
If X has the p.d.f. f(x) = 1/x², 1 < x < ∞; 0, otherwise, find the mean of X.
The characteristic function of a random variable X is defined by
φ_X(ω) = E(e^{iωX})
φ_X(0) = E(1) = 1
|φ_X(ω)| = |E(e^{iωX})| = |∫ (−∞ to ∞) e^{iωx} f(x) dx|
≤ ∫ (−∞ to ∞) |e^{iωx}| f(x) dx = ∫ (−∞ to ∞) f(x) dx = 1
For a discrete X,
φ_X(ω) = E(e^{iωX}) = Σ_x e^{iωx} f(x)
= Σ_x [1 + iωx + (iωx)²/2! + (iωx)³/3! + ...] f(x)
= Σ_x f(x) + iω Σ_x x f(x) + ((iω)²/2!) Σ_x x² f(x) + .....
2. µ′_n = (1/iⁿ) dⁿφ(ω)/dωⁿ |_{ω=0}
3. If the characteristic function of a RV X is φ_x(ω) and if Y = aX + b, then φ_y(ω) = e^{ibω} φ_x(aω).
4. If X and Y are independent RVs, then φ_{x+y}(ω) = φ_x(ω) × φ_y(ω).
5. If the characteristic function of a continuous RV X with density function f(x) is φ(ω), then f(x) = (1/2π) ∫ (−∞ to ∞) φ(ω) e^{−ixω} dω.
6. If the density function of X is known, the density function of Y = g(X) can be found from the CF of Y, provided Y = g(X) is one-one.
The characteristic function of a random variable X is given by
φ_x(w) = 1 − |w|, |w| ≤ 1; 0, |w| > 1.
Find the pdf of X.
The pdf of X is
f(x) = (1/2π) ∫ (−∞ to ∞) φ_x(w) e^{−iwx} dw
= (1/2π) ∫ (−1 to 1) (1 − |w|) e^{−iwx} dw
= (1/2π) [ ∫ (−1 to 0) (1 + w) e^{−iwx} dw + ∫ (0 to 1) (1 − w) e^{−iwx} dw ]
= (1/(2πx²)) (2 − e^{ix} − e^{−ix}) = (1/(πx²)) (1 − cos x)
= (1/2π) [sin(x/2)/(x/2)]², −∞ < x < ∞
Show that the distribution for which the characteristic function is e^{−|ω|} has the density function
f(x) = (1/π) × 1/(1 + x²), −∞ < x < ∞.
f(x) = (1/2π) ∫ (−∞ to ∞) φ(ω) e^{−iωx} dω
If X₁ & X₂ are two independent RVs that follow Poisson distribution with parameters λ₁ & λ₂, prove that (X₁ + X₂) also follows a Poisson distribution with parameter (λ₁ + λ₂).
Reproductive property of Poisson distribution:
φ_{X₁}(ω) = e^{λ₁(e^{iω} − 1)}
φ_{X₂}(ω) = e^{λ₂(e^{iω} − 1)}
Since X₁ & X₂ are independent RVs,
φ_{X₁+X₂}(ω) = e^{(λ₁+λ₂)(e^{iω} − 1)}
Joint Characteristic Function
If (X, Y) is a two-dimensional RV, then E(e^{iω₁X + iω₂Y}) is called the joint characteristic function of (X, Y) and denoted by φ_xy(ω₁, ω₂).
φ_xy(ω₁, ω₂) = ∫ (−∞ to ∞) ∫ (−∞ to ∞) e^{iω₁x + iω₂y} f(x, y) dx dy
= Σ_i Σ_j e^{iω₁x_i + iω₂y_j} p(x_i, y_j)
(i) φ_xy(0, 0) = 1
(ii) E{X^m Y^n} = (1/i^{m+n}) ∂^{m+n} φ_xy(ω₁, ω₂)/∂ω₁^m ∂ω₂^n |_{ω₁=0, ω₂=0}
(iii) φ_x(ω) = φ_xy(ω, 0) & φ_y(ω) = φ_xy(0, ω)
(iv) If X and Y are independent, φ_xy(ω₁, ω₂) = φ_x(ω₁) × φ_y(ω₂), and conversely.
Compute the characteristic function of the discrete r.v.'s X and Y if the joint PMF is
P_XY = 1/3, x = y = 0; 1/6, x = ±1, y = 0; 1/6, x = y = ±1; 0, else.
φ_XY(w₁, w₂) = Σ (k = −1 to 1) Σ (l = −1 to 1) e^{i(w₁k + w₂l)} P_XY
= 1/3 + (1/3) cos w₁ + (1/3) cos(w₁ + w₂)
Example
Two RVs X and Y have the joint characteristic function φ_xy(ω₁, ω₂) = e^{−2ω₁² − 8ω₂²}. Show that X and Y are uncorrelated.
E(X) = (1/i) ∂/∂ω₁ e^{−2ω₁² − 8ω₂²} |_{ω₁=0, ω₂=0} = [e^{−2ω₁² − 8ω₂²} 4iω₁]_{ω₁=0, ω₂=0} = 0
E(Y) = [e^{−2ω₁² − 8ω₂²} 16iω₂]_{ω₁=0, ω₂=0} = 0
E(XY) = (1/i²) ∂²/∂ω₁∂ω₂ e^{−2ω₁² − 8ω₂²} |_{ω₁=0, ω₂=0}
= {−64 ω₁ω₂ e^{−2ω₁² − 8ω₂²}}_{ω₁=0, ω₂=0} = 0
C_xy = E(XY) − E(X) × E(Y) = 0
Compute the joint characteristic function of X and Y if
f_xy = (1/2π) exp{−(1/2)(x² + y²)}.
Ans.
φ_xy(ω₁, ω₂) = (1/2π) ∫ (−∞ to ∞) ∫ (−∞ to ∞) e^{−(1/2)(x² + y²)} e^{iω₁x + iω₂y} dx dy
= e^{−(ω₁² + ω₂²)/2}
Binomial Distribution
The Bernoulli probability mass function is the density function of a discrete variable X having 0 and 1 as the only possible values.
Proof
Getting exactly r successes means getting r successes and (n − r) failures simultaneously.
∴ P(getting r successes and n − r failures in a specified order) = p^r q^{n−r}
The trials from which the successes are obtained are not specified. There are nC_r ways of choosing r trials for successes. Once the r trials are chosen for successes, the remaining (n − r) trials should result in failures. These nC_r ways are mutually exclusive. In each of these nC_r ways, P(getting exactly r successes) = p^r q^{n−r}.
∴ P(X = r) = nC_r p^r q^{n−r}
= 64/81
Example.
(b) nC_k p^k (1 − p)^{n−k}
Assuming that p remains the same for all repetitions, if we consider n independent repetitions (or trials) of E and if the random variable X denotes the number of times the event A has occurred, then X is called a binomial random variable with parameters n and p.
Mean: µ₁′ = E(X) = Σ (x = 1 to n) x [n!/(x!(n − x)!)] p^x (1 − p)^{n−x}
= Σ (x = 1 to n) x [n(n − 1)!/(x(x − 1)![(n − 1) − (x − 1)]!)] p p^{x−1} (1 − p)^{n−x}
= np Σ (x = 1 to n) [(n − 1)!/((x − 1)![(n − 1) − (x − 1)]!)] p^{x−1} (1 − p)^{n−x}
= np Σ (x = 1 to n) (n−1)C_{x−1} p^{x−1} (1 − p)^{(n−1)−(x−1)}
= np (p + (1 − p))^{n−1}, since [p + (1 − p)]^n = Σ (x = 0 to n) nC_x p^x (1 − p)^{n−x}
= np
Second moment: µ₂′ = E(X²) = Σ (x = 0 to n) x² nC_x p^x (1 − p)^{n−x}
1 − (3/4)^16 = .9899
Poisson Distribution
The probability density function of the Poisson variate can be obtained as a limiting case of the binomial probability density function under the following assumptions:
(i) The number of trials is increased indefinitely (n → ∞).
(ii) The probability of success in a single trial is very small (p → 0).
(iii) np is a finite constant, say np = λ.
Consider the probability density function of a binomial random variable X:
P(X = x) = nC_x p^x (1 − p)^{n−x} = [n!/(x!(n − x)!)] p^x (1 − p)^{n−x}
= [n(n − 1)...(n − x + 1)/x!] (λ/n)^x (1 − λ/n)^{n−x}
= [n(n − 1)....(n − x + 1)/n^x] (λ^x/x!) (1 − λ/n)^n / (1 − λ/n)^x
= [(n/n)((n − 1)/n)....((n − x + 1)/n)] (λ^x/x!) (1 − λ/n)^n / (1 − λ/n)^x
For given x, as n → ∞, the terms 1 − 1/n, 1 − 2/n, ..., 1 − (x − 1)/n and (1 − λ/n)^x tend to 1.
Also, Lt (n → ∞) (1 − λ/n)^n = e^{−λ}.
Hence, Lt (n → ∞) P(X = x) = (λ^x/x!) e^{−λ}.
Poisson random variable
A random variable X taking on one of the values 0, 1, 2, ... is said to be a Poisson random variable with parameter λ if for some λ > 0,
p(x) = P(X = x) = e^{−λ} λ^x / x!, x = 0, 1, 2, ....
Σ (x = 0 to ∞) P(x) = Σ (x = 0 to ∞) e^{−λ} λ^x/x! = e^{−λ} Σ (x = 0 to ∞) λ^x/x! = e^{−λ} e^{λ} = 1
Mean = E(X) = Σ (x = 0 to ∞) x P(X = x) = Σ (x = 0 to ∞) x e^{−λ} λ^x/x! = Σ (x = 1 to ∞) e^{−λ} λ^x/(x − 1)!
= λ e^{−λ} Σ (x = 1 to ∞) λ^{x−1}/(x − 1)! = λ e^{−λ} [1 + λ/1! + λ²/2! + ...]
= λ e^{−λ} e^{λ} = λ
Var(X) = λ
Example
The average number of radioactive particles passing through a counter during 1 millisecond in a laboratory experiment is 4. What is the probability that 6 particles enter the counter in a given millisecond?
Ans. e^{−4} 4⁶/6!   (and for 4 particles, e^{−4} 4⁴/4!)
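A minimal Python sketch evaluating the Poisson pmf for these two cases:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for a Poisson RV with parameter lam."""
    return exp(-lam) * lam**x / factorial(x)

print(poisson_pmf(6, 4))   # P(6 particles in a millisecond) ≈ 0.1042
print(poisson_pmf(4, 4))   # P(4 particles in a millisecond) ≈ 0.1954
```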
Example
At a busy traffic intersection the probability p of an individual car having an accident is very small, say p = 0.0001. However, during certain peak hours of the day, say between 4 p.m. and 6 p.m., a large number of cars (say 1000) pass through the intersection. Under these conditions, what is the probability of two or more accidents occurring during that period?
In a component manufacturing industry, there is a small probability of 1/500 for any component to be defective. The components are supplied in packets of 10. Use the Poisson distribution to calculate the approximate number of packets containing (i) no defective, (ii) one defective component in a consignment of 10,000 packets.
(Ans. .9802 × 10000, .0196 × 10000)
Binomial Distribution
P(X = r) = nC_r p^r q^{n−r}; r = 0, 1, 2, ..., n
If we assume that n trials constitute a set and if we consider N sets, the frequency function of the binomial distribution is given by
f(r) = N p(r) = N nC_r p^r q^{n−r}
Example
Fit a binomial distribution for the following data and hence find the theoretical frequencies:
x: 0  1  2  3  4
f: 5 29 36 25  5
Ans. 7, 26, 37, 34, 6
Similarly, for the data
y: 6 20 28 12 8 6 0 0 0 0 0
the theoretical frequencies are 6.89, 19.14, 23.94, 17.74, 8.63, 2.88, 0.67, 0.1, 0.01, 0, 0.
Fit a Poisson distribution for the following distribution:
x: 0 1 2 3 4 5
f: 142 156 69 27 5 1
Geometric Distribution
P(X = n) = (1 − p)^{n−1} p, n = 1, 2, 3, .... Write t = 1 − p. Then
E(X) = Σ (n = 1 to ∞) n(1 − p)^{n−1} p = p[1 + 2t + 3t² + 4t³ + ....]   (since |t| < 1)
= p[1 − t]^{−2} = p[1 − (1 − p)]^{−2} = p/p² = 1/p
E(X²) = Σ (n = 1 to ∞) n²(1 − p)^{n−1} p = Σ (n = 1 to ∞) n² t^{n−1} p
= Σ (n = 1 to ∞) [n(n − 1) + n] t^{n−1} p
= Σ (n = 1 to ∞) n(n − 1) t^{n−1} p + Σ (n = 1 to ∞) n t^{n−1} p
= pt Σ (n = 2 to ∞) n(n − 1) t^{n−2} + 1/p
= pt[2 + 3·2t + 4·3t² + 5·4t³ + ....] + 1/p
= 2pt[1 + 3t + 6t² + ...] + 1/p = 2pt[1 − t]^{−3} + 1/p
= 2p(1 − p)/p³ + 1/p = (2 − p)/p²
Var(X) = (1 − p)/p²
Negative Binomial Distribution
Trials are repeated until a fixed number of successes occur. If Y is the number of failures preceding the r-th success,
P(Y = y) = (y+r−1)C(r−1) p^r q^y
Example
Ans. ¹¹C₁ (1/6)² (5/6)^10 ; ⁴C₁ (1/4)² (3/4)³
Relationship between the binomial and negative binomial
Let X have a binomial distribution with parameters n, p. Let Y have a negative binomial distribution with parameters r and p (that is, Y = number of trials required to obtain r successes with probability of success p). Then
(i) P(Y ≤ n) = P(X ≥ r)
(ii) P(Y > n) = P(X < r)
The probability that an experiment will succeed is 0.8.
If the experiment is repeated until four successful
outcomes have occurred, what is the expected number
of repetitions required?
Ans. 1
Σ (y = 0 to ∞) g(y) = Σ (y = 0 to ∞) (y+r−1)C(r−1) p^r q^y
= Σ (y = 0 to ∞) [(y + r − 1)!/((r − 1)! y!)] p^r q^y
= p^r Σ (y = 0 to ∞) (y+r−1)C_y q^y
= p^r [ (r − 1)!/(r − 1)! + (r!/(r − 1)!) q + ((r + 1)!/((r − 1)! 2!)) q² + ... ]
= p^r [ 1 + rq + (r(r + 1)/2!) q² + ... ]
= p^r [1 − q]^{−r} = p^r p^{−r} = 1.
Mean = E(Y) = rq/p, Variance = rq/p²
PROBABILITY DISTRIBUTIONS
Continuous Distributions
Uniform Distribution
A continuous RV X is uniformly distributed over (a, b) if its pdf is
f(x) = 1/(b − a), a < x < b; 0 elsewhere.
Mean = E(X) = ∫ (a to b) x f(x) dx = ∫ (a to b) x/(b − a) dx = (b + a)/2
If X has uniform distribution in (−3, 3), find P(|X − 2| < 2).
Ans. 1/2
If X has uniform distribution in (−a, a), a > 0, find a such that P(|X| < 1) = P(|X| > 1).
Ans. a = 2
Buses arrive at a specified stop at 15-min intervals starting at 7 A.M., that is, they arrive at 7, 7:15, 7:30 and so on. If a passenger arrives at the stop at a random time that is uniformly distributed between 7 and 7:30 A.M., find the probability that he waits (a) less than 5 min, (b) at least 12 min for a bus.
Ans. 1/3, 1/5
Example: X is uniformly distributed over (3, 7) (say, the number of days a job takes).
P(3 ≤ x ≤ 4) = (4 − 3)(1/(7 − 3)) = .25
What is the probability that it will take more than 6.5 days?
P(6.5 ≤ x ≤ 7) = (7 − 6.5)(1/(7 − 3)) = .125
Variance = E(X²) − E(X)²
E(X) = (a + b)/2
E(X²) = ∫ (a to b) x² (1/(b − a)) dx = [x³/3] (a to b) /(b − a) = (b³ − a³)/(3(b − a)) = (b² + ab + a²)/3
Var = (a² + ab + b²)/3 − (a + b)²/4
= (a² + b² − 2ab)/12 = (b − a)²/12
Moments: E(Xⁿ) = (b^{n+1} − a^{n+1}) / ((n + 1)(b − a))
Central moments: µ_r = E{(X − E(X))^r}
= (1/(b − a)) ∫ (a to b) (x − (a + b)/2)^r dx
= [((b − a)/2)^{r+1} − ((a − b)/2)^{r+1}] / ((b − a)(r + 1))
= 0 if r is odd; (1/(r + 1)) ((b − a)/2)^r if r is even.
µ_{2n−1} = 0, µ_{2n} = (1/(2n + 1)) ((b − a)/2)^{2n} for n = 1, 2, 3, ...
2. Γ(1) = ∫ (0 to ∞) e^{−t} dt = 1
3. For a positive integer n,
Γ(n) = (n − 1)Γ(n − 1) = (n − 1)(n − 2)Γ(n − 2) = ......
= (n − 1)(n − 2)....2.1 = (n − 1)!
Γ(1) = 0! = 1
Exponential Distribution
If the occurrences of events over nonoverlapping intervals are independent, such as arrival times of telephone calls or bus arrival times at a bus stop, then the waiting-time distribution of these events can be shown to be exponential.
Moments: E(X^r) = (1/λ^r) ∫ (0 to ∞) y^r e^{−y} dy = Γ(r + 1)/λ^r = r!/λ^r
Example
Let X have an exponential distribution with mean of 100. Find the probability that X < 90.
Ans. 0.593
Customers arrive in a certain shop according to an approximate Poisson process at a mean rate of 20 per hour. What is the probability that the shopkeeper will have to wait for more than 5 minutes for his first customer to arrive?
Ans. e^{−5/3}
Memoryless Property of the Exponential Distribution
P(X > k) = ∫ (k to ∞) λ e^{−λx} dx = [−e^{−λx}] (k to ∞) = e^{−λk}
Now P(X > s + t / X > s) = P{X > s + t & X > s} / P{X > s}
= P{X > s + t} / P{X > s} = e^{−λ(s+t)} / e^{−λs}
= e^{−λt} = P(X > t).
If X represents the lifetime of an equipment, then the above property states that if the equipment has been working for time s, then the probability that it will survive an additional time t depends only on t (not on s) and is identical to the probability of survival for time t of a new piece of equipment.
Suppose that the amount of waiting time a customer spends at a restaurant has an exponential distribution with a mean value of 5 minutes. Find the probability that a customer will spend more than 10 minutes in the restaurant.
Ans. 0.1353
Example
Suppose that the length of a phone call in minutes is an exponential random variable with parameter λ = 1/10. If A arrives immediately ahead of B at a public telephone booth, find the probability that B will have to wait (i) more than 10 minutes, and (ii) between 10 and 20 minutes.
(Ans. 0.368, 0.233)
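A minimal Python sketch of both exponential tail probabilities:

```python
from math import exp

lam = 1 / 10                             # rate parameter from the example
# (i) P(X > 10) = e^(-lam*10); (ii) P(10 < X < 20) = e^(-1) - e^(-2).
print(exp(-lam * 10))                    # ≈ 0.368
print(exp(-lam * 10) - exp(-lam * 20))   # ≈ 0.233
```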
Erlang distribution or General Gamma distribution
The Erlang pdf with parameters λ > 0 and k > 0 is
f(x) = λ^k x^{k−1} e^{−λx} / Γ(k), x ≥ 0, and ∫ (0 to ∞) f(x) dx = 1.
When λ = 1, the pdf reduces to
f(x) = x^{k−1} e^{−x} / Γ(k), x ≥ 0; k > 0,
the Gamma distribution or simple gamma distribution with parameter k.
P(X ≥ 3) = 2 ∫ (3 to ∞) e^{−2x} dx = 2 [e^{−2x}/(−2)] (3 to ∞) = e^{−6}
Suppose that an average of 30 customers per hour arrive at a shop in accordance with a Poisson process; that is, if a minute is our unit, then λ = 1/2. What is the probability that the shopkeeper waits more than 5 minutes before both of the first two customers arrive?
Solution.
If X denotes the waiting time in minutes until the second customer arrives, then X has an Erlang (Gamma) distribution with k = 2, λ = 1/2.
P(X > 5) = ∫ (5 to ∞) λ(λx)^{k−1} e^{−λx} / Γ(k) dx
= 0.287
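For integer k the Erlang tail has the closed form P(X > t) = e^{−λt} Σ (j = 0 to k−1) (λt)^j/j!; a minimal Python check:

```python
from math import exp, factorial

def erlang_sf(t, k, lam):
    """P(X > t) for an Erlang(k, lam) RV, k a positive integer."""
    return exp(-lam * t) * sum((lam * t)**j / factorial(j) for j in range(k))

print(erlang_sf(5, 2, 0.5))   # ≈ 0.287, matching the worked example
```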
Mean and Variance of Erlang Distribution
Moments: µ′_r = E(X^r) = ∫ (0 to ∞) (λ^k/Γ(k)) x^{k+r−1} e^{−λx} dx
= (1/Γ(k)) (1/λ^r) ∫ (0 to ∞) t^{k+r−1} e^{−t} dt = (1/λ^r) Γ(k + r)/Γ(k)
Mean = E(X) = k/λ, Var(X) = k/λ²
m.g.f.: M_X(t) = E(e^{tX})
= ∫ (0 to ∞) (λ^k/Γ(k)) x^{k−1} e^{−λx} e^{tx} dx = (λ^k/Γ(k)) ∫ (0 to ∞) x^{k−1} e^{−(λ−t)x} dx
= (λ^k/Γ(k)) (1/(λ − t)^k) ∫ (0 to ∞) y^{k−1} e^{−y} dy = λ^k/(λ − t)^k
= (1 − t/λ)^{−k}
Reproductive Property
M_{X₁+X₂+...+Xₙ}(t) = M_{X₁}(t) M_{X₂}(t) ..... M_{Xₙ}(t)
= (1 − t/λ)^{−k₁} (1 − t/λ)^{−k₂} ..... (1 − t/λ)^{−kₙ}
= (1 − t/λ)^{−(k₁+k₂+...+kₙ)}
= M_{X₁+X₂+...+Xₙ}(t)
The sum of a finite number of Erlang variables (with the same λ) is also an Erlang variable.
Weibull Distribution
f(x) = αβ x^{β−1} e^{−αx^β}, x > 0, with parameters α, β > 0.
When β = 1, the Weibull distribution reduces to the exponential distribution with parameter α.
µ′_r = E(X^r) = αβ ∫ (0 to ∞) x^{r+β−1} e^{−αx^β} dx
= ∫ (0 to ∞) (y/α)^{r/β} e^{−y} dy   (putting y = αx^β)
= α^{−r/β} Γ(r/β + 1)
Mean = E(X) = µ′₁ = α^{−1/β} Γ(1/β + 1)
Var(X) = α^{−2/β} [Γ(2/β + 1) − Γ²(1/β + 1)]
Each of the 6 tubes of a radio set has a life length (in years) which may be considered as a RV that follows a Weibull distribution with parameters α = 25 and β = 2. If these tubes function independently of one another, what is the probability that no tube will have to be replaced during the first 2 months of service?
P(a tube lasts beyond 2 months) = P(X > 1/6) = ∫ (1/6 to ∞) 50x e^{−25x²} dx = [−e^{−25x²}] (1/6 to ∞) = e^{−25/36}
P(no tube is to be replaced during the first 2 months)
= (e^{−25/36})⁶ = 0.0155
Properties of a
Normal Distribution
• Continuous Random Variable
• Symmetrical in shape (Bell shaped)
• The probability of any given range of
numbers is represented by the area under
the curve for that range.
• Probabilities for all normal distributions are
determined using the Standard Normal
Distribution.
Probability for a
Continuous Random Variable
Probability Density Function for
Normal Distribution
f(x) = (1/(σ√(2π))) e^{−(1/2)((x − µ)/σ)²}
N(µ, σ)
For example, N(−7, 4):
f(x) = (1/√(32π)) e^{−(1/2)((x + 7)/4)²}
∫ (−∞ to ∞) f(x) dx = 1
Standard Normal Distribution
N(0, 1):
φ(z) = (1/√(2π)) e^{−z²/2}, −∞ < z < ∞,
obtained by putting µ = 0, σ = 1 & by changing x and f respectively into z and φ.
If X has distribution N(µ, σ) and if Z = (X − µ)/σ, then Z has distribution N(0, 1).
Values of φ(z) and ∫ (0 to z) φ(t) dt are tabulated.
X ~ N(µ, σ):
E(X) = ∫ (−∞ to ∞) x f(x) dx
= (1/(σ√(2π))) ∫ (−∞ to ∞) x e^{−(x − µ)²/2σ²} dx
= (1/√π) ∫ (−∞ to ∞) (µ + √2 σt) e^{−t²} dt   (putting t = (x − µ)/(σ√2))
= (µ/√π) ∫ (−∞ to ∞) e^{−t²} dt + (√2 σ/√π) ∫ (−∞ to ∞) t e^{−t²} dt
= (µ/√π) √π = µ.
z 0.00 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09
0.0 0.0000 0.0040 0.0080 0.0120 0.0160 0.0199 0.0239 0.0279 0.0319 0.0359
0.1 0.0398 0.0438 0.0478 0.0517 0.0557 0.0596 0.0636 0.0675 0.0714 0.0753
0.2 0.0793 0.0832 0.0871 0.0910 0.0948 0.0987 0.1026 0.1064 0.1103 0.1141
0.3 0.1179 0.1217 0.1255 0.1293 0.1331 0.1368 0.1406 0.1443 0.1480 0.1517
0.4 0.1554 0.1591 0.1628 0.1664 0.1700 0.1736 0.1772 0.1808 0.1844 0.1879
0.5 0.1915 0.1950 0.1985 0.2019 0.2054 0.2088 0.2123 0.2157 0.2190 0.2224
0.6 0.2257 0.2291 0.2324 0.2357 0.2389 0.2422 0.2454 0.2486 0.2517 0.2549
0.7 0.2580 0.2611 0.2642 0.2673 0.2704 0.2734 0.2764 0.2794 0.2823 0.2852
0.8 0.2881 0.2910 0.2939 0.2969 0.2995 0.3023 0.3051 0.3078 0.3106 0.3133
0.9 0.3159 0.3186 0.3212 0.3238 0.3264 0.3289 0.3315 0.3340 0.3365 0.3389
1.0 0.3413 0.3438 0.3461 0.3485 0.3508 0.3531 0.3554 0.3577 0.3599 0.3621
1.1 0.3643 0.3665 0.3686 0.3708 0.3729 0.3749 0.3770 0.3790 0.3810 0.3830
1.2 0.3849 0.3869 0.3888 0.3907 0.3925 0.3944 0.3962 0.3980 0.3997 0.4015
1.3 0.4032 0.4049 0.4066 0.4082 0.4099 0.4115 0.4131 0.4147 0.4162 0.4177
1.4 0.4192 0.4207 0.4222 0.4236 0.4251 0.4265 0.4279 0.4292 0.4306 0.4319
1.5 0.4332 0.4345 0.4357 0.4370 0.4382 0.4394 0.4406 0.4418 0.4429 0.4441
1.6 0.4452 0.4463 0.4474 0.4484 0.4495 0.4505 0.4515 0.4525 0.4535 0.4545
1.7 0.4554 0.4564 0.4573 0.4582 0.4591 0.4599 0.4608 0.4616 0.4625 0.4633
1.8 0.4641 0.4649 0.4656 0.4664 0.4671 0.4678 0.4686 0.4693 0.4699 0.4706
1.9 0.4713 0.4719 0.4726 0.4732 0.4738 0.4744 0.4750 0.4756 0.4761 0.4767
2.0 0.4772 0.4778 0.4783 0.4788 0.4793 0.4798 0.4803 0.4808 0.4812 0.4817
2.1 0.4821 0.4826 0.4830 0.4834 0.4838 0.4842 0.4846 0.4850 0.4854 0.4857
2.2 0.4861 0.4864 0.4868 0.4871 0.4875 0.4878 0.4881 0.4884 0.4887 0.4890
2.3 0.4893 0.4896 0.4898 0.4901 0.4904 0.4906 0.4909 0.4911 0.4913 0.4916
2.4 0.4918 0.4920 0.4922 0.4925 0.4927 0.4929 0.4931 0.4932 0.4934 0.4936
2.5 0.4938 0.4940 0.4941 0.4943 0.4945 0.4946 0.4948 0.4949 0.4951 0.4952
2.6 0.4953 0.4955 0.4956 0.4957 0.4959 0.4960 0.4961 0.4962 0.4963 0.4964
2.7 0.4965 0.4966 0.4967 0.4968 0.4969 0.4970 0.4971 0.4972 0.4973 0.4974
2.8 0.4974 0.4975 0.4976 0.4977 0.4977 0.4978 0.4979 0.4979 0.4980 0.4981
2.9 0.4981 0.4982 0.4982 0.4983 0.4984 0.4984 0.4985 0.4985 0.4986 0.4986
3.0 0.4987 0.4987 0.4987 0.4988 0.4988 0.4989 0.4989 0.4989 0.4990 0.4990
3.1 0.4990 0.4991 0.4991 0.4991 0.4992 0.4992 0.4992 0.4992 0.4993 0.4993
3.2 0.4993 0.4993 0.4994 0.4994 0.4994 0.4994 0.4994 0.4995 0.4995 0.4995
3.3 0.4995 0.4995 0.4995 0.4996 0.4996 0.4996 0.4996 0.4996 0.4996 0.4997
3.4 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4998
Determining the probability of any Normal Random
Variable
Interpreting Z
• In figure Z = -0.8 means that the value 360
is .8 standard deviations below the mean.
• A positive value of Z designates how many
standard deviations (σ ) X is to the right of
the mean (µ ).
• A negative value of Z designates how many standard deviations (σ) X is to the left of the mean (µ).
Example: A group of achievement scores are normally distributed with a mean of 76 and a standard deviation of 4. If one score is randomly selected, what is the probability that it is at least 80?
Z = (x − µ)/σ = (80 − 76)/4 = 1
P(x ≥ 80) = P(z ≥ 1) = .5 − P(0 ≤ z ≤ 1) = .5 − .3413 = .1587
[sketch: normal curve with µ = 76, σ = 4; area .1587 to the right of z = 1, area .3413 between 0 and 1]
Continuing, what is the probability that it is less than 70?
Z = (x − µ)/σ = (70 − 76)/4 = −1.5
P(x ≤ 70) = P(z ≤ −1.5) = .5 − P(−1.5 ≤ z ≤ 0.0) = .5 − .4332 = .0668
What proportion of the scores occur within 70 and 85?
Z = (70 − 76)/4 = −1.5; Z = (85 − 76)/4 = 2.25
P(70 ≤ x ≤ 85) = P(−1.5 ≤ z ≤ 2.25)
= P(−1.5 ≤ z ≤ 0.0) + P(0.0 ≤ z ≤ 2.25)
= .4332 + .4878 = .9210
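A minimal Python sketch reproducing these three probabilities via the error function (rather than table lookup):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 76, 4
print(1 - Phi((80 - mu) / sigma))                       # P(X >= 80)    ≈ 0.1587
print(Phi((70 - mu) / sigma))                           # P(X <= 70)    ≈ 0.0668
print(Phi((85 - mu) / sigma) - Phi((70 - mu) / sigma))  # P(70<=X<=85)  ≈ 0.9210
```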
Time required to finish an exam is known to be normally distributed with a mean of 60 min and a std. dev. of 12 minutes. How much time should be allowed in order for 90% of the students to finish?
z = (x − µ)/σ, so x = zσ + µ = 1.28(12) + 60 = 75.36
An automated machine that fills sugar sacks has an adjusting device to change the mean fill per sack. It is now being operated at a setting that results in a mean fill of 81.5 oz. If only 1% of the sacks filled at this setting contain less than 80.0 oz, what is the value of the variance for this population of fill weights? (Assume normality.)
µ = 81.5, x = 80.0, prob = .01, so z = −2.33
Z = (x − µ)/σ: −2.33 = (80.0 − 81.5)/σ
σ = (80.0 − 81.5)/(−2.33) = .6437
σ² = .4144
Moment generating function of N(0,1)
M_Z(t) = E(e^{tZ}) = ∫ (−∞ to ∞) e^{tz} φ(z) dz
= (1/√(2π)) ∫ (−∞ to ∞) e^{tz − z²/2} dz = (1/√(2π)) ∫ (−∞ to ∞) e^{−(z² − 2tz)/2} dz
= (1/√(2π)) ∫ (−∞ to ∞) e^{−(z − t)²/2 + t²/2} dz = e^{t²/2} (1/√(2π)) ∫ (−∞ to ∞) e^{−(z − t)²/2} dz
= e^{t²/2}
The moment generating function of N(µ, σ):
M_X(t) = M_{σZ+µ}(t) = E(e^{t(σZ+µ)}) = e^{µt} E(e^{tσZ}) = e^{µt} M_Z(σt)
= e^{µt} e^{σ²t²/2} = e^{t(µ + σ²t/2)}
= 1 + (t/1!)(µ + σ²t/2) + (t²/2!)(µ + σ²t/2)² + ..... 
If X has the distribution N(µ, σ), then Y = aX + b has the distribution N(aµ + b, aσ):
M_X(t) = e^{t(µ + σ²t/2)}
M_Y(t) = M_{aX+b}(t) = e^{bt} M_X(at)
= e^{bt} e^{t(aµ + a²σ²t/2)}
= e^{t((aµ + b) + a²σ²t/2)}
Normal approximation to the binomial: X ~ B(n, p).
The standard binomial variable Z is given by Z = (X − np)/√(npq).
As X varies from 0 to n with step size 1, Z varies from −np/√(npq) to nq/√(npq) with step size 1/√(npq).
F(z) = ∫ (−∞ to z) (1/√(2π)) e^{−t²/2} dt, −∞ < z < ∞
Let X be the number of times that a fair coin, flipped 40 times, lands heads. Find P(X = 20). Use the normal approximation and compare it to the exact solution.
Here µ = np = 20 and σ = √(npq) = √10.
P(X = 20) ≈ P(19.5 < X < 20.5)
= P((19.5 − 20)/√10 < (X − 20)/√10 < (20.5 − 20)/√10)
= φ(.16) − φ(−.16) = .1272
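A minimal Python sketch comparing the exact binomial value with the continuity-corrected approximation (the slide's .1272 uses z rounded to .16):

```python
from math import comb, erf, sqrt

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 40, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))        # 20 and sqrt(10)
exact = comb(40, 20) * 0.5**40                  # exact binomial P(X = 20)
approx = Phi((20.5 - mu) / sigma) - Phi((19.5 - mu) / sigma)  # continuity correction
print(exact, approx)                            # ≈ 0.1254 vs ≈ 0.1256
```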
If 20% of the memory chips made in a certain plant are defective, what are the probabilities that in a lot of 100 randomly chosen for inspection
(a) at most 15 will be defective?
(b) exactly 15 will be defective?
µ = 100(.20) = 20, σ = √(100 × .2 × .8) = 4
(a) F((15.5 − 20)/4) = 0.1292
(b) F((15.5 − 20)/4) − F((14.5 − 20)/4) = 0.0454
Fit a normal distribution to the following distribution
and hence find the theoretical frequencies:
Class Freq
60-65 3
65-70 21
70-75 150
75-80 335
80-85 336
85-90 135
90-95 26
95-100 4
-------------
1000
The special case of Weibull f(x) = αβ x^{β−1} e^{−αx^β} with α = 1/(2σ²) & β = 2 is the Rayleigh distribution.
Chi-square distribution: a χ² RV with r degrees of freedom is a Gamma (Erlang) RV with k = r/2 and λ = 1/2 (its pdf is 0 for x < 0).
E(X) = k/λ = (r/2)/(1/2) = r
Var(X) = k/λ² = (r/2)/(1/4) = 2r
The mean equals the number of degrees of freedom and the variance equals twice the number of degrees of freedom.
M_X(t) = (1 − 2t)^{−r/2}, t < 1/2
df \p .005 .01 .025 .05 .10 .90 .95 .975 .99 .995
1 .00004 .00016 .00098 .0039 .0158 2.71 3.84 5.02 6.63 7.88
2 .0100 .0201 .0506 .1026 .2107 4.61 5.99 7.38 9.21 10.60
3 .0717 .115 .216 .352 .584 6.25 7.81 9.35 11.34 12.84
4 .207 .297 .484 .711 1.064 7.78 9.49 11.14 13.28 14.86
5 .412 .554 .831 1.15 1.61 9.24 11.07 12.83 15.09 16.75
6 .676 .872 1.24 1.64 2.20 10.64 12.59 14.45 16.81 18.55
7 .989 1.24 1.69 2.17 2.83 12.02 14.07 16.01 18.48 20.28
8 1.34 1.65 2.18 2.73 3.49 13.36 15.51 17.53 20.09 21.96
9 1.73 2.09 2.70 3.33 4.17 14.68 16.92 19.02 21.67 23.59
10 2.16 2.56 3.25 3.94 4.87 15.99 18.31 20.48 23.21 25.19
11 2.60 3.05 3.82 4.57 5.58 17.28 19.68 21.92 24.73 26.76
12 3.07 3.57 4.40 5.23 6.30 18.55 21.03 23.34 26.22 28.30
df \p .005 .01 .025 .05 .10 .90 .95 .975 .99 .995
14 4.07 4.66 5.63 6.57 7.79 21.06 23.68 26.12 29.14 31.32
15 4.6 5.23 6.26 7.26 8.55 22.31 25 27.49 30.58 32.80
16 5.14 5.81 6.91 7.96 9.31 23.54 26.30 28.85 32.00 34.27
18 6.26 7.01 8.23 9.39 10.86 25.99 28.87 31.53 34.81 37.16
20 7.43 8.26 9.59 10.85 12.44 28.41 31.41 34.17 37.57 40.00
24 9.89 10.86 12.40 13.85 15.66 33.20 36.42 39.36 42.98 45.56
30 13.79 14.95 16.79 18.49 20.60 40.26 43.77 46.98 50.89 53.67
40 20.71 22.16 24.43 26.51 29.05 51.81 55.76 59.34 63.69 66.77
60 35.53 37.48 40.48 43.19 46.46 74.40 79.08 83.30 88.38 91.95
120 83.85 86.92 91.58 95.70 100.62 140.23 146.57 152.21 158.95 163.64
Let X be χ²(10). Find P(3.25 ≤ X ≤ 20.5).
Ans. 0.95
(For χ²(5), the corresponding limits at p = .025 and p = .975 are 0.831 and 12.8.)
Beta Distribution
The random variable X is said to have a beta distribution with nonnegative parameters α & β if
f(x) = (1/B(α, β)) x^{α−1} (1 − x)^{β−1}, 0 < x < 1; 0 otherwise,
where B(α, β) = ∫ (0 to 1) x^{α−1} (1 − x)^{β−1} dx
= 2 ∫ (0 to π/2) (sin θ)^{2α−1} (cos θ)^{2β−1} dθ = Γ(α)Γ(β)/Γ(α + β).
When α = β = 1, the beta distribution is the uniform distribution on (0, 1).
Mean = α/(α + β), Var = αβ/((α + β)²(α + β + 1))
In a certain country, the proportion of highway
requiring repairs in any given year is a random
variable having the beta distribution with α = 3 and
β = 2. Find
(a) on the average what percentage of the highway
sections require repairs in any given year :
(b) the probability that at most half of the highway
sections will require repairs in any given year.
(a) µ = 3/(3 + 2) = 0.60, i.e. 60% of the highway sections require repairs in any given year.
(b) P(X ≤ 1/2) = ∫ (0 to 1/2) 12x²(1 − x) dx = 5/16
The lognormal distribution
If X = log T follows a normal distribution N(µ, σ), then T follows a lognormal distribution whose pdf is given by
f(t) = (1/(st√(2π))) exp{−(1/(2s²)) (log(t/t_m))²}, t ≥ 0.
(Ans. 5101 hours, 1030 hours)
Example: with t_m = 5000 and s = 0.2, the reliability at 3000 hours is
R(3000) = ∫ (3000 to ∞) f(t) dt = ∫ (z₀ to ∞) φ(z) dz, where z₀ = (1/0.2) log(3000/5000) = −2.55,
= ∫ (−2.55 to ∞) φ(z) dz = 0.9946
(3598.2)