Lecture 11 Slides
Definition: The function f(x, y) is a joint density function of the continuous random variables X and Y if
1. $f(x, y) \ge 0$, for all (x, y),
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$.
The marginal distributions of X alone and of Y alone are
$g(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$ and $h(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$.
Statistical Independence
Definition: Let X and Y be two random variables, discrete or continuous, with joint probability
distribution f(x, y) and marginal distributions g(x) and h(y), respectively.
The random variables X and Y are said to be statistically independent if and only if
f(x, y) = g(x)h(y)
for all (x, y) within their range.
Example 12.2: A privately owned business operates both a drive-in facility and a walk-in
facility. On a randomly selected day, let X and Y, respectively, be the proportions
of the time that the drive-in and the walk-in facilities are in use, and suppose that
the joint density function of these random variables is
$f(x, y) = \begin{cases} \frac{2}{5}(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases}$
(a) Verify that f(x, y) is a joint density function.
(b) Find $P[(X, Y) \in A]$, where $A = \{(x, y) \mid 0 < x < \frac{1}{2},\ \frac{1}{4} < y < \frac{1}{2}\}$.
(c) Find the marginal distributions g(x) of X alone and h(y) of Y alone.
Solution: (a) The integral of f(x, y) over the unit square must equal 1:
$\int_0^1 \int_0^1 \frac{2}{5}(2x + 3y)\,dx\,dy = \int_0^1 \left( \frac{2x^2}{5} + \frac{6xy}{5} \right) \Big|_{x=0}^{x=1}\,dy = \int_0^1 \left( \frac{2}{5} + \frac{6y}{5} \right) dy = \frac{2}{5} + \frac{3}{5} = 1.$
(b) $P[(X, Y) \in A] = \int_{1/4}^{1/2} \int_0^{1/2} \frac{2}{5}(2x + 3y)\,dx\,dy = \int_{1/4}^{1/2} \left( \frac{2x^2}{5} + \frac{6xy}{5} \right) \Big|_{x=0}^{x=1/2}\,dy$
$= \int_{1/4}^{1/2} \left( \frac{1}{10} + \frac{3y}{5} \right) dy = \left( \frac{y}{10} + \frac{3y^2}{10} \right) \Big|_{1/4}^{1/2} = \frac{13}{160}.$
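Both results can be checked numerically; here is a minimal sketch using scipy.integrate.dblquad (an added check, not part of the original slides), confirming that the density integrates to 1 and that the probability in (b) is 13/160 = 0.08125.

from scipy.integrate import dblquad

# joint density of Example 12.2; dblquad expects the inner variable (y) first
f = lambda y, x: 2/5 * (2*x + 3*y)

# (a) total probability over 0 <= x <= 1, 0 <= y <= 1 should be 1
total, _ = dblquad(f, 0, 1, 0, 1)

# (b) P(0 < X < 1/2, 1/4 < Y < 1/2) should be 13/160
p, _ = dblquad(f, 0, 0.5, 0.25, 0.5)
print(total, p, 13/160)   # ~1.0, ~0.08125, 0.08125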
(c) By definition,
$g(x) = \int_{-\infty}^{\infty} f(x, y)\,dy = \int_0^1 \frac{2}{5}(2x + 3y)\,dy = \left( \frac{4xy}{5} + \frac{6y^2}{10} \right) \Big|_{y=0}^{y=1} = \frac{4x + 3}{5},$
for $0 \le x \le 1$, and g(x) = 0 elsewhere. Similarly,
$h(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_0^1 \frac{2}{5}(2x + 3y)\,dx = \frac{2(1 + 3y)}{5},$
for $0 \le y \le 1$, and h(y) = 0 elsewhere.
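Note that $f(x, y) \ne g(x)h(y)$ here (at x = y = 0, for instance, f = 0 while $g(0)h(0) = \frac{3}{5} \cdot \frac{2}{5} = \frac{6}{25}$), so X and Y are not statistically independent. A short symbolic sketch with sympy (an added check, not from the original slides) recomputes the marginals and tests the independence criterion from the definition above:

import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = sp.Rational(2, 5) * (2*x + 3*y)   # joint density of Example 12.2

g = sp.integrate(f, (y, 0, 1))   # marginal of X: (4x + 3)/5
h = sp.integrate(f, (x, 0, 1))   # marginal of Y: 2(1 + 3y)/5
print(g, h)

# independence requires f == g*h for all (x, y); the difference is not
# identically zero, so X and Y are dependent
print(sp.simplify(f - g*h) == 0)   # False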
Example 12.6: Suppose that the shelf life, in years, of a certain perishable food product packaged
in cardboard containers is a random variable whose probability density function is
given by $f(x) = \begin{cases} e^{-x}, & x > 0, \\ 0, & \text{elsewhere.} \end{cases}$
Let $X_1$, $X_2$, and $X_3$ represent the shelf lives for three of these containers selected
independently and find $P(X_1 < 2,\ 1 < X_2 < 3,\ X_3 > 2)$.
Solution: Since the containers were selected independently, the joint density is the product $f(x_1, x_2, x_3) = e^{-x_1 - x_2 - x_3}$, so
$P(X_1 < 2,\ 1 < X_2 < 3,\ X_3 > 2) = \int_2^{\infty} \int_1^3 \int_0^2 e^{-x_1 - x_2 - x_3}\,dx_1\,dx_2\,dx_3 = (1 - e^{-2})(e^{-1} - e^{-3})e^{-2} \approx 0.0372.$
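Because independence lets the triple integral factor into three one-dimensional pieces, the value is easy to confirm; a minimal sketch (added here, not from the slides):

from math import exp

# P(X1 < 2) * P(1 < X2 < 3) * P(X3 > 2) for independent Exp(1) lifetimes
p = (1 - exp(-2)) * (exp(-1) - exp(-3)) * exp(-2)
print(round(p, 4))   # 0.0372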
Definition: Let X and Y be random variables with joint probability distribution f(x, y). The mean, or expected value, of the random variable g(X, Y) is
$\mu_{g(X,Y)} = E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y)\,dx\,dy$ (continuous case).
In particular, for g(X, Y) = Y,
$E(Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} y\,h(y)\,dy$ (continuous case).
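To illustrate the two equivalent expressions for E(Y), the sketch below (an added check, not from the slides) evaluates both for Example 12.2, using its marginal h(y) = 2(1 + 3y)/5; both give 3/5.

from scipy.integrate import dblquad, quad

f = lambda y, x: 2/5 * (2*x + 3*y)   # joint density of Example 12.2
h = lambda y: 2 * (1 + 3*y) / 5      # marginal of Y

ey_joint, _ = dblquad(lambda y, x: y * f(y, x), 0, 1, 0, 1)
ey_marginal, _ = quad(lambda y: y * h(y), 0, 1)
print(ey_joint, ey_marginal)   # both ~0.6 = 3/5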
Definition: When X and Y are continuous, the covariance is
$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) f(x, y)\,dx\,dy.$
Theorem: The covariance of two random variables X and Y with means $\mu_X$ and $\mu_Y$, respectively, is given by
$\text{Cov}(X, Y) = \sigma_{XY} = E(XY) - \mu_X \mu_Y = E(XY) - E(X)E(Y).$
Theorem: Let X and Y be two independent random variables. Then
$E(XY) = E(X)E(Y).$
Corollary: Let X and Y be two independent random variables. Then $\sigma_{XY} = 0$.
Definition: Let X and Y be random variables with covariance $\sigma_{XY}$ and standard deviations $\sigma_X$ and $\sigma_Y$, respectively. The correlation coefficient of X and Y is
$\rho_{XY} = \dfrac{\sigma_{XY}}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1.$
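Putting the covariance theorem and this definition together, the sketch below (an added illustration, not from the slides) estimates $\sigma_{XY}$ and $\rho_{XY}$ for the density of Example 12.2; the covariance comes out slightly negative ($-1/150$), so $\rho_{XY}$ is too.

from math import sqrt
from scipy.integrate import dblquad

f = lambda y, x: 2/5 * (2*x + 3*y)   # joint density of Example 12.2

# E[g(X, Y)] via the definition of expected value
E = lambda g: dblquad(lambda y, x: g(x, y) * f(y, x), 0, 1, 0, 1)[0]

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
vx = E(lambda x, y: x**2) - ex**2
vy = E(lambda x, y: y**2) - ey**2

cov = E(lambda x, y: x*y) - ex*ey    # Cov(X, Y) = E(XY) - E(X)E(Y)
rho = cov / sqrt(vx * vy)
print(cov, rho)   # ~-0.00667 (= -1/150), ~-0.0877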
Example: For a joint density with $\mu_X = \frac{4}{5}$, $E(X^2) = \frac{2}{3}$, $\mu_Y = \frac{8}{15}$, and $E(Y^2) = \frac{1}{3}$, the variances are
$\sigma_X^2 = \frac{2}{3} - \left(\frac{4}{5}\right)^2 = \frac{2}{75}$ and $\sigma_Y^2 = \frac{1}{3} - \left(\frac{8}{15}\right)^2 = \frac{11}{225}.$
Example: Suppose the joint density function of X and Y is
$f(x, y) = \begin{cases} C(6 - x - y), & 0 < x < 2,\ 2 < y < 4, \\ 0, & \text{elsewhere.} \end{cases}$
(a) Find the constant C. (b) Find $P[(X, Y) \in A]$, where A is the region $\{(x, y) \mid x + y > 3\}$.
Solution: (a) The density must integrate to 1:
$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dx\,dy = \int_2^4 \int_0^2 C(6 - x - y)\,dx\,dy = 1$
$\Rightarrow C \int_2^4 \left( 6x - \frac{x^2}{2} - xy \right) \Big|_{x=0}^{x=2}\,dy = 1$
$\Rightarrow C \int_2^4 (10 - 2y)\,dy = 1$
$\Rightarrow C \left[ (10y - y^2) \big|_{y=2}^{y=4} \right] = 1$
$\Rightarrow C(20 - 12) = 1 \Rightarrow C = \frac{1}{8}.$
(b) To calculate the probability, we use
$P[(X, Y) \in A] = P(X + Y > 3) = \int_2^3 \int_{3-y}^2 \frac{1}{8}(6 - x - y)\,dx\,dy + \int_3^4 \int_0^2 \frac{1}{8}(6 - x - y)\,dx\,dy$
$= \int_2^3 \frac{1}{8} \left( 6x - \frac{x^2}{2} - xy \right) \Big|_{x=3-y}^{x=2}\,dy + \int_3^4 \frac{1}{8} \left( 6x - \frac{x^2}{2} - xy \right) \Big|_{x=0}^{x=2}\,dy$
$= -\frac{1}{16} \int_2^3 (y^2 - 8y + 7)\,dy + \frac{1}{8} \int_3^4 (10 - 2y)\,dy = \frac{19}{24}.$
Method 2:
$P(X + Y > 3) = 1 - \int_2^3 \int_0^{3-y} \frac{1}{8}(6 - x - y)\,dx\,dy$
$= 1 - \int_2^3 \frac{1}{8} \left( 6x - \frac{x^2}{2} - xy \right) \Big|_{x=0}^{x=3-y}\,dy = \frac{19}{24}.$
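Both the normalizing constant and the probability are easy to confirm numerically; a minimal sketch with scipy (added here, not part of the slides), splitting the region exactly as in Method 1:

from scipy.integrate import dblquad

C = 1/8
f = lambda x, y: C * (6 - x - y)   # dblquad wants the inner variable (x) first

# total probability over 0 < x < 2, 2 < y < 4 should be 1
total, _ = dblquad(f, 2, 4, 0, 2)

# P(X + Y > 3): for 2 < y < 3, x runs from 3 - y to 2; for 3 < y < 4, full range
p1, _ = dblquad(f, 2, 3, lambda y: 3 - y, 2)
p2, _ = dblquad(f, 3, 4, 0, 2)
print(total, p1 + p2, 19/24)   # ~1.0, ~0.79167, 0.79166...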