Unit 3 2DRV
2.1 Introduction

If the possible values of (X, Y) are finite or countably infinite, then (X, Y) is called a two-dimensional discrete random variable. Its values can be represented as (xi, yj), where i = 1, 2, 3, · · · , n; j = 1, 2, 3, · · · , m.

Marginal probabilities: P(X = xi) = Pi∗ and P(Y = yj) = P∗j.

Conditional probabilities:

P(X = xi / Y = yj) = P(X = xi, Y = yj) / P(Y = yj) = Pij / P∗j

P(Y = yj / X = xi) = P(X = xi, Y = yj) / P(X = xi) = Pij / Pi∗

Joint distribution function:

F(x, y) = P[X ≤ x, Y ≤ y] = Σ_{xi ≤ x} Σ_{yj ≤ y} P(X = xi, Y = yj)

1. Σ_{i=1}^{n} Σ_{j=1}^{m} P(xi, yj) = 1. [Use it for checking a JPMF or to find an unknown in a JPMF]

2. P(X = xi) = Σ_{j=1}^{m} Pij. [Use it to find the MPMF of X]

3. P(Y = yj) = Σ_{i=1}^{n} Pij. [Use it to find the MPMF of Y]

10. P(xi, yj) = P(xi) · P(yj). [Use it for checking independence of the r.v.s X and Y]

11. P(X = xi / Y = yj) = P(X = xi, Y = yj) / P(Y = yj). [Use it for the conditional probability of X given Y]

12. P(Y = yj / X = xi) = P(X = xi, Y = yj) / P(X = xi). [Use it for the conditional probability of Y given X]
∴ X = {0, 1, 2}
Given: Y, a discrete r.v., denotes the number of tails; over the four equally likely outcomes it takes the values 0, 1, 1, 2.
∴ Y = {0, 1, 2}
(a) Joint P.M.F. :

         X = 0                  X = 1                  X = 2
Y = 0    P(X = 0, Y = 0) = 0    P(X = 1, Y = 0) = 0    P(X = 2, Y = 0) = 1/4
Y = 1    P(X = 0, Y = 1) = 0    P(X = 1, Y = 1) = 2/4  P(X = 2, Y = 1) = 0
Y = 2    P(X = 0, Y = 2) = 1/4  P(X = 1, Y = 2) = 0    P(X = 2, Y = 2) = 0

(b) Marginal P.M.F. of X & Y :

             X = 0     X = 1     X = 2     Mar. PMF of Y : PY(y) = P∗j
Y = 0         0         0        1/4       P(Y = 0) = 1/4
Y = 1         0        2/4        0        P(Y = 1) = 2/4
Y = 2        1/4        0         0        P(Y = 2) = 1/4
Mar. PMF     P(X = 0)  P(X = 1)  P(X = 2)  ΣΣ P(X = xi, Y = yj) = 1
of X :       = 1/4     = 2/4     = 1/4

i.e.,

Marginal P.M.F. of X :
P(X = 0) = Σ_{j=0}^{2} P(X = 0, Y = yj) = 1/4
P(X = 1) = Σ_{j=0}^{2} P(X = 1, Y = yj) = 2/4
P(X = 2) = Σ_{j=0}^{2} P(X = 2, Y = yj) = 1/4

Marginal P.M.F. of Y :
P(Y = 0) = Σ_{i=0}^{2} P(X = xi, Y = 0) = 1/4
P(Y = 1) = Σ_{i=0}^{2} P(X = xi, Y = 1) = 2/4
P(Y = 2) = Σ_{i=0}^{2} P(X = xi, Y = 2) = 1/4
144 MA6453 Probability and Queueing Theory
(e) FX(1) = P(X ≤ 1)
         = P(X = 0) + P(X = 1)
         = 1/4 + 2/4 = 3/4

(f) FY(2) = P(Y ≤ 2)
         = P(Y = 0) + P(Y = 1) + P(Y = 2)
         = 1/4 + 2/4 + 1/4 = 1
To find k,
ΣΣ P(X = x, Y = y) = 1
P(0,1)+P(0,2)+P(0,3)+P(1,1)+P(1,2)+P(1,3)+P(2,1)+P(2,2)+P(2,3) = 1
3k + 6k + 9k + 5k + 8k + 11k + 7k + 10k + 13k = 1
72k = 1
∴ k = 1/72

            X = 0     X = 1     X = 2     MPMF of Y : PY(y) = P∗j
Y = 1       3/72      5/72      7/72      P(Y = 1) = 15/72
Y = 2       6/72      8/72     10/72      P(Y = 2) = 24/72
Y = 3       9/72     11/72     13/72      P(Y = 3) = 33/72
MPMF of X : P(X = 0)  P(X = 1)  P(X = 2)  ΣΣ P(X = xi, Y = yj) = 1
            = 18/72   = 24/72   = 30/72
Probability distribution of X + Y :

X + Y   P(X + Y)
1       P(X + Y = 1) = P(X = 0, Y = 1) = 3/72
2       P(X + Y = 2) = P(X = 0, Y = 2) + P(X = 1, Y = 1) = 6/72 + 5/72 = 11/72
3       P(X + Y = 3) = P(X = 0, Y = 3) + P(X = 1, Y = 2) + P(X = 2, Y = 1) = 9/72 + 8/72 + 7/72 = 24/72
4       P(X + Y = 4) = P(X = 1, Y = 3) + P(X = 2, Y = 2) = 11/72 + 10/72 = 21/72
5       P(X + Y = 5) = P(X = 2, Y = 3) = 13/72

(b) P(X > Y) = P(X = 2, Y = 1) = 7/72
(c) P[Max(X, Y) = 3] :
Z = Min(X, Y)    P(Z)

The joint probabilities in combinatorial form (3 items drawn from 9, with X counting one kind of 2 and Y another kind of 3):

        Y = 0              Y = 1                  Y = 2              Y = 3
X = 0   4C3/9C3            (3C1 · 4C2)/9C3        (3C2 · 4C1)/9C3    3C3/9C3
X = 1   (2C1 · 4C2)/9C3    (2C1 · 3C1 · 4C1)/9C3  (2C1 · 3C2)/9C3    0
X = 2   (2C2 · 4C1)/9C3    (2C2 · 3C1)/9C3        0                  0
i.e.,
The joint and marginal probability distribution :

        Y = 0    Y = 1    Y = 2    Y = 3    MPMF of X : PX(x) = Pi∗
X = 0    4/84    18/84    12/84     1/84    P(X = 0) = 35/84
X = 1   12/84    24/84     6/84      0      P(X = 1) = 42/84
X = 2    4/84     3/84      0        0      P(X = 2) = 7/84
MPMF of Y :
PY(y) = P∗j :
        20/84    45/84    18/84     1/84    ΣΣ P(i, j) = 1
The joint p.m.f. of (X, Y) with the marginal p.m.f.s :

                   Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6   Mar. PMF of X : Pi∗ = P(X = xi)
X = 0                0       0     1/32    2/32    2/32    3/32    1/4
X = 1              1/16    1/16    1/8     1/8     1/8     1/8     5/8
X = 2              1/32    1/32    1/64    1/64      0     2/64    1/8
Mar. PMF of Y :
P∗j = P(Y = yj)    3/32    3/32   11/64   13/64    3/16    1/4     1
(iv) P(X ≤ 1 / Y ≤ 3) = P(X ≤ 1, Y ≤ 3) / P(Y ≤ 3)   (1)
   = (9/32) / [P(Y = 1) + P(Y = 2) + P(Y = 3)]
   = (9/32) / (23/64) = 18/23

(v) P(Y ≤ 3 / X ≤ 1) = P(Y ≤ 3, X ≤ 1) / P(X ≤ 1)
   = (9/32) / (7/8) = (9/32) × (8/7) = 9/28

(vi) P(X + Y ≤ 4) :

X + Y    P(X + Y)
1        0
2        0 + 1/16 = 1/16
3        1/32 + 1/16 + 1/32 = 4/32
4        1/8 + 2/32 + 1/32 = 7/32

∴ P(X + Y ≤ 4) = 0 + 1/16 + 4/32 + 7/32 = 13/32
        X = 0    X = 1    X = 2
Y = 0   3/28     9/28     3/28
Y = 1   3/14     3/14      0
Y = 2   1/28      0        0
Solution:

            X = 0    X = 1    X = 2    P(Y = yj)
Y = 0       3/28     9/28     3/28     15/28
Y = 1       3/14     3/14      0       12/28
Y = 2       1/28      0        0        1/28
P(X = xi)   10/28    15/28    3/28       1

Marginal distribution of X.
X           0        1        2
P(X = xi)   10/28    15/28    3/28

Marginal distribution of Y.
Y           0        1        2
P(Y = yj)   15/28    12/28    1/28
Marginal distribution of X.
X           0       1       2
P(X = xi)   0.16    0.34    0.50

Marginal distribution of Y.
Y           0       1       2
P(Y = yj)   0.24    0.38    0.38

P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.16 + 0.34 = 0.50
P(Y ≤ 1) = P(Y = 0) + P(Y = 1) = 0.24 + 0.38 = 0.62
        X = 2    X = 4
Y = 1   0.10     0.15
Y = 3   0.20     0.30
Y = 5   0.10     0.15

With the marginal totals :

            X = 2    X = 4    P(Y = yj)
Y = 1       0.10     0.15     0.25
Y = 3       0.20     0.30     0.50
Y = 5       0.10     0.15     0.25
P(X = xi)   0.40     0.60     1
Example 2.8. X and Y are two random variables having the joint pmf P(x, y) = (1/27)(x + 2y), where X and Y assume the integer values 0, 1 and 2. Find the marginal probability distribution of X.

Solution: Given f(x, y) = (1/27)(x + 2y); x = 0, 1, 2; y = 0, 1, 2.
Here X and Y are discrete random variables.
The joint and marginal pmf of X and Y is

            X = 0    X = 1    X = 2    P(Y = yj)
Y = 0        0       1/27     2/27      3/27
Y = 1       2/27     3/27     4/27      9/27
Y = 2       4/27     5/27     6/27     15/27
P(X = xi)   6/27     9/27    12/27       1
Marginal distribution of X.
X 0 1 2
P(X = xi ) 6/27 9/27 12/27
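The marginal p.m.f.s above can be reproduced mechanically: a minimal Python sketch (helper names are mine, not from the text) sums the joint p.m.f. of Example 2.8 over the other variable, using exact fractions.

```python
from fractions import Fraction

# Joint pmf of Example 2.8: f(x, y) = (x + 2y)/27 for x, y in {0, 1, 2}
def f(x, y):
    return Fraction(x + 2 * y, 27)

support = [0, 1, 2]

# Marginal pmf of X: sum the joint pmf over all values of Y (and vice versa)
pX = {x: sum(f(x, y) for y in support) for x in support}
pY = {y: sum(f(x, y) for x in support) for y in support}

print(pX)  # pX[0] = 6/27 = 2/9, pX[1] = 9/27 = 1/3, pX[2] = 12/27 = 4/9
print(pY)  # pY[0] = 3/27,      pY[1] = 9/27,       pY[2] = 15/27
assert sum(pX.values()) == 1
```

The same two-line dictionary comprehension works for any finite joint table, which is why result 2 and result 3 of Section 2.1 are stated symmetrically.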
Two - Dimensional Random Variables 155
A two-dimensional continuous random variable (X, Y) has a joint p.d.f. f(x, y) satisfying

(i) f(x, y) ≥ 0.

(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.

The joint distribution function is
F(x, y) = P[X ≤ x, Y ≤ y] = ∫_{−∞}^{y} ∫_{−∞}^{x} f(x, y) dx dy

1. ∫_{RY} ∫_{RX} f(x, y) dx dy = 1. [Use it for checking a JPDF or to find an unknown in a JPDF]
2. f(x) = ∫_{−∞}^{∞} f(x, y) dy. [Use it to find the MPDF of X]
3. f(y) = ∫_{−∞}^{∞} f(x, y) dx. [Use it to find the MPDF of Y]
4. F(x) = ∫_{−∞}^{x} f(x) dx. [Use it to find the CDF of X]
5. F(y) = ∫_{−∞}^{y} f(y) dy. [Use it to find the CDF of Y]
6. P(a < X < b) = ∫_{a}^{b} f(x) dx. [Use it to find the probability of X]
7. P(c < Y < d) = ∫_{c}^{d} f(y) dy. [Use it to find the probability of Y]
8. P(a < X < b, c < Y < d) = ∫_{c}^{d} ∫_{a}^{b} f(x, y) dx dy. [Use it to find the joint probability of X and Y]
9. E(X^r) = ∫_{RX} x^r f(x) dx. [Use it for the rth moment of X]
10. E(Y^r) = ∫_{RY} y^r f(y) dy. [Use it for the rth moment of Y]
11. Var(X) = E(X²) − [E(X)]². [Use it for the variance of X]
12. Var(Y) = E(Y²) − [E(Y)]². [Use it for the variance of Y]
13. f(x/y) = f(x, y)/f(y). [Use it for the conditional probability of X given Y]
14. f(y/x) = f(x, y)/f(x). [Use it for the conditional probability of Y given X]
Example 2.9. The bivariate random variable (X, Y) has the p.d.f. f(x, y) = Kx²(8 − y), x < y < 2x, 0 ≤ x ≤ 2. Find
(a) K   (b) f(x)   (c) f(y)   (d) f(x/y)   (e) f(y/x)   (f) f(y/x = 1)   (g) P(1/4 ≤ X ≤ 3/4)

Solution: Given the joint pdf of X and Y is f(x, y) = Kx²(8 − y), x < y < 2x, 0 ≤ x ≤ 2.   (1)

(a) K :
W.K.T. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1   (∵ f(x, y) has an unknown K)
∫₀² ∫ₓ^{2x} Kx²(8 − y) dy dx = 1   [∵ the limits of y are functions of x]

K ∫₀² x² [8y − y²/2]ₓ^{2x} dx = 1

K ∫₀² x² [(16x − 2x²) − (8x − x²/2)] dx = 1

K ∫₀² (8x³ − (3/2)x⁴) dx = 1

K [8 · x⁴/4 − (3/2) · x⁵/5]₀² = 1

K [2(16) − (3/10)(32)] = 1

K [32 − 48/5] = 1

K (112/5) = 1

∴ K = 5/112

∴ (1) ⇒ f(x, y) = (5/112) x²(8 − y), x < y < 2x, 0 ≤ x ≤ 2.   (2)
(b) f(x) :
f(x) = ∫_{−∞}^{∞} f(x, y) dy
     = ∫ₓ^{2x} (5/112) x²(8 − y) dy   (∵ integration w.r.t. y, so treat the rest as constant)
     = (5/112) x² [8y − y²/2]ₓ^{2x}
     = (5/112) x² [(16x − 2x²) − (8x − x²/2)]
     = (5/112) x² [8x − (3/2)x²]
     = (5/224) [16x³ − 3x⁴], 0 < x < 2
(c) f(y) :
[Figure: the region x < y < 2x, 0 ≤ x ≤ 2 lies between the lines Y = X and Y = 2X, through the points (2, 2) and (2, 4); it splits into R1 (0 < y < 2) and R2 (2 < y < 4).]

In region R1 :
f(y) = ∫_{−∞}^{∞} f(x, y) dx
     = ∫_{y/2}^{y} (5/112) x²(8 − y) dx   [Refer region R1]
     = (5/112)(8 − y) [x³/3]_{y/2}^{y}
     = (5/112)(8 − y) [y³/3 − y³/24]
     = (5/112)(8 − y)(7y³/24)
     = (5/384) y³(8 − y), 0 < y < 2
In Region R2 , f (y) :
f(y) = ∫_{−∞}^{∞} f(x, y) dx
     = ∫_{y/2}^{2} (5/112) x²(8 − y) dx   [Refer region R2]
     = (5/112)(8 − y) [x³/3]_{y/2}^{2}
     = (5/112)(8 − y) [8/3 − y³/24]
     = (5/2688)(8 − y)(64 − y³), 2 < y < 4
(d) f(x/y) :
f(x/y) = f(x, y)/f(y)

In R1 : f(x/y) = [(5/112) x²(8 − y)] / [(5/384) y³(8 − y)] = (24/7) x²/y³, y/2 < x < y

In R2 : f(x/y) = [(5/112) x²(8 − y)] / [(5/2688)(8 − y)(64 − y³)] = 24x²/(64 − y³), y/2 < x < 2
(e) f(y/x) :
f(y/x) = f(x, y)/f(x)
       = [(5/112) x²(8 − y)] / [(5/224)(16x³ − 3x⁴)]
       = 2x²(8 − y)/(16x³ − 3x⁴)
       = 2(8 − y)/(16x − 3x²), x < y < 2x
(f) f (y/x = 1) :
f(y/x = 1) = f(x = 1, y)/f(x = 1)
           = [(5/112)(8 − y)] / [(5/224)(16 − 3)]
           = 2(8 − y)/13, 1 < y < 2
(g) P(1/4 ≤ X ≤ 3/4) :
P(1/4 ≤ X ≤ 3/4) = ∫_{1/4}^{3/4} f(x) dx
                = (5/224) ∫_{1/4}^{3/4} (16x³ − 3x⁴) dx
                = (5/224) [4x⁴ − (3/5)x⁵]_{1/4}^{3/4}
                ≈ 0.0247
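As a numeric sanity check of parts (a) and (g) of Example 2.9, a plain midpoint rule (all helper names below are mine, not from the text) can recompute the normalising constant and the probability:

```python
# Double integral of x^2 (8 - y) over x in (0, 2), y in (x, 2x),
# by the midpoint rule; its value should be 112/5 = 22.4, giving K = 5/112.
def mass_unnormalised(n=500):
    total = 0.0
    hx = 2.0 / n
    for i in range(n):
        x = (i + 0.5) * hx
        hy = x / n                       # y runs from x to 2x, a width of x
        for j in range(n):
            y = x + (j + 0.5) * hy
            total += x * x * (8 - y) * hx * hy
    return total

m = mass_unnormalised()
K = 1.0 / m
print(round(m, 2))     # ~ 22.4, i.e. K ~ 5/112

# (g) P(1/4 <= X <= 3/4) from the marginal f(x) = (5/224)(16x^3 - 3x^4)
def fx(x):
    return (5 / 224) * (16 * x ** 3 - 3 * x ** 4)

n = 2000
h = 0.5 / n
p = h * sum(fx(0.25 + (i + 0.5) * h) for i in range(n))
print(round(p, 5))     # ~ 0.02474
```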
W.K.T. ∫₀^∞ ∫₀^∞ K xy e^{−(x²+y²)} dx dy = 1

K [∫₀^∞ x e^{−x²} dx] [∫₀^∞ y e^{−y²} dy] = 1   (1)

Consider the integral I = ∫₀^∞ x e^{−x²} dx.
Let x² = t ⇒ 2x dx = dt ⇒ x dx = dt/2; when x = 0, t = 0, and when x → ∞, t → ∞.
∴ I = (1/2) ∫₀^∞ e^{−t} dt = 1/2   (2)
∴ (1) ⇒ K (1/2)(1/2) = 1 ⇒ K = 4, and f(x, y) = 4xy e^{−(x²+y²)}, x > 0, y > 0.
Marginal pdf of Y :
f(y) = ∫₀^∞ 4xy e^{−(x²+y²)} dx
     = 4y e^{−y²} ∫₀^∞ x e^{−x²} dx
     = 4y e^{−y²} (1/2)   [∵ by (2)]
∴ f(y) = 2y e^{−y²}, y > 0; similarly f(x) = 2x e^{−x²}, x > 0.

Independence of X and Y :
f(x) · f(y) = (2x e^{−x²})(2y e^{−y²}) = 4xy e^{−(x²+y²)} = f(x, y)
∴ X and Y are independent.
(8/k) [x² − x⁴/8]₁² = 1
(8/k) [(4 − 2) − (1 − 1/8)] = 1
9/k = 1
k = 9

∴ f(x, y) = { (8/9) xy, 1 < x < y < 2
            { 0, otherwise
(ii) To find marginal values of X and Y
= 1/12 + 1/6
= 3/12
∴ P(Y < 1/2) = 1/4
4
Z1/2Z2
(iii) P(X > 1, Y < 1/2) = f (x, y)dxdy
0 1
Z1/2Z2
x2
= (xy2 + )dxdy
8
0 1
Z1/2 !2
x 2 y2 x 3
= + dy
2 24 1
0
Z1/2"
y2
! !#
1 1
= 2y + −
2
+ dy
3 2 24
0
Z1/2!
3 2 7
= y + dy
2 24
0
! #1/2
3 y3
" !
7
= + y
2 3 24 0
" ! ! #
3 1 7
= + −0
2 24 48
3 7
= +
48 48
5
P(X > 1, Y < 1/2) =
24
(iv) P(X < Y) = ∫₀¹ ∫₀^y f(x, y) dx dy
   = ∫₀¹ ∫₀^y (xy² + x²/8) dx dy
   = ∫₀¹ [x²y²/2 + x³/24]₀^y dy
   = ∫₀¹ (y⁴/2 + y³/24) dy
   = [y⁵/10 + y⁴/96]₀¹
   = 1/10 + 1/96
∴ P(X < Y) = 53/480
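Both probabilities for this density can be checked numerically; this is a sketch with my own helper names, using a midpoint rule over the rectangle 0 ≤ x ≤ 2, 0 ≤ y ≤ 1:

```python
# Numeric check of (iii) and (iv) above for f(x, y) = x y^2 + x^2/8.
def midpoint2d(g, x0, x1, y0, y1, n=400):
    hx, hy = (x1 - x0) / n, (y1 - y0) / n
    return hx * hy * sum(g(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
                         for i in range(n) for j in range(n))

f = lambda x, y: x * y ** 2 + x ** 2 / 8

p3 = midpoint2d(f, 1, 2, 0, 0.5)                                     # P(X > 1, Y < 1/2)
p4 = midpoint2d(lambda x, y: f(x, y) if x < y else 0.0, 0, 1, 0, 1)  # P(X < Y)
print(round(p3, 4), round(p4, 4))   # ~ 5/24 = 0.2083 and ~ 53/480 = 0.1104
```

The second integral uses an indicator for the region x < y, so its accuracy is limited by the grid near the diagonal; the first is accurate to many digits.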
Example 2.13. If the joint pdf of X and Y is given by f(x, y) = (1/8)(6 − x − y), 0 < x < 2, 2 < y < 4, find
(a) P(X < 1, Y < 3)   (b) P(X + Y < 3)   (c) P(X + Y > 3)
(d) P(X < 1 / Y < 3)   (e) f(x/y = 1)
Now, f(y) = ∫_{−∞}^{∞} f(x, y) dx
          = ∫₀² (1/8)(6 − x − y) dx
          = (1/8) [6x − x²/2 − xy]₀²
          = (1/8) [(12 − 2 − 2y) − (0 − 0 − 0)]
          = (1/8) (10 − 2y)
          = (1/4) (5 − y), 2 < y < 4

∴ (3) ⇒ P(Y < 3) = ∫₂³ (1/4)(5 − y) dy
          = (1/4) [5y − y²/2]₂³
          = (1/4) [(15 − 9/2) − (10 − 2)]
          = (1/4) [7 − 9/2]
          = (1/4)(5/2)
          = 5/8

∴ (2) ⇒ P(X < 1 / Y < 3) = P(X < 1, Y < 3)/P(Y < 3) = (3/8)/(5/8) = 3/5   [∵ by (a) and (3)]
(e) f(x/y = 1) :
f(x/y = 1) = [f(x, y)/f(y)]_{y=1}
           = [(1/8)(6 − x − y)] / [(1/4)(5 − y)] at y = 1
           = (1/2) · (5 − x)/4
           = (5 − x)/8
Example 2.14. Given joint p.d.f. f (x, y) = 2, 0 < y < x < 1. Find
(a) Marginal p.d.f. of X (b) Marginal p.d.f. of Y
(c) Cond. prob. of X given Y (d) Cond. prob. of Y given X
(e) Are X and Y independent?
Solution : Given joint p.d.f. f (x, y) = 2, 0 < y < x < 1. (1)
f(y) = ∫_{x=y}^{x=1} 2 dx   (∵ by (1))
     = 2 [x]_{x=y}^{x=1}
     = 2[(1) − (y)]
∴ f(y) = 2(1 − y), 0 < y < 1   (use extreme limits)
(a) Find c.  (b) Show that
f(x) = { (2/(πR)) √(1 − (x/R)²), −R ≤ x ≤ R
       { 0, otherwise

Solution : Given that the joint density function of X and Y is constant in this circle and given by
f(x, y) = { c, x² + y² ≤ R²
          { 0, otherwise   (1)
(a) Value of c :
W.K.T. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1   (∵ f(x, y) contains an unknown)
∫∫_{x²+y²≤R²} c dx dy = 1
c (πR²) = 1   (∵ the double integral gives the area of the circle)
∴ c = 1/(πR²)
∴ (1) ⇒ f(x, y) = { 1/(πR²), x² + y² ≤ R²
                  { 0, otherwise   (2)
(b) Proof of f(x) :
f(x) = ∫_{−∞}^{∞} f(x, y) dy
     = ∫_{−√(R²−x²)}^{√(R²−x²)} (1/(πR²)) dy
     = (2/(πR²)) ∫₀^{√(R²−x²)} dy   (∵ a constant function, which is an even function of y)
     = (2/(πR²)) [√(R² − x²) − 0]
     = (2/(πR²)) · R √(1 − (x/R)²)
     = (2/(πR)) √(1 − (x/R)²), −R ≤ x ≤ R.
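A one-line numeric check confirms that this semicircular marginal integrates to 1; the choice R = 3 below is mine, made only for the check:

```python
import math

# Marginal f(x) = (2/(pi R)) sqrt(1 - (x/R)^2) on [-R, R] should integrate to 1.
R = 3.0
f = lambda x: (2 / (math.pi * R)) * math.sqrt(max(0.0, 1 - (x / R) ** 2))

n = 100_000
h = 2 * R / n
total = h * sum(f(-R + (i + 0.5) * h) for i in range(n))
print(round(total, 5))  # ~ 1.0
```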
(vi) P(a < X < b, c < Y < d) = F X,Y (b, d) + F X,Y (a, c) − F X,Y (a, d) − F X,Y (b, c)
k ∫₀² (2 + 2y) dy = 1
k [2y + y²]₀² = 1
8k = 1
∴ k = 1/8
Example 2.18. The joint p.d.f. of two random variables X and Y is f(x, y) = cx(x − y), 0 < x < 2, −x < y < x. Find c.
Solution: By the property of jpdf, we have
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
∫₀² ∫_{−x}^{x} cx(x − y) dy dx = 1
c ∫₀² x [xy − y²/2]_{−x}^{x} dx = c ∫₀² 2x³ dx = 8c = 1
∴ c = 1/8
Example 2.19. Find k if the joint probability density function of a bivariate random variable (X, Y) is given by
f(x, y) = { k(1 − x)(1 − y), 0 < x, y < 1
          { 0, otherwise
Solution: By the property of joint p.d.f. we have
∬_R f(x, y) dx dy = 1
∫₀¹ ∫₀¹ k(1 − x)(1 − y) dx dy = 1
k ∫₀¹ [x − x²/2 − xy + x²y/2]_{x=0}^{x=1} dy = 1
k ∫₀¹ [1/2 − y + y/2] dy = 1
k [y/2 − y²/4]₀¹ = 1
k(1/4) = 1
∴ k = 4
Example 2.20. If X and Y have joint p.d.f.
f(x, y) = { x + y, 0 < x < 1, 0 < y < 1
          { 0, otherwise
check whether X and Y are independent.
Solution: If X and Y are independent then f(x, y) = f(x) · f(y).
The marginal density function of X is
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀¹ (x + y) dy = x + 1/2
The marginal density function of Y is
f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (x + y) dx = y + 1/2
∴ f(x) f(y) = (x + 1/2)(y + 1/2) ≠ x + y = f(x, y)
i.e., f(x) · f(y) ≠ f(x, y)
∴ X and Y are not independent.

Since ∂²F(x, y)/∂x∂y < 0, F is not a joint probability distribution function.
Example 2.22. Find the marginal density functions of X and Y if f(x, y) = 2(2x + 5y)/5, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
Solution: The marginal density function of X is given by
fX(x) = f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀¹ (2/5)(2x + 5y) dy = (4x + 5)/5.
The marginal density function of Y is given by
fY(y) = f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (2/5)(2x + 5y) dx = (2 + 10y)/5.
Example 2.23. Can the joint distribution of two random variables X and Y be got if their marginal distributions are known?
Solution: Not in general. Two jointly distributed random variables X and Y are statistically independent of each other if and only if the joint probability density function equals the product of the two marginal probability density functions, fXY(x, y) = fX(x) fY(y); only in that case can the joint distribution be recovered from the marginals.
                X = 2    X = 4    PY(y) = P∗j
Y = 1           0.10     0.15     0.25
Y = 3           0.20     0.30     0.50
Y = 5           0.10     0.15     0.25
PX(x) = Pi∗     0.40     0.60     1
(a) E(X) = Σᵢ xᵢ P(X = xᵢ) = 2(0.40) + 4(0.60) = 0.8 + 2.4 = 3.2
E(Y) = Σⱼ yⱼ P(Y = yⱼ) = 1(0.25) + 3(0.50) + 5(0.25) = 3
E(XY) = Σᵢ Σⱼ xᵢ yⱼ Pᵢⱼ
      = (2)(1)(0.10) + (2)(3)(0.20) + (2)(5)(0.10) + (4)(1)(0.15) + (4)(3)(0.30) + (4)(5)(0.15)
      = 9.6
X and Y are independent if Pᵢⱼ = Pᵢ∗ × P∗ⱼ for all i, j.
P21 = P(X = 2, Y = 1) = 0.10 = (0.40)(0.25) = P2∗ × P∗1
Similarly,
P23 = P2∗ × P∗3,  P25 = P2∗ × P∗5,
P41 = P4∗ × P∗1,  P43 = P4∗ × P∗3,  P45 = P4∗ × P∗5.
∴ X and Y are independent.
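The cell-by-cell independence check just carried out can be automated for any finite joint table; a small Python sketch (variable names are mine):

```python
# Independence check for the table above: every joint probability should
# equal the product of its two marginals.
joint = {(2, 1): 0.10, (4, 1): 0.15,
         (2, 3): 0.20, (4, 3): 0.30,
         (2, 5): 0.10, (4, 5): 0.15}

px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (2, 4)}
py = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (1, 3, 5)}

independent = all(abs(joint[(x, y)] - px[x] * py[y]) < 1e-12
                  for (x, y) in joint)
print(independent)  # True
```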
2.6 Co - Variance

Definition: Cov(X, Y) = E{[X − E(X)][Y − E(Y)]}.

Note:
E{[X − E(X)][Y − E(Y)]} = E[XY − X E(Y) − Y E(X) + E(X)E(Y)]
                        = E(XY) − E(X)E(Y) − E(Y)E(X) + E(X)E(Y)
                        = E(XY) − E[X] · E[Y].

Properties :
1. Cov(X, Y) = E[XY] − E[X] · E[Y].
2. Cov(X, X) = E[XX] − E[X] · E[X] = E[X²] − [E(X)]².
3. Cov(X + a, Y + b) = Cov(X, Y).   (1)
Proof:
2.7 Correlation

Two variables may vary in such a way that a change in one variable accompanies a change in the other; the variables are then said to be correlated. Correlation analysis is a statistical tool used to measure the degree of relationship between any two variables.

Types of Correlation :
Positive Correlation : If an increase in one variable results in a corresponding increase in the other, the correlation is said to be positive.

The correlation coefficient is
r(X, Y) = Cov(X, Y)/(σx σy), σx ≠ 0, σy ≠ 0

Note:
r(X, Y) = Cov(X, Y)/(σx σy)
        = [E(XY) − E(X)E(Y)] / {√(E(X²) − [E(X)]²) · √(E(Y²) − [E(Y)]²)}
Properties :
Note:
X 9 8 7 6 5 4 3 2 1
Y 15 16 14 13 11 12 10 8 9
n = 9
X̄ = (1/n) ΣX = (1/9)(45) = 5
Ȳ = (1/n) ΣY = (1/9)(108) = 12
Cov(X, Y) = (1/n) ΣXY − X̄Ȳ = (1/9)(597) − (5)(12) = 6.3333
σx = √[(1/n) ΣX² − X̄²] = √[(1/9)(285) − 25] = √6.6667 = 2.582
σy = √[(1/n) ΣY² − Ȳ²] = √[(1/9)(1356) − 144] = √6.6667 = 2.582
∴ (1) ⇒ r(x, y) = Cov(X, Y)/(σx σy) = 6.3333/[(2.582)(2.582)]
r(X, Y) = 0.9499
X 65 66 67 67 68 69 70 72
Y 67 68 65 68 72 72 69 71
Solution:

X     Y     XY      X²      Y²
65    67    4355    4225    4489
66    68    4488    4356    4624
67    65    4355    4489    4225
67    68    4556    4489    4624
68    72    4896    4624    5184
69    72    4968    4761    5184
70    69    4830    4900    4761
72    71    5112    5184    5041
ΣX = 544   ΣY = 552   ΣXY = 37560   ΣX² = 37028   ΣY² = 38132
X̄ = (1/8) × 544 = 68
Ȳ = 552/8 = 69
Cov(X, Y) = (1/n) ΣXY − X̄Ȳ = (1/8)(37560) − 4692 = 3
σx = √[(1/8)(37028) − 4624] = 2.1213
σy = √[(1/8)(38132) − 4761] = 2.3452
Correlation Coefficient is
r(X, Y) = Cov(X, Y)/(σx σy) = 3/[(2.1213)(2.3452)] = 0.6030
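The whole calculation above condenses to a few lines of Python (a sketch with my own names, using the same divide-by-n population formulas as the worked example):

```python
import math

# Correlation coefficient for the height data above.
X = [65, 66, 67, 67, 68, 69, 70, 72]
Y = [67, 68, 65, 68, 72, 72, 69, 71]
n = len(X)

mx, my = sum(X) / n, sum(Y) / n
cov = sum(x * y for x, y in zip(X, Y)) / n - mx * my
sx = math.sqrt(sum(x * x for x in X) / n - mx * mx)
sy = math.sqrt(sum(y * y for y in Y) / n - my * my)

r = cov / (sx * sy)
print(round(cov, 4), round(r, 4))  # 3.0 0.603
```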
(or)
X     Y     U = X − 70    V = Y − 72    UV    U²    V²
65    67    −5            −5            25    25    25
66    68    −4            −4            16    16    16
67    65    −3            −7            21     9    49
67    68    −3            −4            12     9    16
68    72    −2             0             0     4     0
69    72    −1             0             0     1     0
70    69     0            −3             0     0     9
72    71     2            −1            −2     4     1
ΣU = −16   ΣV = −24   ΣUV = 72   ΣU² = 68   ΣV² = 116

Ū = (1/n) ΣU = (1/8)(−16) = −2
V̄ = (1/n) ΣV = (1/8)(−24) = −3
Cov(U, V) = (1/n) ΣUV − ŪV̄ = (1/8)(72) − (−2)(−3)
∴ Cov(U, V) = 3
σu = √[(1/n) ΣU² − Ū²] = √[(1/8)(68) − (−2)²]
σu = 2.1213
σv = √[(1/n) ΣV² − V̄²] = √[(1/8)(116) − (−3)²]
σv = 2.3452
∴ Correlation Coefficient is
r(X, Y) = r(U, V) = Cov(U, V)/(σu σv) = 3/[(2.1213)(2.3452)]
        = 0.6030
        Y = −1    Y = 1
X = 0    1/8       3/8
X = 1    2/8       2/8

Find the correlation co-efficient of X and Y.
Solution:
Marginal distributions:
X       0      1
P(x)    1/2    1/2

Y       −1     1
P(y)    3/8    5/8
E[X] = Σ x P(x) = 0(1/2) + 1(1/2) = 1/2
E[X²] = Σ x² P(x) = 0²(1/2) + 1²(1/2) = 1/2
E[Y] = Σ y P(y) = (−1)(3/8) + 1(5/8) = 1/4
E[Y²] = Σ y² P(y) = (−1)²(3/8) + (1)²(5/8) = 1
E[XY] = ΣΣ xy p(x, y)
      = (0)(−1)(1/8) + (0)(1)(3/8) + (1)(−1)(2/8) + (1)(1)(2/8)
      = 0
Cov(X, Y) = E[XY] − E[X]E[Y] = 0 − (1/2)(1/4)
Cov(X, Y) = −1/8
σx² = Var[X] = E[X²] − (E[X])² = 1/2 − (1/2)² = 1/4 ⇒ σx = 1/2
σy² = Var[Y] = E[Y²] − (E[Y])² = 1 − (1/4)² = 15/16 ⇒ σy = √15/4
Correlation co-efficient is
r(X, Y) = Cov(x, y)/(σx σy) = (−1/8) / [(1/2)(√15/4)]
r(X, Y) = −0.2582
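The same moments can be read straight off the 2×2 joint table; a small Python sketch (the helper `E` and the names are mine):

```python
import math

# r(X, Y) computed directly from the joint table above.
joint = {(0, -1): 1/8, (0, 1): 3/8, (1, -1): 2/8, (1, 1): 2/8}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - ex * ey
sx = math.sqrt(E(lambda x, y: x * x) - ex ** 2)
sy = math.sqrt(E(lambda x, y: y * y) - ey ** 2)
print(round(cov, 4), round(cov / (sx * sy), 4))  # -0.125 -0.2582
```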
Y = 4X + 9   (1)
E(Y) = E(4X + 9) = 4E(X) + 9
E(XY) = E[X(4X + 9)] = E[4X² + 9X] = 4E[X²] + 9E[X]
Cov(X, Y) = E[XY] − E[X] · E[Y]
          = 4E[X²] + 9E[X] − E[X](4E[X] + 9)
          = 4E[X²] + 9E[X] − 4(E[X])² − 9E[X]
          = 4[E[X²] − (E[X])²]
Cov(X, Y) = 4σx²
σY² = Var[Y] = Var[4X + 9] = 4² Var[X] = 16σx²   (by the property of variance)
σy = 4σx
Correlation co-efficient is
r(X, Y) = Cov(x, y)/(σx σy) = 4σx²/(σx · 4σx)
r(X, Y) = 1
Given Var(X) = 36, Var(Y) = 16, with X and Y independent.
Let U = X + Y and V = X − Y; Cov(U, V) = E(UV) − E(U)E(V) = E(X² − Y²) − [E(X)² − E(Y)²] = Var(X) − Var(Y) = 20.
σU² = E(U²) − [E(U)]²
    = E[(X + Y)²] − [E(X + Y)]²
    = E[X² + Y² + 2XY] − [E(X) + E(Y)]²
    = E(X²) + E(Y²) + 2E(XY) − [E(X)]² − [E(Y)]² − 2E(X)E(Y)
    = {E(X²) − [E(X)]²} + {E(Y²) − [E(Y)]²}   (∵ X and Y are independent, E(XY) = E(X)E(Y))
    = Var(X) + Var(Y) = 36 + 16
σU² = 52 ⇒ σU = √52
σV² = E(V²) − [E(V)]²
    = E[(X − Y)²] − [E(X − Y)]²
    = E[X² + Y² − 2XY] − [E(X) − E(Y)]²
    = E(X²) + E(Y²) − 2E(XY) − [E(X)]² − [E(Y)]² + 2E(X)E(Y)
    = {E(X²) − [E(X)]²} + {E(Y²) − [E(Y)]²}
    = Var(X) + Var(Y) = 36 + 16
σV² = 52 ⇒ σV = √52
Correlation co-efficient,
r(U, V) = Cov(U, V)/(σu σv) = 20/(√52 √52)
r(X + Y, X − Y) = 0.3846
E(X) = 0, E(Y) = 0, E(Z) = 0
σX = 5, σY = 12, σZ = 9   (1)
r(X, Y) = 0 ⇒ Cov(X, Y) = 0 ⇒ E(XY) − E(X)E(Y) = 0 ⇒ E(XY) − (0)(0) = 0
∴ E(XY) = 0
Similarly, E(YZ) = 0 and E(ZX) = 0.
Now,
(1) ⇒ σX² = E(X²) − [E(X)]² = 25 ⇒ E(X²) − 0² = 25 ⇒ E(X²) = 25
Similarly, E(Y²) = 144 and E(Z²) = 81.
Now, with U = X + Y,
σU² = E[(X + Y)²] − [E(X) + E(Y)]²
    = E(X²) + E(Y²) + 2E(XY) − [E(X)]² − [E(Y)]² − 2E(X)E(Y)
    = 25 + 144 + 2(0) − 0² − 0² − 2(0)(0)
∴ σU² = 169 ⇒ σU = 13
With V = Y + Z,
σV² = E(V²) − [E(V)]²
    = E[(Y + Z)²] − [E(Y + Z)]²
    = E(Y²) + E(Z²) + 2E(YZ) − [E(Y)]² − [E(Z)]² − 2E(Y)E(Z)
    = 144 + 81 + 0 = 225 ⇒ σV = 15
r(X, Y) = [aE(X²) + bE(X) − a[E(X)]² − bE(X)] / (σX σY)
        = a{E(X²) − [E(X)]²} / (σX σY)
        = aσX² / (σX σY)
        = aσX / σY   (1)
σY² = E(Y²) − [E(Y)]²
    = E[(aX + b)²] − [E(aX + b)]²
    = E[a²X² + 2abX + b²] − [aE(X) + b]²
    = a²E(X²) + 2abE(X) + b² − {a²[E(X)]² + b² + 2abE(X)}
    = a²E(X²) − a²[E(X)]²
    = a²{E(X²) − [E(X)]²}
    = a²σx²
∴ σY = ±aσX
∴ (1) ⇒ r(X, Y) = aσX/(±aσX) = ±1
⇒ |r(X, Y)| = 1
Example 2.37. If the jpdf of X and Y is f(x, y) = (1/8)(6 − x − y), 0 ≤ x ≤ 2, 2 ≤ y ≤ 4, find the correlation between X and Y.
Solution: Here X and Y are continuous random variables with jpdf f(x, y) = (1/8)(6 − x − y), 0 ≤ x ≤ 2, 2 ≤ y ≤ 4.
W.K.T. the correlation coefficient between X and Y is

r(X, Y) = [∫∫ xy f(x, y) dx dy − (∫ x f(x) dx)(∫ y f(y) dy)]
          / {√[∫ x² f(x) dx − (∫ x f(x) dx)²] · √[∫ y² f(y) dy − (∫ y f(y) dy)²]}

(all integrals over (−∞, ∞))
Marginal pdf of X :
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₂⁴ (1/8)(6 − x − y) dy
     = (1/8)[6y − xy − y²/2]₂⁴
     = (1/8)[(24 − 4x − 8) − (12 − 2x − 2)]
     = (1/8)[−2x + 6]
∴ f(x) = (1/4)[3 − x], 0 ≤ x ≤ 2

Marginal pdf of Y :
f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀² (1/8)(6 − x − y) dx
     = (1/8)[6x − x²/2 − xy]₀²
     = (1/8)[12 − 2 − 2y]
∴ f(y) = (1/4)[5 − y], 2 ≤ y ≤ 4

Expectation of X :
E[X] = ∫ x f(x) dx = (1/4)∫₀² (3x − x²) dx = (1/4)[3x²/2 − x³/3]₀² = (1/4)[6 − 8/3]
∴ E[X] = 5/6

Expectation of Y :
E[Y] = ∫ y f(y) dy = (1/4)∫₂⁴ (5y − y²) dy = (1/4)[5y²/2 − y³/3]₂⁴ = (1/4)[(40 − 64/3) − (10 − 8/3)]
∴ E[Y] = 17/6
E[XY] = ∫₂⁴ ∫₀² xy f(x, y) dx dy
      = (1/8) ∫₂⁴ ∫₀² xy(6 − x − y) dx dy
      = (1/8) ∫₂⁴ ∫₀² (6xy − x²y − xy²) dx dy
      = (1/8) ∫₂⁴ [6y · x²/2 − y · x³/3 − y² · x²/2]₀² dy
      = (1/8) ∫₂⁴ (12y − (8/3)y − 2y²) dy
      = (1/8) [12 · y²/2 − (8/3) · y²/2 − 2 · y³/3]₂⁴
      = (1/8) [72 − (8/3)(6) − 2(56/3)]
      = (1/8) [72 − 16 − 112/3]
E[XY] = 7/3
E[X²] = ∫ x² f(x) dx = (1/4)∫₀² (3x² − x³) dx = (1/4)[x³ − x⁴/4]₀² = (1/4)[8 − 4]
E[X²] = 1

E[Y²] = ∫ y² f(y) dy = (1/4)∫₂⁴ (5y² − y³) dy = (1/4)[5y³/3 − y⁴/4]₂⁴
      = (1/4)[(320/3 − 64) − (40/3 − 4)]
      = 25/3
Var(X) = E[X²] − (E[X])² = 1 − 25/36 = 11/36, Var(Y) = E[Y²] − (E[Y])² = 25/3 − 289/36 = 11/36,
Cov(X, Y) = E[XY] − E[X]E[Y] = 7/3 − (5/6)(17/6) = −1/36.

r(X, Y) = Cov(X, Y)/(σX σY)
        = (−1/36) / [√(11/36) √(11/36)]
        = −1/11
r(X, Y) = −0.0909
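A numeric double integral (my own sketch; the midpoint helper is not from the text) reproduces r = −1/11 for this density:

```python
import math

# Example 2.37 check: f(x, y) = (6 - x - y)/8 on 0<=x<=2, 2<=y<=4.
def E(g, n=400):
    """Midpoint-rule expectation of g(X, Y) under the joint pdf."""
    hx, hy = 2 / n, 2 / n
    s = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        for j in range(n):
            y = 2 + (j + 0.5) * hy
            s += g(x, y) * (6 - x - y) / 8 * hx * hy
    return s

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - ex * ey
vx = E(lambda x, y: x * x) - ex ** 2
vy = E(lambda x, y: y * y) - ey ** 2
r = cov / math.sqrt(vx * vy)
print(round(r, 4))  # ~ -0.0909
```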
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀¹ (2 − x − y) dy = [2y − xy − y²/2]₀¹ = 2 − x − 1/2
∴ f(x) = 3/2 − x, 0 ≤ x ≤ 1

f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (2 − x − y) dx = [2x − x²/2 − yx]₀¹ = 2 − 1/2 − y
∴ f(y) = 3/2 − y, 0 ≤ y ≤ 1

E[X] = ∫ x f(x) dx = ∫₀¹ x(3/2 − x) dx = [3x²/4 − x³/3]₀¹ = 3/4 − 1/3 = (9 − 4)/12
∴ E[X] = 5/12; by symmetry, E[Y] = 5/12.

E[XY] = ∫₀¹ ∫₀¹ xy f(x, y) dx dy = ∫₀¹ ∫₀¹ xy(2 − x − y) dx dy
      = ∫₀¹ ∫₀¹ (2xy − x²y − xy²) dx dy
      = ∫₀¹ [x²y − (x³/3)y − (x²/2)y²]₀¹ dy
      = ∫₀¹ (y − y/3 − y²/2) dy
      = [y²/3 − y³/6]₀¹   (∵ y − y/3 = 2y/3)
      = 1/3 − 1/6
∴ E[XY] = 1/6

E[X²] = ∫ x² f(x) dx = ∫₀¹ x²(3/2 − x) dx = [x³/2 − x⁴/4]₀¹ = 1/2 − 1/4 = 1/4
By symmetry, E[Y²] = 1/4.

Correlation co-efficient is
r(X, Y) = Cov(x, y)/(σx σy)
with Cov(X, Y) = 1/6 − (5/12)² = −1/144 and Var(X) = Var(Y) = 1/4 − 25/144 = 11/144:
r(X, Y) = (−1/144)/(11/144)
r(x, y) = −0.0909
Rank Correlation

r(X, Y) = 1 − [6 Σ dᵢ²] / [n(n² − 1)], where dᵢ = xᵢ − yᵢ,
is known as Spearman's Rank correlation co-efficient.

Note: For repeated ranks,
r(X, Y) = 1 − 6[Σ dᵢ² + C.F.] / [n(n² − 1)],
where C.F. is the correction factor = Σᵢ mᵢ(mᵢ² − 1)/12, and mᵢ is the number of times an item is repeated in the X and Y series.
Example 2.39. Find the rank correlation co-efficient from the
following data.
Rank of X 1 2 3 4 5 6 7
Rank of Y 4 3 1 2 6 5 7
Solution: Here ranks are not repeated in X and Y.
X    Y    dᵢ = xᵢ − yᵢ    dᵢ²
1    4    −3              9
2    3    −1              1
3    1     2              4
4    2     2              4
5    6    −1              1
6    5     1              1
7    7     0              0
                Σ dᵢ² = 20
Here n = 7.
∴ Rank correlation co-efficient for non-repeated ranks is
r(X, Y) = 1 − 6Σdᵢ²/[n(n² − 1)]
        = 1 − 6(20)/[7(7² − 1)]
        = 1 − 120/336
r(X, Y) = 0.6429
Here n = 12.
∴ Rank correlation co-efficient for repeated ranks is
r(X, Y) = 1 − 6[Σdᵢ² + C.F.]/[n(n² − 1)]
        = 1 − 6(72.5 + 7)/[12(12² − 1)]
        = 1 − 477/1716
∴ r(X, Y) = 0.7221
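Spearman's formula for non-repeated ranks is a one-liner; a sketch (function name is mine) applied to the ranks of Example 2.39:

```python
# Spearman rank correlation, no repeated ranks.
def spearman(rx, ry):
    n = len(rx)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rho = spearman([1, 2, 3, 4, 5, 6, 7], [4, 3, 1, 2, 6, 5, 7])
print(round(rho, 4))  # 0.6429
```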
2.8 Regression

Regression Lines :

Regression line of Y on X is
y − ȳ = byx (x − x̄)
where byx = regression coefficient of y on x
    = Cov(X, Y)/Var(X) = Cov(X, Y)/σx²
    = r σy/σx
    = [n Σxy − (Σx)(Σy)] / [n Σx² − (Σx)²]   · · · direct method

Regression line of X on Y is
x − x̄ = bxy (y − ȳ)
where bxy = regression coefficient of x on y
    = Cov(X, Y)/Var(Y) = Cov(X, Y)/σy²
    = r σx/σy
    = [n Σxy − (Σx)(Σy)] / [n Σy² − (Σy)²]   · · · direct method

Obtuse angle between the regression lines: θ = tan⁻¹{ (r² − 1) σx σy / [(σx² + σy²) |r|] }

(iii) r² = bxy byx, i.e., r = ±√(bxy byx), and r has the same sign as that of bxy and byx.

(iv) Regression coefficients are independent of change of origin but not of scale: if u = (x − a)/h, v = (y − b)/k, then
bxy = (h/k) buv and byx = (k/h) bvu.

Regression curves:
X̄ = E[X / Y = y]
Ȳ = E[Y / X = x]
Example 2.41. The following table gives the age X (in years) of cars and the annual maintenance cost Y (in 100 Rs.):
X 1 3 5 7 9
Y 15 18 21 23 22
Find
(a) regression equations
(b) correlation coefficient
(c) Estimate the maintenance cost for a 4 year old car.
Solution :
(a) W.K.T. regression equations, i.e.,
Regression equation of Y on X is
(y − y) = byx (x − x) (1)
Regression equation of X on Y is
(x − x) = b xy (y − y) (2)
where
#    X    Y     XY     X²     Y²
1    1    15    15     1      225
2    3    18    54     9      324
3    5    21    105    25     441
4    7    23    161    49     529
5    9    22    198    81     484
     ΣX = 25   ΣY = 99   ΣXY = 533   ΣX² = 165   ΣY² = 2003

n = 5
x̄ = Σx/n = 25/5 = 5
ȳ = Σy/n = 99/5 = 19.8
byx = [5(533) − (25)(99)] / [5(165) − (25)²] = 0.95
bxy = [5(533) − (25)(99)] / [5(2003) − (99)²] = 0.8878

∴ Regression equation of Y on X, from (1), is
(y − 19.8) = 0.95(x − 5) ⇒ y = 0.95x + 15.05   (3)
∴ Regression equation of X on Y, from (2), is
(x − 5) = 0.8878(y − 19.8) ⇒ x = 0.8878y − 12.5784   (4)

(b) Correlation Coefficient r :
r = ±√(byx · bxy) = ±√(0.8878 × 0.95)
∴ r = 0.9183 (positive, since both regression coefficients are positive)

(c) Cost (Y) for a 4 year old (X) car :
i.e., we have to use the regression equation of Y on X:
∴ (3) ⇒ y = 0.95x + 15.05 = 0.95(4) + 15.05
∴ y = 18.85
i.e., the cost for a 4 year old car is Rs. 1885.
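The direct-method formulas for Example 2.41 translate line-for-line into Python (a sketch; variable names are mine):

```python
# Regression coefficients and the fitted estimate for Example 2.41.
X = [1, 3, 5, 7, 9]
Y = [15, 18, 21, 23, 22]
n = len(X)

sx, sy = sum(X), sum(Y)
sxy = sum(x * y for x, y in zip(X, Y))
sxx = sum(x * x for x in X)
syy = sum(y * y for y in Y)

byx = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # slope of Y on X
bxy = (n * sxy - sx * sy) / (n * syy - sy ** 2)   # slope of X on Y
r = (byx * bxy) ** 0.5                            # both slopes positive here

mx, my = sx / n, sy / n
est = my + byx * (4 - mx)                         # Y estimated at age x = 4
print(round(byx, 4), round(bxy, 4))  # 0.95 0.8879
print(round(est, 2))                 # 18.85
```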
8x − 10y + 66 = 0
40x − 18y − 214 = 0
Find
(a) mean values of X and Y (b) r (X, Y)
Solution : Given
8x − 10y + 66 = 0 (1)
40x − 18y − 214 = 0 (2)
(a) Since both the lines of regression pass through the point (x, y), where
(x) and (y) are mean values of x and y, which satisfies the two given
lines of regression.
Find the mean value point x, y by solving equations (1) and (2) : i.e.,
On solving,
x = 13
y = 17
(b) r(x, y) :
First attempt (treating (1) as the line of X on Y and (2) as the line of Y on X):
(1) ⇒ 8x = 10y − 66 ⇒ x = (10/8)y − 66/8 ⇒ bxy = 10/8
(2) ⇒ 18y = 40x − 214 ⇒ y = (40/18)x − 214/18 ⇒ byx = 40/18
r = ±√(bxy · byx) = ±√[(10/8)(40/18)] = ±2.777, which is impossible.

Second attempt (treating (1) as the line of Y on X and (2) as the line of X on Y):
(1) ⇒ 10y = 8x + 66 ⇒ y = (8/10)x + 66/10 ⇒ byx = 8/10
(2) ⇒ 40x = 18y + 214 ⇒ x = (18/40)y + 214/40 ⇒ bxy = 18/40
r = ±√(bxy · byx) = ±√[(18/40)(8/10)] = ±0.6

W.K.T. |r| ≤ 1, and r has the sign of the regression coefficients (both positive here).
∴ r = 0.6
4x − 5y + 33 = 0   (A)
20x − 9y = 107   (B)
Since both the lines of regression pass through the mean values x̄ and ȳ, the point (x̄, ȳ) must satisfy the two given regression lines.
4x̄ − 5ȳ = −33   (1)
20x̄ − 9ȳ = 107   (2)
On solving, x̄ = 13, ȳ = 17.
r² = bxy byx = (9/20)(4/5) = 9/25
r = 3/5 = 0.6
Example 2.44. If y = 2x − 3 and y = 5x + 7 are the two regression lines, find the mean values of x and y. Find the correlation coefficient between x and y. Find an estimate of x when y = 1.
Solution : {Mean values x̄ = −10/3, ȳ = −29/3; r = √(2/5); for y = 1, x = −6/5}
Solution :
(i)
Marginal pdf of X :
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀² (1/3)(x + y) dy = (1/3)[xy + y²/2]₀² = (1/3)[2x + 2]
∴ f(x) = (2/3)[x + 1], 0 ≤ x ≤ 1

Marginal pdf of Y :
f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (1/3)(x + y) dx = (1/3)[x²/2 + xy]₀¹ = (1/3)[1/2 + y]
∴ f(y) = (1/6)[1 + 2y], 0 ≤ y ≤ 2

E[X] = ∫ x f(x) dx = (2/3)∫₀¹ (x² + x) dx = (2/3)[x³/3 + x²/2]₀¹ = (2/3)[1/3 + 1/2] = 10/18
∴ E[X] = 5/9

E[Y] = ∫ y f(y) dy = (1/6)∫₀² (y + 2y²) dy = (1/6)[y²/2 + 2y³/3]₀² = (1/6)[2 + 16/3]
∴ E[Y] = 11/9
E[XY] = ∫∫ xy f(x, y) dx dy (or) ∫∫ xy f(x, y) dy dx
      = (1/3)∫₀¹ ∫₀² (x²y + xy²) dy dx
      = (1/3)∫₀¹ [x² y²/2 + x y³/3]₀² dx
      = (1/3)∫₀¹ (2x² + (8/3)x) dx
      = (1/3)[2x³/3 + (8/3)(x²/2)]₀¹
      = (1/3)[2/3 + 4/3] = (1/3)(6/3)
∴ E[XY] = 2/3

Cov[X, Y] = E[XY] − E[X] · E[Y] = 2/3 − (5/9)(11/9) = 2/3 − 55/81
Cov[X, Y] = −1/81

E[X²] = (2/3)∫₀¹ (x³ + x²) dx = (2/3)[x⁴/4 + x³/3]₀¹ = (2/3)[(1/4 + 1/3) − (0 + 0)] = 7/18
E[Y²] = (1/6)∫₀² (y² + 2y³) dy = (1/6)[y³/3 + 2y⁴/4]₀² = (1/6)[(8/3 + 8) − (0 + 0)] = 16/9
Correlation co-efficient is
r(X, Y) = Cov(x, y)/(σx σy)
with Var(X) = E[X²] − (E[X])² = 7/18 − 25/81 = 13/162 and Var(Y) = E[Y²] − (E[Y])² = 16/9 − 121/81 = 23/81:
r(X, Y) = (−1/81) / [√(13/162) √(23/81)]
r(x, y) = −√(2/299)

Regression lines:
X − E(X) = r(σX/σY)[Y − E(Y)] :
X − 5/9 = −(1/23)(Y − 11/9) ⇒ X = −(1/23)Y + 14/23
Y − E(Y) = r(σY/σX)[X − E(X)] :
Y − 11/9 = −(2/13)(X − 5/9) ⇒ Y = −(2/13)X + 17/13
W.K.T.
The conditional probability of X on Y is f(x/y) = f(x, y)/f(y) = [(x + y)/3] / [(1 + 2y)/6] = 2(x + y)/(1 + 2y)
The conditional probability of Y on X is f(y/x) = f(x, y)/f(x) = [(x + y)/3] / [2(x + 1)/3] = (x + y)/(2(1 + x))

X̄ = E[X / Y = y] = ∫₀¹ x f(x/y) dx
  = ∫₀¹ x · 2(x + y)/(1 + 2y) dx
  = [2/(1 + 2y)] ∫₀¹ (x² + xy) dx
  = [2/(1 + 2y)] [x³/3 + x²y/2]₀¹
  = [2/(1 + 2y)] [1/3 + y/2]
∴ x̄ = (2 + 3y)/(3(1 + 2y))

Ȳ = E[Y / X = x] = ∫₀² y f(y/x) dy
  = ∫₀² y (x + y)/(2(1 + x)) dy
  = [1/(2(1 + x))] ∫₀² (xy + y²) dy
  = [1/(2(1 + x))] [xy²/2 + y³/3]₀²
  = [1/(2(1 + x))] [(2x + 8/3) − (0 + 0)]
∴ ȳ = (3x + 4)/(3(1 + x))
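The exact value r = −√(2/299) can be verified numerically; the sketch below (helper names mine) integrates f(x, y) = (x + y)/3 over its rectangle by the midpoint rule:

```python
import math

# Numeric check for f(x, y) = (x + y)/3 on 0<=x<=1, 0<=y<=2.
def E(g, n=400):
    hx, hy = 1 / n, 2 / n
    s = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        for j in range(n):
            y = (j + 0.5) * hy
            s += g(x, y) * (x + y) / 3 * hx * hy
    return s

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - ex * ey
vx = E(lambda x, y: x * x) - ex ** 2
vy = E(lambda x, y: y * y) - ey ** 2
r = cov / math.sqrt(vx * vy)
print(round(r, 4), round(-math.sqrt(2 / 299), 4))  # both ~ -0.0818
```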
[Figure: a region of points (x, y) in the XY-plane is mapped by u = f1(x, y), v = f2(x, y) to a region of points (u, v) in the UV-plane.]

f(x, y) = joint pdf of X & Y = fXY(x, y);           g(u, v) = joint pdf of U & V = gUV(u, v)
f(x) = marginal pdf of X = ∫_{−∞}^{∞} f(x, y) dy = fX(x);   g(u) = marginal pdf of U = ∫_{−∞}^{∞} g(u, v) dv = gU(u)
f(y) = marginal pdf of Y = ∫_{−∞}^{∞} f(x, y) dx = fY(y);   g(v) = marginal pdf of V = ∫_{−∞}^{∞} g(u, v) du = gV(v)
Let U = f1(x, y) and V = f2(x, y).
Find x and y in terms of u and v.
Define g(u, v) = f(x, y)|J|, where J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v |
(or) g(u, v) = f(x, y)/|J′|, where J′ = ∂(u, v)/∂(x, y) = | ∂u/∂x  ∂u/∂y ; ∂v/∂x  ∂v/∂y |
Find the range space of u and v.
P.D.F. of U : g(u) = ∫_{−∞}^{∞} g(u, v) dv
P.D.F. of V : g(v) = ∫_{−∞}^{∞} g(u, v) du
Note : On obtaining the joint p.d.f. fUV(u, v) we can find the marginal p.d.f.s of U and V as
fU(u) = f(u) = ∫_{−∞}^{∞} fUV(u, v) dv
fV(v) = f(v) = ∫_{−∞}^{∞} fUV(u, v) du

(b) Theorem : If X and Y are independent continuous r.v.s then the p.d.f. of U = X + Y is given by
f(u) = ∫_{−∞}^{∞} fX(v) fY(u − v) dv

(c) Working rule for finding fUV(u, v) given the j.p.d.f. fXY(x, y), in the following steps:
(i) Find out the joint p.d.f. of (X, Y) if not already given.
(ii) Consider the new random variables u = f1(x, y), v = f2(x, y) and rearrange them to express x and y in terms of u and v, to get x = g1(u, v), y = g2(u, v).
(iii) Find the Jacobian J = ∂(x, y)/∂(u, v) = | xu  xv ; yu  yv | and hence the value of |J|.
(iv) Find the j.p.d.f. fUV(u, v) using the formula fUV(u, v) = fXY(x, y)|J| and express it in terms of u and v.
Let U = (X + Y)/2.   (2)
Define a new variable V = X.   (3)
(3) ⇒ v = x ⇒ ∴ x = v   (4)
∴ (2) ⇒ u = (x + y)/2 ⇒ 2u = v + y ⇒ ∴ y = 2u − v   (5)

[Figure: in the uv-plane the range space is 0 < v < 2u, the region between v = 0 and the line v = 2u.]

∴ Pdf of U is
g(u) = ∫_{−∞}^{∞} g(u, v) dv
     = ∫₀^{2u} 2e^{−2u} dv
     = 2e^{−2u} · 2u
     = 4u e^{−2u}, u > 0
Given U = X − Y.   (2)
Define a new variable V = X.   (3)
(3) ⇒ v = x ⇒ ∴ x = v   (4)
∴ (2) ⇒ u = x − y ⇒ u = v − y ⇒ ∴ y = v − u   (5)
Now, joint pdf of (U, V) :
g(u, v) = f(x, y)|J|   (6)
where J = | 0  1 ; −1  1 | = 1 ⇒ |J| = |1| = 1
∴ g(u, v) = e^{−(x+y)} · 1 = e^{−(v + v − u)}
g(u, v) = e^{u} · e^{−2v}

To find the range space of u and v: x ≥ 0 ⇒ v ≥ 0, and y ≥ 0 ⇒ v − u ≥ 0 ⇒ v ≥ u.
∴ the region splits into R1 = {u < 0, v > 0} and R2 = {u ≥ 0, v ≥ u}.

∴ Pdf of U is g(u) = ∫_{−∞}^{∞} g(u, v) dv   (7)

In R1 :                                    In R2 :
∴ (7) ⇒ g(u) = e^u ∫₀^∞ e^{−2v} dv         ∴ (7) ⇒ g(u) = e^u ∫ᵤ^∞ e^{−2v} dv
            = e^u [e^{−2v}/(−2)]₀^∞                    = e^u [e^{−2v}/(−2)]ᵤ^∞
            = e^u [(0) − 1/(−2)]                       = e^u [(0) − e^{−2u}/(−2)]
            = e^u/2, u < 0                             = e^{−u}/2, u ≥ 0

∴ The required pdf of U :
g(u) = { e^u/2, u < 0
       { e^{−u}/2, u ≥ 0
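The two-sided (Laplace) density just obtained can be checked by simulation; this is a Monte Carlo sketch (seed, sample size and names are my choices) for X, Y independent Exp(1):

```python
import random, math

# U = X - Y for independent X, Y ~ Exp(1) should follow g(u) = e^{-|u|}/2.
random.seed(42)
N = 200_000
us = [random.expovariate(1.0) - random.expovariate(1.0) for _ in range(N)]

# Compare empirical probabilities with the exact CDF of g:
# G(0) = 1/2 and G(1) = 1 - e^{-1}/2 ~ 0.816.
p0 = sum(u <= 0 for u in us) / N
p1 = sum(u <= 1 for u in us) / N
print(round(p0, 3), round(p1, 3))  # ~ 0.5 and ~ 0.816
```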
For U = XY :
(3) ⇒ v = y ⇒ ∴ y = v   (4)
∴ (2) ⇒ u = xy ⇒ u = xv ⇒ ∴ x = u/v   (5)
Here |J| = 1/v, so with f(x, y) = 1, 0 < x < 1, 0 < y < 1, we get g(u, v) = 1/v on 0 < u < v < 1.
∴ Pdf of U is
g(u) = ∫_{−∞}^{∞} g(u, v) dv
     = ∫ᵤ¹ (1/v) dv
     = [log v]ᵤ¹
     = [(log 1) − (log u)]
     = −log u, 0 < u < 1

Given U = X/Y.   (2)
(3) ⇒ v = x ⇒ ∴ x = v   (4)
∴ (2) ⇒ u = x/y = v/y ⇒ ∴ y = v/u   (5)
To find the range space of u and v: 0 < x < 1 ⇒ 0 < v < 1, and 0 < y < 1 ⇒ 0 < v/u < 1 ⇒ 0 < v < u.
Example 2.51. If X and Y are independent normal random variables with mean 0 and variance σ², find the p.d.f.s of R = √(X² + Y²) and φ = tan⁻¹(Y/X).
Solution : Given X and Y are independent random variables with mean 0 and variance σ², i.e. X ∼ N(µ = 0, σ), Y ∼ N(µ = 0, σ):

f(x) = (1/(σ√(2π))) e^{−(1/2)((x−µ)/σ)²}, −∞ < x < ∞;
f(y) = (1/(σ√(2π))) e^{−(1/2)((y−µ)/σ)²}, −∞ < y < ∞; with µ = 0, σ > 0.

Joint pdf of X and Y (by independence) :
f(x, y) = f(x) · f(y) = (1/(2πσ²)) e^{−(x²+y²)/(2σ²)}, −∞ < x, y < ∞   (1)

Given
R = √(X² + Y²)   (2)
φ = tan⁻¹(Y/X)   (3)
x = R cos φ   (4)
y = R sin φ   (5)
|J| = R, so
g(R, φ) = f(x, y)|J| = (1/(2πσ²)) e^{−R²/(2σ²)} · R = (R/(2πσ²)) e^{−R²/(2σ²)}   (∵ by (1))

To find the range space of R and φ: since −∞ < x, y < ∞,
0 ≤ R < ∞ and 0 ≤ φ ≤ 2π.
∴ Pdf of R is :
g(R) = ∫_{−∞}^{∞} g(R, φ) dφ = ∫₀^{2π} (R/(2πσ²)) e^{−R²/(2σ²)} dφ
     = (R/(2πσ²)) e^{−R²/(2σ²)} [φ]₀^{2π}
     = (R/(2πσ²)) e^{−R²/(2σ²)} (2π)
∴ g(R) = (R/σ²) e^{−R²/(2σ²)}, 0 ≤ R < ∞   (the Rayleigh distribution)

∴ Pdf of φ is :
g(φ) = ∫_{−∞}^{∞} g(R, φ) dR = ∫₀^∞ (1/(2πσ²)) R e^{−R²/(2σ²)} dR
Take t = R²/(2σ²) ⇒ dt = R dR/σ²
∴ g(φ) = (1/(2πσ²)) σ² ∫₀^∞ e^{−t} dt
       = (1/(2π)) [e^{−t}/(−1)]₀^∞
       = −(1/(2π)) [e^{−∞} − e^{0}]
       = −(1/(2π)) [(0) − (1)]
∴ g(φ) = 1/(2π), 0 ≤ φ ≤ 2π
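Both conclusions of Example 2.51 — R is Rayleigh and the angle is uniform — can be checked by simulation. A sketch (σ = 2, the seed, and the use of atan2's (−π, π] range are my choices):

```python
import random, math

# For X, Y ~ N(0, sigma^2) independent, R = sqrt(X^2 + Y^2) is Rayleigh with
# mean sigma*sqrt(pi/2), and the angle is uniform, so its mean over (-pi, pi]
# should be near 0.
random.seed(1)
sigma, N = 2.0, 100_000
mean_r, mean_phi = 0.0, 0.0
for _ in range(N):
    x = random.gauss(0, sigma)
    y = random.gauss(0, sigma)
    mean_r += math.hypot(x, y) / N
    mean_phi += math.atan2(y, x) / N

print(round(mean_r, 2), round(sigma * math.sqrt(math.pi / 2), 2))  # both ~ 2.51
print(abs(mean_phi) < 0.05)  # True: the angle is spread uniformly
```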