Tutorial 11: Expectation and Variance of Linear Combinations of Random Variables


Fact 1:
For a random variable X:

a) E[aX + b] = aE[X] + b

b) Var[aX + b] = a^2 Var[X]
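As a quick sanity check of Fact 1, here is a minimal Monte Carlo sketch (assuming numpy; the constants a, b and the choice of an exponential distribution are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
a, b = 3.0, -2.0                                 # arbitrary constants
x = rng.exponential(scale=2.0, size=1_000_000)   # any distribution works

# Fact 1a: E[aX + b] = aE[X] + b
print(np.mean(a * x + b), a * np.mean(x) + b)

# Fact 1b: Var[aX + b] = a^2 Var[X]  (the shift b drops out)
print(np.var(a * x + b), a**2 * np.var(x))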

Fact 2:
For random variables X1, X2, . . . , Xn:

a) The following equation holds for arbitrary random variables X1, X2, . . . , Xn:

E[X1 + X2 + . . . + Xn] = E[X1] + E[X2] + . . . + E[Xn]

b) If X1, X2, . . . , Xn are independent, then

Var[X1 + X2 + . . . + Xn] = Var[X1] + Var[X2] + . . . + Var[Xn]
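A simulation sketch of Fact 2 (again assuming numpy). Note that part (a) needs no independence, so the expectation check below deliberately uses correlated summands:

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Part (a): expectations add even for dependent variables.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)   # deliberately correlated with x1
print(np.mean(x1 + x2), np.mean(x1) + np.mean(x2))

# Part (b): variances add under independence.
y1 = rng.uniform(size=n)
y2 = rng.gamma(2.0, size=n)          # independent of y1
print(np.var(y1 + y2), np.var(y1) + np.var(y2))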

Fact 1 + Fact 2 ⇒ Fact 3:

For random variables X1, X2, . . . , Xn and arbitrary constants c0, c1, . . . , cn:

a) The following equation holds for arbitrary random variables X1, X2, . . . , Xn:

E[c0 + c1 X1 + c2 X2 + . . . + cn Xn] = c0 + c1 E[X1] + c2 E[X2] + . . . + cn E[Xn]

b) If X1, X2, . . . , Xn are independent, then

Var[c0 + c1 X1 + c2 X2 + . . . + cn Xn] = c1^2 Var[X1] + c2^2 Var[X2] + . . . + cn^2 Var[Xn]

Notes: The facts hold for both continuous and discrete random variables.
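Fact 3 can likewise be spot-checked numerically (a sketch with arbitrary constants and three independent variables, assuming numpy):

import numpy as np

rng = np.random.default_rng(2)
c = [5.0, 1.5, -2.0, 0.5]                        # c0, c1, c2, c3 (arbitrary)
xs = [rng.normal(1.0, 2.0, size=1_000_000),      # three independent variables
      rng.exponential(0.5, size=1_000_000),
      rng.uniform(-1.0, 1.0, size=1_000_000)]

y = c[0] + sum(ci * xi for ci, xi in zip(c[1:], xs))

# E[c0 + Σ ci Xi] = c0 + Σ ci E[Xi]
print(np.mean(y), c[0] + sum(ci * np.mean(xi) for ci, xi in zip(c[1:], xs)))

# Var[c0 + Σ ci Xi] = Σ ci^2 Var[Xi]  (independence used here)
print(np.var(y), sum(ci**2 * np.var(xi) for ci, xi in zip(c[1:], xs)))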

Proof of Fact 1:
a) Let g(X) = aX + b. For the continuous case (the discrete case is analogous, with sums in place of integrals):

E[g(X)] = ∫ (ax + b) fX(x) dx
        = a ∫ x fX(x) dx + b ∫ fX(x) dx
        = aE[X] + b

b) Let g(X) = aX + b
Var[g(X)] = E[g(X)^2] − E[g(X)]^2
          = E[(aX + b)^2] − (aE[X] + b)^2
          = E[a^2 X^2 + 2abX + b^2] − (aE[X] + b)^2
          = a^2 E[X^2] + 2abE[X] + b^2 − (a^2 E[X]^2 + 2abE[X] + b^2)
          = a^2 (E[X^2] − E[X]^2)
          = a^2 Var[X]

Proof of Fact 2:
a) Prove by induction.

• First prove, for two arbitrary random variables X, Y (note we don’t make an independence
  assumption here), that E[X + Y] = E[X] + E[Y].
  Let f(x, y) denote the joint probability density function of X and Y.
E[X + Y] = ∫∫ (x + y) f(x, y) dx dy
         = ∫ x (∫ f(x, y) dy) dx + ∫ y (∫ f(x, y) dx) dy
         = ∫ x fX(x) dx + ∫ y fY(y) dy
         = E[X] + E[Y]

• Suppose E[Σ_{i=1}^{k−1} Xi] = Σ_{i=1}^{k−1} E[Xi]. Define the random variable Yk−1 = Σ_{i=1}^{k−1} Xi; then

E[Σ_{i=1}^{k} Xi] = E[Yk−1 + Xk]
                  = E[Yk−1] + E[Xk]
                  = Σ_{i=1}^{k−1} E[Xi] + E[Xk]
                  = Σ_{i=1}^{k} E[Xi]

b) Prove by induction in the same way. Independence is what makes the base case work: for independent X and Y the cross term Cov[X, Y] = E[XY] − E[X]E[Y] vanishes, so Var[X + Y] = Var[X] + Var[Y]. A numerical counterexample with dependent variables is sketched below.
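To see why independence matters in part (b), here is a small numerical counterexample (a sketch assuming numpy): for correlated X and Y, Var[X + Y] exceeds Var[X] + Var[Y] by 2 Cov[X, Y].

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=1_000_000)
y = x + rng.normal(size=1_000_000)   # correlated with x: Cov[X, Y] = 1

print(np.var(x + y))                                   # ≈ 5
print(np.var(x) + np.var(y))                           # ≈ 3, not equal
print(np.var(x) + np.var(y) + 2 * np.cov(x, y)[0, 1])  # ≈ 5, matches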

Problems:

a) Xi, i = 1, . . . , n are independent normal variables with respective parameters µi and σi^2; then X = Σ_{i=1}^n Xi is normally distributed. Show that the expectation of X is Σ_{i=1}^n µi and the variance is Σ_{i=1}^n σi^2.

b) A random variable X with the gamma distribution with parameters (n, λ), n ∈ N, λ > 0, can be expressed as the sum of n independent exponential random variables with the same parameter λ: X = Σ_{i=1}^n Xi. Calculate the expectation and variance of the gamma random variable X.

c) A random variable X has the χ²_n distribution if it can be expressed as the sum of squares of n independent standard normal random variables: X = Σ_{i=1}^n Xi^2, where the Xi are independent standard normal random variables. Calculate the expectation of the random variable X.

d) Xi, i = 1, . . . , n are independent uniform variables over the interval (0, 1). Calculate the expectation and variance of the random variable X = (1/n) Σ_{i=1}^n Xi.

Solutions:

a) Apply Fact 2: E[X] = Σ_{i=1}^n E[Xi] = Σ_{i=1}^n µi, and since the Xi are independent, Var[X] = Σ_{i=1}^n Var[Xi] = Σ_{i=1}^n σi^2.
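(A quick check of (a) by simulation, with the µi and σi chosen arbitrarily, assuming numpy:)

import numpy as np

rng = np.random.default_rng(4)
mu = np.array([1.0, -2.0, 0.5])      # illustrative µi
sigma = np.array([0.5, 1.0, 2.0])    # illustrative σi
x = rng.normal(mu, sigma, size=(1_000_000, 3)).sum(axis=1)

print(np.mean(x), mu.sum())          # Σ µi
print(np.var(x), (sigma**2).sum())   # Σ σi^2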

b) E[X] = Σ_{i=1}^n E[Xi] = n/λ

Var[X] = Σ_{i=1}^n Var[Xi] = n/λ^2

(using E[Xi] = 1/λ and Var[Xi] = 1/λ^2 for an exponential variable with parameter λ)
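(A simulation sketch of (b), with illustrative n and λ, assuming numpy:)

import numpy as np

rng = np.random.default_rng(5)
n, lam = 5, 2.0                      # illustrative parameters
# Gamma(n, λ) as a sum of n independent Exponential(λ) variables.
x = rng.exponential(scale=1.0 / lam, size=(1_000_000, n)).sum(axis=1)

print(np.mean(x), n / lam)           # E[X] = n/λ
print(np.var(x), n / lam**2)         # Var[X] = n/λ^2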

c) Since E[Xi] = 0 and Var[Xi] = 1, E[Xi^2] = Var[Xi] = 1 (recall Var[X] = E[X^2] − E[X]^2).

E[X] = Σ_{i=1}^n E[Xi^2] = n
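(A simulation sketch of (c), with an illustrative n, assuming numpy:)

import numpy as np

rng = np.random.default_rng(6)
n = 4                                # illustrative degrees of freedom
# χ²_n as a sum of squares of n independent standard normals.
x = (rng.normal(size=(1_000_000, n)) ** 2).sum(axis=1)

print(np.mean(x), n)                 # E[X] = n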

d) E[X] = (1/n) Σ_{i=1}^n E[Xi] = (1/n) · (n/2) = 1/2

Var[X] = (1/n^2) Σ_{i=1}^n Var[Xi] = (1/n^2) · n · (1/12) = 1/(12n)
We can see the variance diminishes as n → ∞.
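(A simulation sketch of (d) for several values of n, assuming numpy, showing the variance shrinking like 1/(12n):)

import numpy as np

rng = np.random.default_rng(7)
for n in (10, 100, 1000):
    # Sample mean of n independent Uniform(0, 1) variables.
    x = rng.uniform(size=(100_000, n)).mean(axis=1)
    print(n, np.mean(x), np.var(x), 1 / (12 * n))   # mean ≈ 1/2, var ≈ 1/(12n)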
