Expected Value and Variance
Example Letting X denote the random variable that is defined as the sum of two fair dice, then

P{X = 2} = P{(1, 1)} = 1/36,
P{X = 3} = P{(1, 2), (2, 1)} = 2/36,
P{X = 4} = P{(1, 3), (2, 2), (3, 1)} = 3/36,
P{X = 5} = P{(1, 4), (2, 3), (3, 2), (4, 1)} = 4/36,
P{X = 6} = P{(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)} = 5/36,
P{X = 7} = P{(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)} = 6/36,
P{X = 8} = P{(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)} = 5/36,
P{X = 9} = P{(3, 6), (4, 5), (5, 4), (6, 3)} = 4/36,
P{X = 10} = P{(4, 6), (5, 5), (6, 4)} = 3/36,
P{X = 11} = P{(5, 6), (6, 5)} = 2/36,
P{X = 12} = P{(6, 6)} = 1/36.
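The enumeration above can be reproduced programmatically. This short Python sketch (illustrative, not from the text) tallies the 36 equally likely ordered outcomes with exact fractions:

```python
from fractions import Fraction
from collections import Counter

# Tally the probability of each possible sum of two fair dice,
# treating all 36 ordered outcomes (a, b) as equally likely.
pmf = Counter()
for a in range(1, 7):
    for b in range(1, 7):
        pmf[a + b] += Fraction(1, 36)

for s in sorted(pmf):
    print(s, pmf[s])
```

Running it reproduces the probabilities above: P{X = 7} = 6/36, P{X = 2} = P{X = 12} = 1/36, and the probabilities sum to 1.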
For any number x, F(x) is the probability that the observed value of X will be
at most x.
EXAMPLE A store carries flash drives with either 1 GB, 2 GB, 4 GB, 8 GB, or 16 GB of memory. The accompanying table gives the distribution of Y = the amount of memory in a purchased drive:

y    1    2    4    8    16
If X is a random variable having a probability mass function p(x) in the discrete case, or a probability density function f(x) in the continuous case, then the expected value of X is defined by

E[X] = Σ_{x: p(x)>0} x p(x)          if X is discrete

E[X] = ∫_{−∞}^{∞} x f(x) dx          if X is continuous
The expected value of a random variable X, E[X], is also referred to as the mean or the first moment of X. The quantity E[X^n], n ≥ 1, is called the nth moment of X. Note that

E[X^n] = Σ_{x: p(x)>0} x^n p(x)        if X is discrete

E[X^n] = ∫_{−∞}^{∞} x^n f(x) dx        if X is continuous
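As an illustration of the discrete formula, here is a sketch using the two-dice pmf from the earlier example (the helper name `moment` is ours):

```python
from fractions import Fraction

# pmf of the sum of two fair dice: p(s) = (6 - |s - 7|)/36 for s = 2..12
pmf = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

def moment(pmf, n):
    """E[X^n] = sum of x^n * p(x) over the support (discrete case)."""
    return sum(x**n * p for x, p in pmf.items())

print(moment(pmf, 1))  # first moment (mean): 7
print(moment(pmf, 2))  # second moment: 329/6
```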
E(aX + b) = a · E(X) + b

(Or, using alternative notation, μ_{aX+b} = a · μ_X + b.)
Thus, the variance of X measures the expected square of the deviation of X from its
expected value.
Let X have pmf p(x) and expected value μ. Then the variance of X, denoted by V(X) or σ²_X, or just σ², is

V(X) = Σ_{x∈D} (x − μ)² · p(x) = E[(X − μ)²]

where D is the set of possible values of X.
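For the two-dice example, the defining sum agrees with the familiar shortcut V(X) = E[X²] − (E[X])² (a standard identity, used here as an independent check); a minimal Python sketch:

```python
from fractions import Fraction

# Two-dice pmf again: p(s) = (6 - |s - 7|)/36, s = 2..12
pmf = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}
mu = sum(x * p for x, p in pmf.items())                     # E[X] = 7

var_def = sum((x - mu)**2 * p for x, p in pmf.items())      # definition
var_short = sum(x**2 * p for x, p in pmf.items()) - mu**2   # E[X^2] - mu^2

print(var_def, var_short)  # both 35/6
```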
PROPOSITION If X ~ Bin(n, p), then E(X) = np, V(X) = np(1 − p) = npq, and σ_X = √(npq) (where q = 1 − p).
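A quick numeric sanity check of the proposition (the values n = 10 and p = 0.3 are arbitrary choices for illustration):

```python
from math import comb, sqrt

n, p = 10, 0.3
# Binomial pmf over the full support x = 0..n
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean)**2 * q for x, q in enumerate(pmf))

print(mean, n * p)           # both ~ 3.0
print(var, n * p * (1 - p))  # both ~ 2.1
print(sqrt(var))             # sigma_X = sqrt(npq)
```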
For a geometric random variable X with parameter p, P(X = i) = p(1 − p)^{i−1}, i = 1, 2, …. Writing q = 1 − p,

E[X] = Σ_{i=1}^{∞} i p q^{i−1} = p d/dq ( Σ_{i=1}^{∞} q^i ) = p d/dq ( q/(1 − q) ) = p/(1 − q)² = p/p² = 1/p

For the second moment, use P(X ≥ i) = (1 − p)^{i−1} together with the identity Σ_{i=1}^{∞} i P(X ≥ i) = E[X(X + 1)]/2:

Σ_{i=1}^{∞} i P(X ≥ i) = Σ_{i=1}^{∞} i (1 − p)^{i−1} = (1/p) Σ_{i=1}^{∞} i p(1 − p)^{i−1} = (1/p) Σ_{i=1}^{∞} i P(X = i) = (1/p) E[X] = 1/p²

so that E[X(X + 1)] = 2/p². Therefore

E[X²] = 2/p² − E[X] = 2/p² − 1/p,   Var(X) = E[X²] − (E[X])² = 2/p² − 1/p − 1/p² = (1 − p)/p²
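The closed forms 1/p and (1 − p)/p² can be checked by truncating the series numerically (p = 0.25 is an arbitrary illustrative value):

```python
# Geometric pmf: P(X = i) = p * (1 - p)**(i - 1), i = 1, 2, ...
p = 0.25
terms = range(1, 2000)  # far enough out that the remaining tail is negligible

mean = sum(i * p * (1 - p)**(i - 1) for i in terms)
ex2 = sum(i**2 * p * (1 - p)**(i - 1) for i in terms)
var = ex2 - mean**2

print(mean)  # ~ 1/p = 4
print(var)   # ~ (1 - p)/p**2 = 12
```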
DEFINITION
The Poisson distribution is a discrete probability distribution of a random
variable x that satisfies these conditions.
1. The experiment consists of counting the number of times x an event occurs
in a given interval. The interval can be an interval of time, area, or volume.
2. The probability of the event occurring is the same for each interval.
3. The number of occurrences in one interval is independent of the number of
occurrences in other intervals.
The probability of exactly x occurrences in an interval is

P(x) = μ^x e^{−μ} / x!

where e is an irrational number approximately equal to 2.71828 and μ is the mean number of occurrences per interval unit.
Since e^μ = 1 + μ + μ²/2! + μ³/3! + … = Σ_{x=0}^{∞} μ^x / x!, the Poisson probabilities sum to 1:

Σ_{x=0}^{∞} e^{−μ} μ^x / x! = e^{−μ} · e^{μ} = 1
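A direct numeric check that these probabilities sum to 1 (μ = 3 is an arbitrary illustrative mean):

```python
from math import exp, factorial

mu = 3.0
# Poisson probabilities P(X = x) = e^(-mu) * mu^x / x! for x = 0..99;
# the truncated tail beyond x = 99 is negligible.
probs = [exp(-mu) * mu**x / factorial(x) for x in range(100)]
print(sum(probs))  # ~ 1.0
```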
Suppose that in the binomial pmf b(x; n, p), we let n → ∞ and p → 0 in such a way that np approaches a value μ > 0. Then b(x; n, p) → p(x; μ).
EXAMPLE If a publisher of nontechnical books takes great pains to ensure that its books are free
of typographical errors, so that the probability of any given page containing at least one
such error is .005 and errors are independent from page to page, what is the probability
that one of its 600-page novels will contain exactly one page with errors? At most three
pages with errors?
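With the Poisson approximation suggested by the limit result above, μ = np = 600(.005) = 3, and both answers can be computed directly (a sketch; the helper name `poisson_pmf` is ours):

```python
from math import exp, factorial

mu = 600 * 0.005  # expected number of pages with errors: np = 3

def poisson_pmf(x):
    """P(X = x) for a Poisson random variable with mean mu."""
    return exp(-mu) * mu**x / factorial(x)

p_exactly_one = poisson_pmf(1)
p_at_most_three = sum(poisson_pmf(x) for x in range(4))

print(round(p_exactly_one, 3))    # 0.149
print(round(p_at_most_three, 3))  # 0.647
```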
Table

Discrete probability distribution          | Probability mass function, p(x)                  | Moment generating function, φ(t) | Mean | Variance
Binomial with parameters n, p; 0 ≤ p ≤ 1   | (n choose x) p^x (1 − p)^{n−x}, x = 0, 1, …, n   | (pe^t + (1 − p))^n               | np   | np(1 − p)
Poisson with parameter λ                   | e^{−λ} λ^x / x!, x = 0, 1, 2, …                  | exp{λ(e^t − 1)}                  | λ    | λ
Geometric with parameter p; 0 ≤ p ≤ 1      | p(1 − p)^{x−1}, x = 1, 2, …                      | pe^t / (1 − (1 − p)e^t)          | 1/p  | (1 − p)/p²
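One use of the moment generating function column: differentiating φ(t) at t = 0 recovers the mean. A finite-difference sketch for the binomial entry (n = 10 and p = 0.3 are assumed illustrative values):

```python
from math import exp

n, p, h = 10, 0.3, 1e-6

def phi(t):
    """Binomial mgf: (p*e^t + (1 - p))^n."""
    return (p * exp(t) + 1 - p)**n

# Central difference approximates phi'(0), which should equal np.
deriv = (phi(h) - phi(-h)) / (2 * h)
print(deriv)  # ~ np = 3.0
```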