Continuous RVs


5. Continuous Random Variables


Continuous random variables can take any value
in an interval. They are used to model physical
characteristics such as time, length, position, etc.

Examples
(i) Let X be the length of a randomly selected telephone call.
(ii) Let X be the volume of Coke in a can marketed as 12 oz.

Remarks
A continuous variable has infinite precision,
hence P(X = x) = 0 for any x.
In this case, the p.m.f. will provide no useful
information.

Definition. X is a continuous random variable if there is a function f(x) so that for any constants a and b, with a ≤ b,

P(a ≤ X ≤ b) = ∫_a^b f(x) dx        (1)

For small ε, P(a ≤ X ≤ a + ε) ≈ f(a) ε.


The function f(x) is called the probability density function (p.d.f.).
For any a,
P(X = a) = P(a ≤ X ≤ a) = ∫_a^a f(x) dx = 0.
A discrete random variable does not have a
density function, since if a is a possible value of a
discrete RV X, we have P(X = a) > 0.
Random variables can be partly continuous and
partly discrete.

The following properties follow from the axioms:

∫_{−∞}^{∞} f(x) dx = 1.
f(x) ≥ 0.

Example. For some constant c, the random variable X has probability density function

f(x) = c x^n for 0 < x < 1,  and  f(x) = 0 otherwise.

Find (a) c and (b) P(X > x) for 0 < x < 1.
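One way to work this out (a sketch of the standard calculation): normalizing the density gives c, and integrating the tail gives P(X > x).

\[
1 = \int_0^1 c\,x^n\,dx = \frac{c}{n+1} \;\Rightarrow\; c = n+1,
\qquad
P(X > x) = \int_x^1 (n+1)\,t^n\,dt = 1 - x^{n+1}.
\]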

Discussion problem. Let X be the duration of a telephone call in minutes and suppose X has p.d.f. f(x) = c e^{−x/10} for x ≥ 0. Find c, and also find the chance that the call lasts less than 5 minutes.
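The same normalization idea applies here (a sketch):

\[
1 = \int_0^\infty c\,e^{-x/10}\,dx = 10c \;\Rightarrow\; c = \frac{1}{10},
\qquad
P(X < 5) = \int_0^5 \frac{1}{10}\,e^{-x/10}\,dx = 1 - e^{-1/2} \approx 0.39.
\]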

Cumulative Distribution Function (c.d.f.)

The c.d.f. of a continuous RV is defined exactly the same as for discrete RVs:

F(x) = P(X ≤ x) = P(X ∈ (−∞, x])

Hence F(x) = ∫_{−∞}^x f(t) dt, and differentiating both sides we get

dF/dx = f(x)

Example. Suppose the lifetime, X, of a car battery has P(X > x) = 2^{−x}. Find the p.d.f. of X.

Expectation of Continuous RVs

Definition. For X a continuous RV with p.d.f. f(x),

E(X) = ∫_{−∞}^{∞} x f(x) dx

Intuitively, this comes from the discrete case by replacing Σ with ∫ and p(x_i) with f(x)dx.
We can think of a continuous distribution as being approximated by a discrete distribution on a lattice . . . , −2δ, −δ, 0, δ, 2δ, . . . for small δ.

Exercise. Let X be a continuous RV and let X_δ be a discrete RV approximating X on the lattice . . . , −2δ, −δ, 0, δ, 2δ, . . . for small δ. Sketch a p.d.f. f(x) for X and the corresponding p.m.f. p(x) for X_δ, paying attention to the scaling on the y-axis.
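A small numerical illustration of this approximation (a sketch in Python; the standard normal density is an arbitrary concrete choice for f, and δ = 0.1):

    import numpy as np

    def normal_pdf(x):
        """Standard normal density, used here as a concrete example of f(x)."""
        return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

    delta = 0.1
    lattice = np.arange(-5, 5 + delta, delta)   # lattice ..., -2*delta, -delta, 0, delta, ...
    pmf = normal_pdf(lattice) * delta           # p(x) ≈ f(x) * delta

    # The individual p.m.f. values are tiny (they scale with delta), but they
    # sum to ≈ 1, and sum of x*p(x) approximates E(X) = ∫ x f(x) dx.
    print(pmf.sum())                 # ≈ 1
    print((lattice * pmf).sum())     # ≈ 0, which is E(X) for the standard normal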

Example. Suppose X has p.d.f.

f(x) = (3/4)(1 − x²) for −1 ≤ x ≤ 1,  and  f(x) = 0 else.

Find the expected value of X.

Solution.

Note. As with discrete RVs, the expected value is the balancing point of the graph of the p.d.f., so if the p.d.f. is symmetric then the expected value is the point of symmetry. A sketch of the p.d.f. quickly determines the expected value in this case:

Example. The density function of X is

f(x) = a + b x² for 0 ≤ x ≤ 1,  and  f(x) = 0 else.

If E(X) = 3/5, find a and b.

Solution.
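Whatever values you obtain can be checked symbolically; the two defining conditions are that the density integrates to 1 and that E(X) = 3/5 (a sketch using SymPy):

    import sympy as sp

    x, a, b = sp.symbols('x a b')
    f = a + b * x**2

    # Condition 1: the density integrates to 1 over [0, 1].
    # Condition 2: E(X) = ∫ x f(x) dx = 3/5.
    eqs = [
        sp.Eq(sp.integrate(f, (x, 0, 1)), 1),
        sp.Eq(sp.integrate(x * f, (x, 0, 1)), sp.Rational(3, 5)),
    ]
    print(sp.solve(eqs, [a, b]))    # {a: 3/5, b: 6/5}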

Proposition 1. For X a non-negative continuous RV, with p.d.f. f and c.d.f. F,

E(X) = ∫_0^∞ P(X > x) dx = ∫_0^∞ (1 − F(x)) dx

Proof.
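A sketch of the usual argument, writing P(X > t) as an integral of f and swapping the order of integration:

\[
\int_0^\infty P(X > t)\,dt
= \int_0^\infty \int_t^\infty f(x)\,dx\,dt
= \int_0^\infty \left( \int_0^x dt \right) f(x)\,dx
= \int_0^\infty x\,f(x)\,dx
= E(X).
\]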

Proposition 2. For X a continuous RV with p.d.f. f and any real-valued function g,

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx

Proof.


Example. A stick of length 1 is split at a point U that is uniformly distributed over (0, 1). Determine the expected length of the piece that contains the point p, for 0 ≤ p ≤ 1.

Solution.
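The answer can also be checked by simulation. The sketch below uses the fact that the piece containing p is [U, 1] when U < p and [0, U] when U > p, and compares the Monte Carlo estimate with 1/2 + p(1 − p), which is what ∫_0^p (1 − u) du + ∫_p^1 u du evaluates to (p = 0.25 is an arbitrary choice for the check).

    import numpy as np

    rng = np.random.default_rng(0)
    p = 0.25                              # arbitrary point in (0, 1)
    u = rng.uniform(0, 1, size=1_000_000)

    # Piece containing p: [0, U] when U > p (length U), [U, 1] when U < p (length 1 - U).
    length = np.where(u > p, u, 1 - u)

    print(length.mean())                  # Monte Carlo estimate
    print(0.5 + p * (1 - p))              # 1/2 + p(1 - p) = 0.6875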


Variance of Continuous RVs

The definition is the same as for discrete RVs:

Var(X) = E[(X − E(X))²]

The basic properties are also the same:

Var(X) = E(X²) − (E(X))²
E(aX + b) = a E(X) + b
Var(aX + b) = a² Var(X)

Example. If E(X) = 1 and Var(X) = 5, find

(a) E[(2 + X)²]

(b) Var(4 + 3X)


The Uniform Distribution

Definition: X has the uniform distribution on [0, 1] (and we write X ~ Uniform[0, 1] or just X ~ U[0, 1]) if X has p.d.f.

f(x) = 1 for 0 ≤ x ≤ 1,  and  f(x) = 0 else.

This is what people usually mean when they talk of a random number between 0 and 1.
Each interval of length δ in [0, 1] has equal probability: ∫_x^{x+δ} f(y) dy = δ.
The chance of X falling into an interval is equal to the length of the interval.


Generalization: X has the uniform distribution on [a, b] (i.e. X ~ U[a, b]) if X has p.d.f.

f(x) = 1/(b − a) for a ≤ x ≤ b,  and  f(x) = 0 else.

Proposition. If Z ~ U[0, 1], then (b − a)Z + a ~ U[a, b].

Proof.


Example. Let X ~ U[a, b]. Show that

E(X) = (a + b)/2   and   Var(X) = (b − a)²/12,

(a) by standardizing (i.e., using the previous proposition);

(b) directly (by brute force).


Example. Your company must make a sealed bid for a construction project. If you succeed in winning the contract (by having the lowest bid), then you plan to pay another firm $100,000 to do the work. If you believe that the minimum bid (in thousands of dollars) of the other participating companies can be modeled as the value of a random variable that is uniformly distributed on (70, 140), how much should you bid to maximize your expected profit?

Solution.
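One way to set up the computation (a sketch, assuming the lowest competing bid B is Uniform(70, 140) and working in thousands of dollars): a bid of x with 70 ≤ x ≤ 140 wins with probability P(B > x) = (140 − x)/70, so the expected profit is (x − 100)(140 − x)/70.

    import numpy as np

    def expected_profit(x):
        """Expected profit (in $1000s) when bidding x, assuming the lowest
        competing bid is Uniform(70, 140): win with probability (140 - x)/70."""
        return (x - 100) * (140 - x) / 70

    bids = np.linspace(70, 140, 7001)
    best = bids[np.argmax(expected_profit(bids))]
    print(best, expected_profit(best))    # bid ≈ 120, expected profit ≈ 5.71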


Discussion problem. Suppose X ~ U[5, 10]. Find the probability that X² − 5X − 6 is greater than zero.

Solution.


The Standard Normal Distribution

Definition: Z has the standard normal distribution if it has p.d.f.

f(x) = (1/√(2π)) e^{−x²/2}

f(x) is symmetric about x = 0, so E(Z) = 0.
Var(Z) = 1. Check this, using integration by parts:
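Sketch of the integration by parts, taking u = x and dv = x e^{−x²/2} dx/√(2π), so that the boundary term vanishes:

\[
\mathrm{Var}(Z) = \int_{-\infty}^{\infty} x^2\,\frac{e^{-x^2/2}}{\sqrt{2\pi}}\,dx
= \left[ -x\,\frac{e^{-x^2/2}}{\sqrt{2\pi}} \right]_{-\infty}^{\infty}
+ \int_{-\infty}^{\infty} \frac{e^{-x^2/2}}{\sqrt{2\pi}}\,dx
= 0 + 1 = 1.
\]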


The Normal Distribution

Definition. If X has density

f(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²))

then X has the normal distribution, and we write X ~ N(μ, σ²).
E(X) = μ and Var(X) = σ².
If Z ~ N(0, 1) then Z is standard normal.

Proposition. Let Z ~ N(0, 1) and set X = μ + σZ for constants μ and σ. Then, X ~ N(μ, σ²).

Proof.


Exercise. Check that the standard normal density integrates to 1.

Trick. By changing to polar coordinates, show

∫_{−∞}^{∞} exp{−x²/2} dx = √(2π).

Solution.
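Sketch of the polar-coordinates trick: square the integral, regard it as a double integral over the plane, and substitute x = r cos θ, y = r sin θ.

\[
\left( \int_{-\infty}^{\infty} e^{-x^2/2}\,dx \right)^{2}
= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} e^{-(x^2+y^2)/2}\,dx\,dy
= \int_0^{2\pi}\!\int_0^{\infty} e^{-r^2/2}\,r\,dr\,d\theta
= 2\pi\left[ -e^{-r^2/2} \right]_0^{\infty} = 2\pi,
\]

so the integral itself equals √(2π).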


Calculating with the Normal Distribution

There is no closed form solution to the integral ∫_a^b (1/√(2π)) e^{−x²/2} dx, so we rely upon computers (or tables).
The c.d.f. of the standard normal distribution is

Φ(z) = ∫_{−∞}^z (1/√(2π)) e^{−x²/2} dx.

This is tabulated on page 201 of Ross.

Example. Find ∫_1^2 (1/√(2π)) e^{−x²/2} dx.

Solution.
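This is Φ(2) − Φ(1), which can be read from the table or computed directly; for instance, with SciPy:

    from scipy.stats import norm

    # ∫_1^2 (1/√(2π)) e^{−x²/2} dx = Φ(2) − Φ(1)
    print(norm.cdf(2) - norm.cdf(1))    # ≈ 0.1359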


Standardization

If we wish to find P(a ≤ X ≤ b) where X ~ N(μ, σ²), we write X = μ + σZ. Then,

P(a ≤ X ≤ b) = P((a − μ)/σ ≤ Z ≤ (b − μ)/σ) = Φ((b − μ)/σ) − Φ((a − μ)/σ)

Example. Suppose the weight of a new-born baby averages 8 lbs, with a SD of 1.5 lbs. If weights are normally distributed, what fraction of babies are between 7 and 10 pounds?

Solution.


Example continued. For what value of x does the interval [8 − x, 8 + x] include 95% of birthweights?

Solution.


Example. Suppose that the travel time from your home to your office is normally distributed with mean 40 minutes and standard deviation 7 minutes. If you want to be 95 percent certain that you will not be late for an office appointment at 1 p.m., what is the latest time that you should leave home?

Solution.
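Numerically, the question asks for the 95th percentile of the travel time T ~ N(40, 7²); for instance, with SciPy:

    from scipy.stats import norm

    # 95th percentile of the travel time T ~ N(40, 7^2)
    t = norm.ppf(0.95, loc=40, scale=7)
    print(t)    # ≈ 51.5 minutes, so leave by about 12:08 p.m.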


The Normal Approximation to the Binomial Distribution

If X ~ Binomial(n, p) and n is large enough, then X is approximately N(np, np(1 − p)).
Rule of thumb: this approximation is reasonably good for np(1 − p) > 10.
P(X = k) ≈ P(k − 1/2 < Y < k + 1/2) where Y ~ N(np, np(1 − p)).
Note: P(X ≤ k) is usually approximated by P(Y < k + 1/2).
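A quick numerical illustration of the continuity correction (the values n = 100, p = 0.5 and k = 55 are arbitrary choices satisfying the rule of thumb):

    import numpy as np
    from scipy.stats import binom, norm

    n, p = 100, 0.5                       # np(1 - p) = 25 > 10
    mu, sigma = n * p, np.sqrt(n * p * (1 - p))

    k = 55
    print(binom.pmf(k, n, p))                                           # exact P(X = 55)
    print(norm.cdf(k + 0.5, mu, sigma) - norm.cdf(k - 0.5, mu, sigma))  # approximation with continuity correction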


Example. I toss 1000 coins. Find the chance (approximately) that the number of heads is between 475 and 525, inclusive.

Solution.


Example. The ideal size of a first-year class at a particular college is 150 students. The college, knowing from past experience that on the average only 30 percent of those accepted for admission will actually attend, uses a policy of approving the applications of 450 students. Compute the probability that more than 150 first-year students attend this college.

Solution.


The Exponential Distribution

Definition. X has the exponential distribution with parameter λ if it has density

f(x) = λ e^{−λx} for x ≥ 0,  and  f(x) = 0 for x < 0.

We write X ~ Exponential(λ).
E(X) = 1/λ and Var(X) = 1/λ².
F(x) = 1 − e^{−λx} for x ≥ 0.


Example. The amount of time, in hours, that a computer functions before breaking down is a continuous random variable with probability density function given by

f(x) = λ e^{−x/100} for x ≥ 0,  and  f(x) = 0 for x < 0.

Find the probability that
(a) the computer will break down within the first 100 hours;
(b) given that it is still working after 100 hours, it breaks down within the next 100 hours.

Solution.


The Memoryless Property of Exponential RVs

The exponential distribution is the continuous analogue of the geometric distribution (one has an exponentially decaying p.m.f., the other an exponentially decaying p.d.f.).
Suppose that X ~ Exponential(λ). Then

P(X > t + s | X > t) = e^{−λs} = P(X > s).

Check this:
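Using P(X > x) = e^{−λx} and the definition of conditional probability:

\[
P(X > t+s \mid X > t)
= \frac{P(X > t+s,\; X > t)}{P(X > t)}
= \frac{P(X > t+s)}{P(X > t)}
= \frac{e^{-\lambda(t+s)}}{e^{-\lambda t}}
= e^{-\lambda s} = P(X > s).
\]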

This is an analog for continuous random variables of the memoryless property that we saw for the geometric distribution.


Example. At a certain bank, the amount of time that a customer spends being served by a teller is an exponential random variable with mean 5 minutes. If there is a customer in service when you enter the bank, what is the probability that he or she will still be with the teller after an additional 4 minutes?

Solution.


Example. Suppose X ~ U[0, 1] and Y = −ln(X) (so Y > 0). Find the p.d.f. of Y.

Solution.

Note. This gives a convenient way to simulate exponential random variables.
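A sketch of that simulation recipe: if U ~ U[0, 1], then −ln(U)/λ has the Exponential(λ) distribution (the example above is the case λ = 1; λ = 2 below is an arbitrary choice for the check).

    import numpy as np

    rng = np.random.default_rng(1)
    lam = 2.0
    u = rng.uniform(0, 1, size=1_000_000)
    y = -np.log(u) / lam              # inverse-c.d.f. method: Y ~ Exponential(lam)

    print(y.mean(), 1 / lam)          # sample mean vs E(Y) = 1/λ
    print(y.var(), 1 / lam**2)        # sample variance vs Var(Y) = 1/λ²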

Calculations similar to the previous example are required whenever we want to find the distribution of a function of a random variable.

Example. Suppose X has p.d.f. f_X(x) and Y = aX + b for constants a and b. Find the p.d.f. f_Y(y) of Y.

Solution.
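A sketch of the general approach, written for a > 0 (the case a < 0 flips the inequality and is what produces the |a|): compute the c.d.f. of Y and differentiate.

\[
F_Y(y) = P(aX + b \le y) = P\!\left(X \le \frac{y-b}{a}\right) = F_X\!\left(\frac{y-b}{a}\right),
\qquad
f_Y(y) = \frac{d}{dy}F_Y(y) = \frac{1}{|a|}\,f_X\!\left(\frac{y-b}{a}\right).
\]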


Example. A stick of length 1 is split at a point U that is uniformly distributed over (0, 1). Determine the p.d.f. of the longer piece.

Solution.

