Chapter 5.3 PDF
UP School of Statistics
Consider the following experiment. We begin with a line segment t units long and a
fixed number n of darts and throw darts at the line segment in such a way that each
dart’s position upon landing is uniformly distributed along the segment, independent of
the location of the other darts. Let U1 be the position of the first dart thrown, U2 be
the position of the second, and so on up to Un .
Now let W1 ≤ W2 ≤ . . . ≤ Wn denote the same positions, not in the order in which
the darts were thrown, but instead in the order in which they appear along the line.
The figure below depicts a typical relation between U1 , . . . , Un and W1 , . . . , Wn .
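To make the construction concrete, here is a small Python sketch (not part of the original notes; the function name is illustrative) that throws n uniform darts on (0, t] and sorts them into W1 ≤ . . . ≤ Wn:

```python
import random

def throw_darts(n, t, seed=0):
    """Throw n darts uniformly on (0, t]; return (throw order, sorted order)."""
    rng = random.Random(seed)
    u = [rng.uniform(0, t) for _ in range(n)]  # U1, ..., Un: positions in throw order
    w = sorted(u)                              # W1 <= ... <= Wn: positions along the line
    return u, w

u, w = throw_darts(5, 10.0)
```

Sorting is all that distinguishes the two lists: the W's are the order statistics of the U's.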
Theorem
Let W1 , W2 , . . . , Wn be the occurrence times in a Poisson process of rate λ > 0.
Conditioned on N(t) = n, the random variables W1 , W2 , . . . , Wn have the joint
probability density function

f (w1 , w2 , . . . , wn ) = n!/t^n ,   for 0 < w1 < w2 < · · · < wn ≤ t.
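The theorem says that, given N(t) = n, the occurrence times are distributed as the order statistics of n independent uniforms on (0, t]. One way to check this numerically (a hypothetical Python sketch; the helper name and parameter values are ours) is to simulate the process, keep only runs with N(t) = n, and compare the average of W1 to t/(n+1), the mean of the minimum of n independent uniforms:

```python
import random

def poisson_times_given_n(lam, t, n, rng):
    """Simulate occurrence times of a rate-lam Poisson process on (0, t],
    repeating until a run has exactly N(t) = n events (rejection sampling)."""
    while True:
        times, s = [], rng.expovariate(lam)
        while s <= t:
            times.append(s)
            s += rng.expovariate(lam)
        if len(times) == n:
            return times

# Illustrative parameters, not from the notes.
rng = random.Random(1)
lam, t, n, reps = 2.0, 1.0, 3, 20000
mean_w1 = sum(poisson_times_given_n(lam, t, n, rng)[0] for _ in range(reps)) / reps
# Under the theorem, W1 is the minimum of n uniforms on (0, t], so E[W1] = t/(n+1).
```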
The above theorem has important applications in evaluating certain functionals of Poisson processes.
Some sample instances follow.
M = E[ Σ_{k=1}^{X(t)} e^{−β Wk} ]
Viewing a fixed mass of a certain radioactive material, suppose that alpha particles
appear in time according to a Poisson process of intensity λ. Each particle exists for a
random duration and is then annihilated. Suppose that the successive lifetimes
Y1 , Y2 , . . . of distinct particles are independent random variables having the common
distribution function G (y ) = P(Yk ≤ y ). Let M(t) count the number of alpha
particles existing at time t. The process is depicted below:
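The process is easy to simulate. Below is a hedged Python sketch (the helper name is ours, and we assume exponential lifetimes, G(y) = 1 − e^{−μy}, purely for concreteness) that draws Poisson creation times on (0, t], attaches a lifetime to each particle, and counts how many are still alive at time t. Averaging M(t) over many runs should give approximately λ∫_0^t [1 − G(z)] dz, consistent with the binomial analysis carried out in this section:

```python
import random

def alive_at_t(lam, mu, t, rng):
    """One realization: Poisson(lam) creation times on (0, t], exponential(mu)
    lifetimes (an assumed G, for concreteness); returns (X(t), M(t))."""
    creation, s = [], rng.expovariate(lam)
    while s <= t:
        creation.append(s)
        s += rng.expovariate(lam)
    # Particle k still exists at time t iff Wk + Yk >= t.
    m = sum(wk + rng.expovariate(mu) >= t for wk in creation)
    return len(creation), m

# Illustrative parameters: lam=2, mu=1, t=5 gives E[M(t)] = 2(1 - e^{-5}) ≈ 1.99.
rng = random.Random(3)
lam, mu, t, reps = 2.0, 1.0, 5.0, 20000
draws = [alive_at_t(lam, mu, t, rng) for _ in range(reps)]
mean_m = sum(m for _, m in draws) / reps
```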
We are interested in the probability distribution of M(t) under the condition that
M(0) = 0. Let X (t) be the number of particles created up to time t. Note that the
number of existing particles cannot exceed the number of particles created, i.e.,
M(t) ≤ X (t). Condition on X (t) = n and let W1 , W2 , . . . , Wn ≤ t be the times of
particle creation. Then, particle k exists at time t if and only if Wk + Yk ≥ t. Let
             { 1  if Wk + Yk ≥ t
I{Wk + Yk ≥ t} =
             { 0  if Wk + Yk < t
Thus, we have
P(M(t) = m | X(t) = n) = P{ Σ_{k=1}^{n} I{Wk + Yk ≥ t} = m | X(t) = n }

                       = P{ Σ_{k=1}^{n} I{Uk + Yk ≥ t} = m },
where U1 , U2 , . . . , Un are independent and uniformly distributed on (0, t]. The random
variable Σ_{k=1}^{n} I{Uk + Yk ≥ t} is clearly a binomial random variable with
p = P{Uk + Yk ≥ t} = (1/t) ∫_0^t P{Yk ≥ t − u} du

                   = (1/t) ∫_0^t [1 − G (t − u)] du

                   = (1/t) ∫_0^t [1 − G (z)] dz.
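For exponential lifetimes (an assumed G, chosen for concreteness) the integral has a closed form, p = (1 − e^{−μt})/(μt), and the binomial conclusion can be checked by simulation. The Python sketch below (function name and parameters are illustrative, not from the notes) compares the empirical distribution of Σ I{Uk + Yk ≥ t} against the Binomial(n, p) pmf:

```python
import math
import random

def m_given_n(n, mu, t, rng):
    """One draw of sum_{k=1}^n I{Uk + Yk >= t}, Uk ~ Uniform(0, t], Yk ~ Exp(mu)."""
    return sum(rng.uniform(0, t) + rng.expovariate(mu) >= t for _ in range(n))

# Illustrative parameters.
rng = random.Random(2)
n, mu, t, reps = 10, 1.0, 2.0, 50000

# Closed form of p = (1/t) * integral_0^t [1 - G(z)] dz when G(z) = 1 - e^{-mu z}.
p = (1.0 - math.exp(-mu * t)) / (mu * t)

counts = [0] * (n + 1)
for _ in range(reps):
    counts[m_given_n(n, mu, t, rng)] += 1
emp = [c / reps for c in counts]                                  # empirical pmf
binom = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
```

Each empirical probability should land within Monte Carlo error of the corresponding binomial probability.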