Lecture 11
HYPOTHESIS TESTING: Quadratic Forms
Definition
A real quadratic form in the variables X1 , X2 , . . . , Xn is a homogeneous polynomial of degree two in X1 , X2 , . . . , Xn .
Example One
Since
$$\sum_{i=1}^{n}(X_i - \bar{X})^2 = \sum_{i=1}^{n}X_i^2 - \frac{(X_1 + X_2 + \ldots + X_n)^2}{n}$$
$$= \frac{n-1}{n}\left(X_1^2 + X_2^2 + \ldots + X_n^2\right) - \frac{2}{n}\left(X_1X_2 + X_1X_3 + \ldots + X_1X_n + \ldots + X_{n-1}X_n\right),$$
the sum of squares $\sum_{i=1}^{n}(X_i - \bar{X})^2$ is a quadratic form in $X_1, X_2, \ldots, X_n$.
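For example, with n = 2 the identity reduces to
$$\sum_{i=1}^{2}(X_i - \bar{X})^2 = \tfrac{1}{2}X_1^2 + \tfrac{1}{2}X_2^2 - X_1X_2 = \tfrac{1}{2}(X_1 - X_2)^2,$$
a homogeneous polynomial of degree two in $X_1$ and $X_2$.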
Theorem
Let Q = Q1 + Q2 + . . . + Qk where Q, Q1 , . . . , Qk are k + 1 random variables which are real quadratic forms in n mutually independent normal variables with means µ1 , µ2 , . . . , µn and common variance σ². If Q/σ², Q1 /σ², . . . , Qk−1 /σ² have chi–squared distributions with ν, ν1 , ν2 , . . . , νk−1 degrees of freedom respectively, and if Qk is non–negative, then Q1 , Q2 , . . . , Qk are mutually independent and, in particular, Qk /σ² has a chi–squared distribution with νk = ν − (ν1 + ν2 + . . . + νk−1 ) degrees of freedom.
Example Two
Let the random variable X have a normal distribution X ∼ N(µ, σ²) and let the variables Xij , i = 1, . . . , a, j = 1, . . . , b, be a random sample of size n = ab from this normal distribution.
$$abS^2 = \sum_{i=1}^{a}\sum_{j=1}^{b}(X_{ij} - \bar{X}_{..})^2$$
$$= \sum_{i=1}^{a}\sum_{j=1}^{b}\left[(X_{ij} - \bar{X}_{i.}) + (\bar{X}_{i.} - \bar{X}_{..})\right]^2$$
$$= \sum_{i=1}^{a}\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.})^2 + \sum_{i=1}^{a}\sum_{j=1}^{b}(\bar{X}_{i.} - \bar{X}_{..})^2 + 2\sum_{i=1}^{a}\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.})(\bar{X}_{i.} - \bar{X}_{..})$$
and, since $\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.}) = 0$ for each i, the cross–product term vanishes, leaving
$$abS^2 = \sum_{i=1}^{a}\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.})^2 + b\sum_{i=1}^{a}(\bar{X}_{i.} - \bar{X}_{..})^2$$
or
Q = Q1 + Q2
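As an illustrative numerical check (not part of the derivation; NumPy is assumed, and the values of a, b, µ and σ below are arbitrary), the decomposition Q = Q1 + Q2 can be verified on a simulated table:

import numpy as np

# Check that ab*S^2 = Q1 + Q2 for one simulated a-by-b table.
rng = np.random.default_rng(0)
a, b, mu, sigma = 4, 6, 10.0, 2.0            # arbitrary illustrative values
X = rng.normal(mu, sigma, size=(a, b))       # X[i, j] plays the role of X_ij

grand_mean = X.mean()                        # X-bar..
row_means = X.mean(axis=1)                   # X-bar_i.

Q  = ((X - grand_mean) ** 2).sum()                    # ab * S^2
Q1 = ((X - row_means[:, None]) ** 2).sum()            # sum_ij (X_ij - X-bar_i.)^2
Q2 = b * ((row_means - grand_mean) ** 2).sum()        # b * sum_i (X-bar_i. - X-bar..)^2

print(Q, Q1 + Q2)                            # the two values agree up to rounding error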
Also
$$\frac{Q_1}{\sigma^2} = \sum_{i=1}^{a}\frac{\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.})^2}{\sigma^2}$$
and for each value of i, $\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.})^2/b$ is the variance of a random sample of size b from the given normal distribution, so $\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.})^2/\sigma^2$ has a chi–squared distribution with b − 1 degrees of freedom. As the Xij are independent, Q1 /σ² is the sum of independent chi–squared variables and has a chi–squared distribution with a(b − 1) degrees of freedom. Since Q/σ² = abS²/σ² has a chi–squared distribution with ab − 1 degrees of freedom and Q2 is non–negative, the theorem gives that Q1 and Q2 are independent and that Q2 /σ² has a chi–squared distribution with ab − 1 − a(b − 1) = a − 1 degrees of freedom.
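These distributional claims can also be checked by simulation; the sketch below uses NumPy and SciPy (not part of the lecture), and the values of a, b, µ, σ and the number of replications are arbitrary:

import numpy as np
from scipy import stats

# Simulate many a-by-b tables and compare Q1/sigma^2 and Q2/sigma^2 with
# chi-squared distributions on a(b-1) and a-1 degrees of freedom.
rng = np.random.default_rng(1)
a, b, mu, sigma, reps = 4, 6, 10.0, 2.0, 20000
X = rng.normal(mu, sigma, size=(reps, a, b))

row_means = X.mean(axis=2, keepdims=True)             # X-bar_i. in each replication
grand_means = X.mean(axis=(1, 2), keepdims=True)      # X-bar.. in each replication

Q1 = ((X - row_means) ** 2).sum(axis=(1, 2)) / sigma**2
Q2 = (b * (row_means - grand_means) ** 2).sum(axis=(1, 2)) / sigma**2

print(stats.kstest(Q1, 'chi2', args=(a * (b - 1),)))  # KS test against chi2, a(b-1) d.o.f.
print(stats.kstest(Q2, 'chi2', args=(a - 1,)))        # KS test against chi2, a-1 d.o.f.
print(np.corrcoef(Q1, Q2)[0, 1])                      # sample correlation, close to zero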
Example Three
$$abS^2 = \sum_{j=1}^{b}\sum_{i=1}^{a}(X_{ij} - \bar{X}_{.j})^2 + a\sum_{j=1}^{b}(\bar{X}_{.j} - \bar{X}_{..})^2$$
or
Q = Q3 + Q4
and as before we get that Q3 and Q4 are independent and Q4 /σ² has a chi–squared distribution with ab − 1 − b(a − 1) = b − 1 degrees of freedom.
Example Four
Writing Xij − X̄.. as (X̄i. − X̄.. ) + (X̄.j − X̄.. ) + (Xij − X̄i. − X̄.j + X̄.. ) gives
$$abS^2 = b\sum_{i=1}^{a}(\bar{X}_{i.} - \bar{X}_{..})^2 + a\sum_{j=1}^{b}(\bar{X}_{.j} - \bar{X}_{..})^2 + \sum_{i=1}^{a}\sum_{j=1}^{b}(X_{ij} - \bar{X}_{i.} - \bar{X}_{.j} + \bar{X}_{..})^2$$
or
Q = Q2 + Q4 + Q5
and as before Q2 , Q4 and Q5 are independent and Q5 /σ² has a chi–squared distribution with ab − 1 − (a − 1) − (b − 1) = (a − 1)(b − 1) degrees of freedom.
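The same kind of numerical check (again a sketch, with arbitrary illustrative values) confirms the three–term decomposition:

import numpy as np

# Check that ab*S^2 = Q2 + Q4 + Q5 for one simulated a-by-b table.
rng = np.random.default_rng(2)
a, b = 4, 6                                  # arbitrary illustrative values
X = rng.normal(10.0, 2.0, size=(a, b))

grand = X.mean()                             # X-bar..
rows = X.mean(axis=1, keepdims=True)         # X-bar_i.
cols = X.mean(axis=0, keepdims=True)         # X-bar_.j

Q  = ((X - grand) ** 2).sum()                            # ab * S^2
Q2 = b * ((rows - grand) ** 2).sum()
Q4 = a * ((cols - grand) ** 2).sum()
Q5 = ((X - rows - cols + grand) ** 2).sum()              # interaction/residual term

print(Q, Q2 + Q4 + Q5)                       # the two values agree up to rounding error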
As the quadratic forms above are independent, we have that
$$\frac{Q_4/[\sigma^2(b-1)]}{Q_3/[\sigma^2 b(a-1)]} \sim F_{b-1,\,b(a-1)}$$
and
$$\frac{Q_4/[\sigma^2(b-1)]}{Q_5/[\sigma^2(a-1)(b-1)]} = \frac{Q_4}{Q_5/(a-1)} \sim F_{b-1,\,(a-1)(b-1)}.$$
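As a sketch of how these ratios might be computed in practice (the data and variable names are illustrative; scipy.stats.f supplies the reference F distributions):

import numpy as np
from scipy import stats

# Compute the two F statistics above for one a-by-b table of observations X[i, j].
rng = np.random.default_rng(3)
a, b = 4, 6                                       # arbitrary illustrative values
X = rng.normal(10.0, 2.0, size=(a, b))

grand = X.mean()
rows = X.mean(axis=1, keepdims=True)              # X-bar_i.
cols = X.mean(axis=0, keepdims=True)              # X-bar_.j

Q3 = ((X - cols) ** 2).sum()                      # sum_ji (X_ij - X-bar_.j)^2
Q4 = a * ((cols - grand) ** 2).sum()              # a * sum_j (X-bar_.j - X-bar..)^2
Q5 = ((X - rows - cols + grand) ** 2).sum()       # interaction/residual sum of squares

F1 = (Q4 / (b - 1)) / (Q3 / (b * (a - 1)))        # compare with F_{b-1, b(a-1)}
F2 = (Q4 / (b - 1)) / (Q5 / ((a - 1) * (b - 1)))  # compare with F_{b-1, (a-1)(b-1)}

print(F1, stats.f.sf(F1, b - 1, b * (a - 1)))         # statistic and p-value
print(F2, stats.f.sf(F2, b - 1, (a - 1) * (b - 1)))   # statistic and p-value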
Now suppose that the observations in column j form a random sample from a N(µj , σ²) distribution, j = 1, 2, . . . , b, with a common variance σ². To test
$$H_0 : \mu_1 = \mu_2 = \ldots = \mu_b = \mu$$
against all possible alternatives, the likelihood ratio test uses
$$L(\omega) = \left(\frac{1}{2\pi\sigma^2}\right)^{ab/2}\exp\left[-\frac{1}{2\sigma^2}\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \mu)^2\right]$$
and
$$L(\Omega) = \left(\frac{1}{2\pi\sigma^2}\right)^{ab/2}\exp\left[-\frac{1}{2\sigma^2}\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \mu_j)^2\right].$$
Furthermore
$$\frac{\partial \ln L(\omega)}{\partial \mu} = \frac{\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \mu)}{\sigma^2}$$
$$\frac{\partial \ln L(\omega)}{\partial \sigma^2} = -\frac{ab}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \mu)^2$$
and setting the derivatives equal to zero gives the values at which L(ω) is maximized as
$$\hat{\mu} = \frac{\sum_{j=1}^{b}\sum_{i=1}^{a}x_{ij}}{ab}$$
$$\hat{\sigma}^2 = \frac{\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x})^2}{ab}.$$
For Ω
$$\frac{\partial \ln L(\Omega)}{\partial \mu_j} = \frac{\sum_{i=1}^{a}(x_{ij} - \mu_j)}{\sigma^2}, \qquad j = 1, 2, \ldots, b$$
$$\frac{\partial \ln L(\Omega)}{\partial \sigma^2} = -\frac{ab}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \mu_j)^2$$
and setting the derivatives equal to zero gives the values at which L(Ω) is maximized as
$$\hat{\mu}_j = \frac{\sum_{i=1}^{a}x_{ij}}{a}, \qquad j = 1, 2, \ldots, b$$
$$\hat{\sigma}^2 = \frac{\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x}_{.j})^2}{ab}.$$
Then
$$L(\hat{\omega}) = \left[\frac{ab}{2\pi\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x})^2}\right]^{ab/2}\exp\left[-\frac{ab\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x})^2}{2\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x})^2}\right]$$
$$= \left[\frac{ab}{2\pi\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x})^2}\right]^{ab/2}\exp\left(-ab/2\right)$$
$$L(\hat{\Omega}) = \left[\frac{ab}{2\pi\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x}_{.j})^2}\right]^{ab/2}\exp\left(-ab/2\right)$$
and
$$\lambda = \frac{L(\hat{\omega})}{L(\hat{\Omega})} = \left[\frac{\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x}_{.j})^2}{\sum_{j=1}^{b}\sum_{i=1}^{a}(x_{ij} - \bar{x})^2}\right]^{ab/2}.$$
In this case
$$\lambda^{2/ab} = \frac{Q_3}{Q} = \frac{Q_3}{Q_3 + Q_4} = \frac{1}{1 + Q_4/Q_3}$$
and if
$$\alpha = P\left(\frac{1}{1 + Q_4/Q_3} \le \lambda_0^{2/ab} \,\Big|\, H_0\right) = P\left(\frac{Q_4/(b-1)}{Q_3/[b(a-1)]} \ge c \,\Big|\, H_0\right)$$
then the test is based on an F distribution with b − 1 and b(a − 1) degrees of freedom.
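As an illustrative check of this equivalence (simulated data; scipy.stats.f_oneway is used only as an independent reference and is not part of the derivation):

import numpy as np
from scipy import stats

# Compare the F statistic built from Q3 and Q4 with a standard one-way ANOVA
# in which the b columns are the groups, each of size a.
rng = np.random.default_rng(4)
a, b = 5, 3                                     # arbitrary illustrative values
X = rng.normal(10.0, 2.0, size=(a, b))          # X[i, j] = x_ij

grand = X.mean()
col_means = X.mean(axis=0, keepdims=True)       # x-bar_.j

Q3 = ((X - col_means) ** 2).sum()
Q4 = a * ((col_means - grand) ** 2).sum()

F = (Q4 / (b - 1)) / (Q3 / (b * (a - 1)))
p = stats.f.sf(F, b - 1, b * (a - 1))

print(F, p)                                     # matches the output of f_oneway below
print(stats.f_oneway(*[X[:, j] for j in range(b)]))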
Regression
Suppose now that Y1 , Y2 , . . . , Yn are independent with Yi ∼ N(α + β(xi − x̄), σ²), where x1 , x2 , . . . , xn are known constants. The likelihood of the observed values y1 , y2 , . . . , yn is
$$L(y_1, y_2, \ldots, y_n; \alpha, \beta, \sigma^2) = \left(\frac{1}{2\pi\sigma^2}\right)^{n/2}\exp\left[-\frac{1}{2\sigma^2}\sum_{i=1}^{n}\left(y_i - \alpha - \beta(x_i - \bar{x})\right)^2\right].$$
The sum of squares in the exponent of the joint probability density function can be written
as follows.
$$SS = \sum_{i=1}^{n}\left[Y_i - \alpha - \beta(x_i - \bar{x})\right]^2$$
$$= \sum_{i=1}^{n}\left[(\hat{\alpha} - \alpha) + (\hat{\beta} - \beta)(x_i - \bar{x}) + \left(Y_i - \hat{\alpha} - \hat{\beta}(x_i - \bar{x})\right)\right]^2$$
$$= n(\hat{\alpha} - \alpha)^2 + (\hat{\beta} - \beta)^2\sum_{i=1}^{n}(x_i - \bar{x})^2 + \sum_{i=1}^{n}\left[Y_i - \hat{\alpha} - \hat{\beta}(x_i - \bar{x})\right]^2$$
$$= n(\hat{\alpha} - \alpha)^2 + (\hat{\beta} - \beta)^2\sum_{i=1}^{n}(x_i - \bar{x})^2 + n\hat{\sigma}^2$$
where α̂, β̂ and σ̂² are the maximum likelihood estimators of α, β and σ². Since Yi − α − β(xi − x̄), i = 1, 2, . . . , n, are independent N(0, σ²) variables, SS/σ² has a chi–squared distribution with n degrees of freedom, while n(α̂ − α)²/σ² and (β̂ − β)²Σ(xi − x̄)²/σ² each have chi–squared distributions with one degree of freedom, so the theorem gives that the three quadratic forms on the right–hand side are mutually independent and that Q3 = nσ̂² is such that Q3 /σ² has a chi–squared distribution with n − 2 degrees of freedom.
Confidence intervals for α and β can be obtained from the results that
$$T_1 = \frac{\sqrt{n}\,(\hat{\alpha} - \alpha)/\sigma}{\sqrt{Q_3/[\sigma^2(n-2)]}} \qquad \text{and} \qquad T_2 = \frac{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,(\hat{\beta} - \beta)/\sigma}{\sqrt{Q_3/[\sigma^2(n-2)]}}$$
each have a t distribution with n − 2 degrees of freedom.
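As a sketch of how these results give confidence intervals in practice (simulated data; the 95% level and all variable names are illustrative choices):

import numpy as np
from scipy import stats

# 95% confidence intervals for alpha and beta based on T1 and T2 having
# t distributions with n - 2 degrees of freedom.
rng = np.random.default_rng(5)
n = 20
x = np.linspace(0.0, 10.0, n)                             # known regressor values
y = 3.0 + 0.7 * (x - x.mean()) + rng.normal(0.0, 1.5, n)  # simulated responses

xc = x - x.mean()
alpha_hat = y.mean()                                # MLE of alpha in the centred model
beta_hat = (xc * y).sum() / (xc ** 2).sum()         # MLE of beta
Q3 = ((y - alpha_hat - beta_hat * xc) ** 2).sum()   # residual sum of squares, n * sigma-hat^2

t_crit = stats.t.ppf(0.975, n - 2)
se_alpha = np.sqrt(Q3 / ((n - 2) * n))
se_beta = np.sqrt(Q3 / ((n - 2) * (xc ** 2).sum()))

print(alpha_hat - t_crit * se_alpha, alpha_hat + t_crit * se_alpha)   # interval for alpha
print(beta_hat - t_crit * se_beta, beta_hat + t_crit * se_beta)       # interval for beta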
If X1 , X2 , . . . , Xr are independent normal variables each with mean zero and variance σ², then
$$\sum_{i=1}^{r}X_i^2/\sigma^2 \sim \chi^2_r.$$
If the means of X1 , X2 , . . . , Xr are µ1 , µ2 , . . . , µr respectively, then $\sum_{i=1}^{r}X_i^2/\sigma^2$ is said to have a non–central chi–squared distribution with r degrees of freedom and non–centrality parameter $\theta = \sum_{i=1}^{r}\mu_i^2/(2\sigma^2)$. This is denoted by χ²(r, θ) and the variable has probability density function
$$f(x; r, \theta) = e^{-\theta}\sum_{i=0}^{\infty}\frac{(\theta x)^{i}}{i!\,2^{\,r/2+i}\,\Gamma(i + \frac{r}{2})}\;x^{\frac{r}{2}-1}e^{-x/2}, \qquad x > 0.$$
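As a numerical sketch of this density (assuming the parameterization above, under which θ is half of the non–centrality parameter nc used by scipy.stats.ncx2):

import numpy as np
from scipy import stats
from scipy.special import gammaln

# Evaluate the series form of the non-central chi-squared density and compare
# it with scipy.stats.ncx2, whose nc parameter equals 2*theta here.
def f(x, r, theta, terms=200):
    i = np.arange(terms)
    # logarithm of each series term, combined on the log scale for stability
    log_terms = (i * np.log(theta * x) - gammaln(i + 1)
                 - (r / 2 + i) * np.log(2.0) - gammaln(i + r / 2))
    return np.exp(-theta - x / 2 + (r / 2 - 1) * np.log(x)
                  + np.logaddexp.reduce(log_terms))

r, theta, x = 5, 3.0, 4.0                       # arbitrary illustrative values
print(f(x, r, theta))
print(stats.ncx2.pdf(x, r, 2 * theta))          # agrees with the series evaluation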