Module 33 Statistical Inference Problems: Point Estimation
Formally, let the r.v. X (which may be vector valued) describe the characteristics of the population under investigation, and let F(·) be the d.f. of X;
Parametric Statistical Inference:
Here the r.v. X has a d.f. F ≡ Fθ(·) with a known functional form (except perhaps for the parameter θ, which may be vector valued);
Nonparametric Statistical Inference:
Here we know nothing about the functional form of the d.f. F(·) (except perhaps that F(·) is, say, continuous or discrete);
Define
$$A_k = \frac{1}{n}\sum_{i=1}^{n} X_i^k, \quad k = 1, \ldots, p.$$
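These raw sample moments are straightforward to compute; a minimal Python sketch (the data and the value of p below are illustrative, not from the slides):

```python
import numpy as np

def sample_moments(x, p):
    """Return the first p raw sample moments A_k = (1/n) * sum_i x_i**k."""
    x = np.asarray(x, dtype=float)
    return [np.mean(x ** k) for k in range(1, p + 1)]

x = [1.0, 2.0, 3.0, 4.0]
A1, A2 = sample_moments(x, 2)
# A1 is the sample mean; A2 is the mean of the squared observations.
```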
(b) The M.M.E. may not exist when the underlying equations have no solution. Also, the M.M.E. may not be unique, as the underlying equations may have more than one solution.
Solution: We have
$$\hat{\mu} = \bar{X}, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2 = \frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2 = \frac{n-1}{n}\,S^2.$$
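These identities are easy to verify numerically; a minimal sketch with illustrative data:

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # illustrative sample
n = len(x)

mu_hat = x.mean()                            # first moment equation
sigma2_hat = (x ** 2).mean() - mu_hat ** 2   # second moment equation

# The MME of sigma^2 equals ((n-1)/n) * S^2, where S^2 is the
# usual (unbiased) sample variance.
S2 = x.var(ddof=1)
```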
$$L_x(\theta) = f_X(x|\theta),$$
$$L_x(\theta_0) = f_X(x|\theta_0)$$
$$L_x(\theta) = \begin{cases} 1, & \text{if } x_{(n)} - \frac{1}{2} \le \theta \le x_{(1)} + \frac{1}{2} \\ 0, & \text{otherwise.} \end{cases}$$
Clearly any estimate $\delta(x)$ such that $x_{(n)} - \frac{1}{2} \le \delta(x) \le x_{(1)} + \frac{1}{2}$ is an m.l.e. In particular, $\delta^*(x) = \frac{x_{(1)} + x_{(n)}}{2}$ is an m.l.e. of $\theta$.
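A small numerical sketch of this midrange estimator, assuming the $U(\theta - \tfrac{1}{2}, \theta + \tfrac{1}{2})$ model (the sample below is illustrative):

```python
import numpy as np

def mle_midrange(x):
    """One m.l.e. of theta for X_i ~ U(theta - 1/2, theta + 1/2):
    any value in [x_(n) - 1/2, x_(1) + 1/2] maximizes the likelihood;
    the midrange (x_(1) + x_(n)) / 2 is a convenient choice."""
    x = np.sort(np.asarray(x, dtype=float))
    lo, hi = x[-1] - 0.5, x[0] + 0.5
    assert lo <= hi, "sample inconsistent with the model"
    return (x[0] + x[-1]) / 2

theta_hat = mle_midrange([3.1, 3.4, 3.9])
```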
(c) Since Lx(θ) and ln Lx(θ) attain their maximum at the same values of θ, it is sometimes more convenient to work with ln Lx(θ).
$$\frac{\partial}{\partial\theta_j}\ln L_x(\theta)\Big|_{\theta=\hat{\theta}} = 0, \quad j = 1, \ldots, p; \qquad \theta = (\theta_1, \ldots, \theta_p)$$
$$\Leftrightarrow\quad \frac{\partial}{\partial\theta_j} L_x(\theta)\Big|_{\theta=\hat{\theta}} = 0, \quad j = 1, \ldots, p.$$
(c) For any $x \in S_X$ and any $\theta \in \Theta$, the derivative $\frac{\partial}{\partial\theta}f_X(x|\theta)$, $\theta \in \Theta$, exists and is finite, and
$$\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} f_X(x|\theta)\,dx = 1, \quad \theta \in \Theta,$$
for some functions h(·), c(·), r (·) and T (·) and an open interval
Θ ⊆ R.
represents the relative rate at which the p.d.f. (or p.m.f.) fX(x|θ) changes at x. The average of this rate is denoted by
$$I(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\ln f_X(X|\theta)\right)^2\right], \quad \theta \in \Theta \subseteq \mathbb{R}.$$
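As a sanity check, $I(\theta)$ can be approximated by Monte Carlo averaging of the squared score. A minimal sketch for the Poisson(θ) model (an illustrative choice, not from the slides), where the score is $X/\theta - 1$ and the known answer is $I(\theta) = 1/\theta$:

```python
import numpy as np

# Monte Carlo estimate of I(theta) = E[(d/dtheta ln f_X(X|theta))^2]
# for the Poisson(theta) model; the score is X/theta - 1.
rng = np.random.default_rng(0)
theta = 2.0
x = rng.poisson(theta, size=200_000)
score = x / theta - 1.0
I_hat = np.mean(score ** 2)
# I_hat should be close to the exact value 1/theta = 0.5.
```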
$$E_\theta\!\left[\frac{\partial}{\partial\theta}\ln f_X(X|\theta)\right] = 0, \quad \forall\, \theta \in \Theta.$$
$$\sqrt{n}\,\big(\hat{g}_n - g(\theta)\big) \xrightarrow{d} W \sim N\!\left(0, \frac{(g'(\theta))^2}{I(\theta)}\right) \quad \text{and} \quad \hat{g}_n \xrightarrow{p} g(\theta), \quad \theta \in \Theta.$$
Unbiased Estimators
Suppose that the estimand g(θ) is real-valued.
Definition 6.
(a) An estimator δ(X) is said to be an unbiased estimator of g(θ) if
$$E_\theta(\delta(X)) = g(\theta), \quad \forall\, \theta \in \Theta.$$
$$E_\theta(\delta(X)) = \frac{1}{\theta}, \quad \forall\, \theta \in \Theta$$
i.e.,
$$\sum_{j=0}^{n}\delta(j)\binom{n}{j}\theta^j(1-\theta)^{n-j} = \frac{1}{\theta}, \quad \forall\, 0 < \theta < 1$$
$$\Rightarrow\quad \theta\sum_{j=0}^{n}\delta(j)\binom{n}{j}\theta^j(1-\theta)^{n-j} = 1, \quad \forall\, 0 < \theta < 1,$$
$$\frac{\delta(j)}{j!} = \frac{(-2)^j}{j!}, \quad j = 0, 1, 2, \ldots$$
$$\Rightarrow\quad \delta(j) = (-2)^j, \quad j = 0, 1, 2, \ldots$$
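Assuming this example is the Poisson(θ) model with estimand $g(\theta) = e^{-3\theta}$ (an assumption, but the one consistent with the coefficient matching $\sum_j \delta(j)\theta^j/j! = e^{-2\theta}$), the unbiasedness of $\delta(j) = (-2)^j$ can be checked numerically by truncating the expectation series:

```python
import math

# Hedged check: assuming X ~ Poisson(theta) and estimand exp(-3*theta),
# the estimator delta(j) = (-2)**j should satisfy E[delta(X)] = exp(-3*theta).
# The infinite sum is truncated at 60 terms (ample for small theta).
theta = 0.7
E = sum((-2.0) ** j * math.exp(-theta) * theta ** j / math.factorial(j)
        for j in range(60))
# E should equal exp(-3*theta) up to truncation error.
```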
$$\frac{\hat{\theta}_{MME}}{2} = \bar{X} \quad\Rightarrow\quad \hat{\theta}_{MME} = 2\bar{X}.$$
Thus the MME of $g(\theta) = \sqrt{\theta}$ is
$$\delta_{MME}(X) = \sqrt{2\bar{X}}$$
$$\Rightarrow\quad E_\theta[\delta_{MME}(X)] = \sqrt{2}\,E_\theta\big(\sqrt{\bar{X}}\big) = \frac{2\sqrt{2}}{3}\sqrt{\theta} \ne \sqrt{\theta}.$$
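The bias can be confirmed numerically; a minimal sketch assuming $X \sim \text{Uniform}(0, \theta)$ with a single observation (the case consistent with the value $\frac{2\sqrt{2}}{3}\sqrt{\theta}$ above), approximating $E[\sqrt{2X}]$ by a midpoint Riemann sum:

```python
import math

# Hedged check of the bias: assume X ~ Uniform(0, theta), n = 1, so that
# delta_MME(X) = sqrt(2*X).  E[sqrt(2X)] = (1/theta) * integral_0^theta
# sqrt(2x) dx is approximated by a midpoint Riemann sum and compared with
# (2*sqrt(2)/3)*sqrt(theta), which differs from sqrt(theta).
theta = 4.0
m = 200_000
E = sum(math.sqrt(2 * (i + 0.5) * theta / m) for i in range(m)) / m
expected = (2 * math.sqrt(2) / 3) * math.sqrt(theta)
```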
then (δ1(X) − g(θ))² is, on average, smaller than (δ2(X) − g(θ))², which indicates that δ1 is nearer to g(θ) than δ2 is. For this reason we define:
Definition 8.
In an estimation problem where the M.L.E. exists, an estimator (not
necessarily unbiased) which depends on observation X = (X1 , . . . , Xn ) only
through the M.L.E. (i.e., an estimator which is a function of the M.L.E.
alone) is called an estimator based on the M.L.E.
$$\Leftrightarrow\quad \sum_{j=0}^{\infty}\delta\Big(\frac{j}{n}\Big)\frac{e^{-n\theta}(n\theta)^j}{j!} = e^{-\theta}, \quad \forall\, \theta > 0$$
$$\Leftrightarrow\quad \sum_{j=0}^{\infty}\delta\Big(\frac{j}{n}\Big)\frac{n^j}{j!}\,\theta^j = e^{(n-1)\theta}, \quad \forall\, \theta > 0$$
$$\Leftrightarrow\quad \sum_{j=0}^{\infty}\delta\Big(\frac{j}{n}\Big)\frac{n^j}{j!}\,\theta^j = \sum_{j=0}^{\infty}\frac{(n-1)^j}{j!}\,\theta^j, \quad \forall\, \theta > 0$$
$$\Leftrightarrow\quad \delta\Big(\frac{j}{n}\Big) = \Big(1 - \frac{1}{n}\Big)^j, \quad j = 0, 1, \ldots$$
It follows that the unbiased estimator based on the M.L.E. is
$$\delta_U(X) = \Big(1 - \frac{1}{n}\Big)^{n\bar{X}}.$$
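Unbiasedness can be verified through the Poisson probability generating function: with $T = n\bar{X} \sim \text{Poisson}(n\theta)$, $E[s^T] = e^{n\theta(s-1)}$, which equals $e^{-\theta}$ at $s = 1 - \frac{1}{n}$. A numerical sketch with illustrative n and θ:

```python
import math

# With T = n*X_bar ~ Poisson(n*theta), the estimator delta_U = (1 - 1/n)**T
# has E[delta_U] = exp(n*theta*((1 - 1/n) - 1)) = exp(-theta) by the
# Poisson p.g.f.  Verify via a truncated series (120 terms is ample here).
n, theta = 5, 1.3
lam = n * theta
s = 1 - 1 / n
E = sum(s ** t * math.exp(-lam) * lam ** t / math.factorial(t)
        for t in range(120))
# E should equal exp(-theta) up to truncation error.
```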