STA 303 Lec 1
1 Lecture 1
Lecture Hours: 3
Course Assessment Type          Weighting
Continuous Assessment           30%
End of Semester Examination     70%
Total                           100%
Relevant Website:
\sigma^2 = \operatorname{Var}(X) = E(X - \mu)^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\, dx
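As a numerical sanity check (an illustration added to these notes, not part of them; the N(µ = 1, σ = 2) density is assumed purely for the example), the integral above can be evaluated directly in Python:

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, sigma = 1.0, 2.0  # assumed population parameters for this check only

# Integrate (x - mu)^2 f(x) over the whole real line for a normal density f
variance, _abs_err = quad(lambda x: (x - mu) ** 2 * norm.pdf(x, loc=mu, scale=sigma),
                          -np.inf, np.inf)

print(variance)  # approximately sigma^2 = 4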
Therefore, taking a random sample from the above distribution, we are then
interested in estimating the population parameters (µ and σ). These can be
estimated from the sample observations.
In general, we are required to estimate the unknown parameters from a random sample taken from the given distribution.
For example, the population mean µ can be estimated with the sample mean; it can also be estimated with the help of the sample median.
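As a quick illustration (a sketch added here, not from the notes; the normal population with µ = 10 and σ = 2 is assumed only for the demonstration), both estimates can be computed from a simulated sample:

import numpy as np

rng = np.random.default_rng(42)
mu, sigma, n = 10.0, 2.0, 500          # assumed population values for this sketch
x = rng.normal(loc=mu, scale=sigma, size=n)

# Two different estimators of the same unknown parameter mu
mean_estimate = x.mean()               # sample mean
median_estimate = np.median(x)         # sample median

print(mean_estimate, median_estimate)  # both should be close to mu = 10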
In general, suppose the random variable X has a probability density (or mass) function

f(x; \theta), \qquad \theta \in \Theta.

The set Θ, which is the set of all possible values of θ, is called the parameter space.
Such a situation gives rise not to a single probability distribution but to a family of probability distributions, which we write as

\{ f(x; \theta) : \theta \in \Theta \},
or, when there are k unknown parameters,

\{ f(x; \theta_1, \dots, \theta_k) : \theta_i \in \Theta,\ i = 1, \dots, k \}.
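For instance (an illustration added here), the normal family has k = 2 unknown parameters, µ and σ²:

\left\{ f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x - \mu)^2 / (2\sigma^2)} : \mu \in \mathbb{R},\ \sigma^2 > 0 \right\},

so the parameter space is Θ = \mathbb{R} \times (0, \infty).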
For example, for the Poisson distribution,

f(x; \theta) = \frac{\theta^x e^{-\theta}}{x!}, \qquad x = 0, 1, 2, \dots,\ \theta > 0,

which gives E(X) = θ and Var(X) = θ. Thus θ can be estimated with the help of
\bar{X} = \frac{\sum_{i=1}^{n} x_i}{n} \qquad \text{or} \qquad S^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{X})^2}{n}
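A minimal simulation sketch (added for illustration; θ = 3 and the sample size are assumed values, not from the notes) shows both estimators at work:

import numpy as np

rng = np.random.default_rng(0)
theta, n = 3.0, 2000                   # assumed true parameter and sample size
x = rng.poisson(lam=theta, size=n)

theta_hat_mean = x.mean()              # estimate of theta via the sample mean
theta_hat_var = np.var(x)              # estimate via S^2 (np.var divides by n by default)

print(theta_hat_mean, theta_hat_var)   # both should be close to theta = 3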
Thus, in general, there exist several estimators of the same unknown parameter.
We therefore wish to determine functions of the sample observations (such estimating functions are known as estimators),

T_1 = t_1(x_1, \dots, x_n),\ T_2 = t_2(x_1, \dots, x_n),\ \dots,\ T_n = t_n(x_1, \dots, x_n),

such that their distributions are concentrated as closely as possible near the true value of the parameter.
This implies that we have to select a good estimator from among the several candidates.
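To make "concentrated near the true value" concrete, here is a minimal simulation sketch (my own illustration, assuming a normal population with µ = 0, σ = 1) comparing how tightly the sample mean and the sample median cluster around µ:

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 1.0, 50, 10_000   # assumed values for this sketch

# reps independent samples of size n from the assumed N(mu, sigma^2) population
samples = rng.normal(mu, sigma, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# A smaller spread around mu indicates a more concentrated (better) estimator
print("variance of sample means  :", means.var())    # about sigma^2 / n = 0.02
print("variance of sample medians:", medians.var())  # about (pi/2) * sigma^2 / n ≈ 0.031

For a normal population the sample mean is the more concentrated of the two, which is why it is usually preferred in that case.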
1. Unbiasedness
Definition: An estimator T = t(x_1, \dots, x_n) is said to be an unbiased estimator of the unknown parameter θ if

E(T) = \theta \quad \text{for all } \theta \in \Omega \ (\text{the parameter space}).
If, however, E(T) ≠ θ, we say that T is a biased estimator of θ.
In particular, if E(T) > θ, then T is a positively biased estimator of θ; and if E(T) < θ, then it is a negatively biased estimator of θ.
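As a standard worked example (added here for illustration), the sample mean is an unbiased estimator of the population mean: if E(x_i) = µ for i = 1, ⋯, n, then

E(\bar{X}) = E\left( \frac{1}{n} \sum_{i=1}^{n} x_i \right) = \frac{1}{n} \sum_{i=1}^{n} E(x_i) = \frac{1}{n} \cdot n\mu = \mu \quad \text{for all } \mu,

so X̄ is unbiased for µ.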
E(T) − θ is called the amount of bias in T as an estimator of θ. Further, if E(T) ≠ θ for finite n but E(T) → θ as n → ∞, we say that T is an asymptotically unbiased estimator of θ, i.e.

E(T) \neq \theta \quad \text{but} \quad \lim_{n \to \infty} E(T) = \theta.
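For instance (a hypothetical estimator used only to illustrate the definition), T = X̄ + 1/n as an estimator of µ satisfies

E(T) = \mu + \frac{1}{n} \neq \mu \quad \text{for every finite } n, \qquad \text{but} \qquad \lim_{n \to \infty} E(T) = \mu,

so T is biased yet asymptotically unbiased. The sample variance S² treated below is another example.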
2. The sample variance S² is a biased estimator of the population variance σ², i.e. we have to show that

E(S^2) \neq \sigma^2, \quad \text{where} \quad S^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{X})^2}{n}.
Solution:
We have E(x_i) = µ and Var(x_i) = σ² for i = 1, ⋯, n, and

S^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{X})^2}{n}.

Introducing the population mean, we have

S^2 = \frac{\sum_{i=1}^{n} (x_i - \mu + \mu - \bar{X})^2}{n}
    = \frac{\sum_{i=1}^{n} \left[ (x_i - \mu) - (\bar{X} - \mu) \right]^2}{n}
    = \frac{\sum_{i=1}^{n} (x_i - \mu)^2}{n} - (\bar{X} - \mu)^2,

since \sum_{i=1}^{n} (x_i - \mu) = n(\bar{X} - \mu), so the cross term -\frac{2(\bar{X} - \mu)\sum_{i=1}^{n}(x_i - \mu)}{n} = -2(\bar{X} - \mu)^2 combines with the remaining (\bar{X} - \mu)^2 term to give -(\bar{X} - \mu)^2. Taking expectations,

E(S^2) = E\left[ \frac{\sum_{i=1}^{n} (x_i - \mu)^2}{n} \right] - E(\bar{X} - \mu)^2 = \sigma^2 - \operatorname{Var}(\bar{X}), \quad \text{since } E(\bar{X}) = \mu.
Note:

\operatorname{Var}(\bar{X}) = \frac{\sigma^2}{n}
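This follows (assuming, as usual for a random sample, that the observations are independent) from

\operatorname{Var}(\bar{X}) = \operatorname{Var}\left( \frac{1}{n} \sum_{i=1}^{n} x_i \right) = \frac{1}{n^2} \sum_{i=1}^{n} \operatorname{Var}(x_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.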
Thus S² is a biased estimator of σ².
The amount of bias is equal to:

\text{Bias} = E(S^2) - \sigma^2 = \sigma^2 - \frac{\sigma^2}{n} - \sigma^2 = -\frac{\sigma^2}{n} < 0
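A quick simulation sketch (my own check, with µ = 0 and σ² = 4 assumed only for the demonstration) agrees with this result: the simulated value of E(S²) − σ² is close to −σ²/n and shrinks toward 0 as n grows, i.e. S² is asymptotically unbiased.

import numpy as np

rng = np.random.default_rng(7)
mu, sigma2, reps = 0.0, 4.0, 20_000    # assumed values for this check

for n in (5, 20, 100):
    samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
    s2 = samples.var(axis=1)           # divisor-n sample variance, as defined above
    bias_mc = s2.mean() - sigma2       # Monte Carlo estimate of E(S^2) - sigma^2
    print(f"n={n:>3}  simulated bias={bias_mc:+.4f}  theoretical -sigma^2/n={-sigma2/n:+.4f}")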