Unit 4
To draw inferences about a population from the analysis of a sample drawn from that population.
The sample is assumed to be representative of the population.
Statistical inference includes Estimation and Testing of Hypothesis.
Any function of the observed random sample 𝑥1, 𝑥2, …, 𝑥𝑛, say 𝑇𝑛(𝑥1, 𝑥2, …, 𝑥𝑛), is called a statistic. If it is used to estimate an unknown parameter 𝜃 of the distribution, it is called an estimator. A particular value of the estimator is called an estimate of 𝜃.
There will always be an infinite number of functions of the sample values, called statistics, which may be proposed. The one that falls nearest to the true value of the parameter to be estimated would be the best estimate.
A point estimate of a population parameter 𝜃 is a single value 𝜃̂ of a statistic 𝑇.
The set of all possible values of 𝜃 is called the parameter space Θ.
CHARACTERISTICS OF ESTIMATORS
1. Unbiasedness
2. Consistency
Unbiased Estimator
An estimator 𝑇𝑛 = 𝑇(𝑥1, 𝑥2, …, 𝑥𝑛) is said to be an unbiased estimator of 𝑔(𝜃) if 𝐸𝜃(𝑇𝑛) = 𝑔(𝜃) for all 𝜃 ∈ Θ.
If 𝐸𝜃(𝑇𝑛) > 𝑔(𝜃), the estimator is said to be positively biased; if 𝐸𝜃(𝑇𝑛) < 𝑔(𝜃), it is said to be negatively biased.
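As a standard illustration (assuming a random sample with mean 𝜇 and finite variance 𝜎², not a worked example from these slides): the sample mean is unbiased for 𝜇, while the variance estimator with divisor 𝑛 is negatively biased for 𝜎²,
\[
E(\bar{x}) = \frac{1}{n}\sum_{i=1}^{n} E(x_i) = \mu,
\qquad
E\left[\frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2\right] = \frac{n-1}{n}\,\sigma^2 < \sigma^2 .
\]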
Q1. If 𝑥1, 𝑥2, …, 𝑥𝑛 is a random sample from a normal population 𝑁(𝜇, 1), show that 𝑇 = (1/𝑛) ∑ 𝑥𝑖² is an unbiased estimator of 𝜇² + 1.
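A solution sketch, using 𝐸(𝑥𝑖²) = 𝑉𝑎𝑟(𝑥𝑖) + [𝐸(𝑥𝑖)]² with 𝑉𝑎𝑟(𝑥𝑖) = 1 and 𝐸(𝑥𝑖) = 𝜇:
\[
E(T) = \frac{1}{n}\sum_{i=1}^{n} E(x_i^2)
     = \frac{1}{n}\sum_{i=1}^{n}\left[\mathrm{Var}(x_i) + [E(x_i)]^2\right]
     = \frac{1}{n}\sum_{i=1}^{n}(1 + \mu^2)
     = \mu^2 + 1 ,
\]
so 𝑇 is unbiased for 𝜇² + 1.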
Q3. Let 𝑋 follow a Poisson distribution with parameter 𝜃. Show that 𝑇(𝑋) = (−𝑘)^𝑋 is an unbiased estimator of 𝑒^(−(𝑘+1)𝜃), 𝑘 > 0.
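A solution sketch, using the Poisson probability mass function 𝑃(𝑋 = 𝑥) = 𝑒^(−𝜃) 𝜃^𝑥/𝑥!:
\[
E\left[(-k)^X\right] = \sum_{x=0}^{\infty} (-k)^x \,\frac{e^{-\theta}\theta^x}{x!}
= e^{-\theta}\sum_{x=0}^{\infty}\frac{(-k\theta)^x}{x!}
= e^{-\theta} e^{-k\theta}
= e^{-(k+1)\theta},
\]
so 𝑇(𝑋) is unbiased for 𝑒^(−(𝑘+1)𝜃).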
Consistent Estimator
An estimator 𝑇𝑛 is said to be a consistent estimator of 𝑔(𝜃) if 𝑇𝑛 converges to 𝑔(𝜃) in probability, i.e., if for every 𝜖 > 0, 𝜂 > 0, there exists a positive integer 𝑚(𝜖, 𝜂) such that
𝑃{|𝑇𝑛 − 𝑔(𝜃)| < 𝜖} > 1 − 𝜂 for all 𝑛 ≥ 𝑚(𝜖, 𝜂).
Sufficient conditions for consistency: 𝑇𝑛 is a consistent estimator of 𝑔(𝜃) if
𝐸𝜃(𝑇𝑛) → 𝑔(𝜃) as 𝑛 → ∞ and
𝑉𝜃(𝑇𝑛) → 0 as 𝑛 → ∞.
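For example (a standard illustration, assuming a population with finite variance 𝜎²), the sample mean 𝑥̄ satisfies both sufficient conditions and is therefore a consistent estimator of the population mean 𝜇:
\[
E(\bar{x}) = \mu ,
\qquad
V(\bar{x}) = \frac{\sigma^2}{n} \to 0 \quad (n \to \infty).
\]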
Q8(B). Show that (∑ᵢ₌₁ⁿ 𝑥𝑖/𝑛)(1 − ∑ᵢ₌₁ⁿ 𝑥𝑖/𝑛) is a consistent estimator of 𝑝(1 − 𝑝).
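A solution sketch, assuming (as the target 𝑝(1 − 𝑝) suggests) that 𝑥1, …, 𝑥𝑛 is a random sample from a Bernoulli population with parameter 𝑝, and writing 𝑝̂ = ∑𝑥𝑖/𝑛:
\[
E(\hat{p}) = p,
\qquad
V(\hat{p}) = \frac{p(1-p)}{n} \to 0 \quad (n \to \infty),
\]
so 𝑝̂ is consistent for 𝑝; since 𝑔(𝑡) = 𝑡(1 − 𝑡) is a continuous function, 𝑔(𝑝̂) = 𝑝̂(1 − 𝑝̂) is a consistent estimator of 𝑝(1 − 𝑝) by the invariance property of consistent estimators.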
Q9. Check whether 𝑇2 = 2 ∑ᵢ₌₁ⁿ 𝑖 𝑥𝑖 / (𝑛(𝑛 + 1)) is an unbiased and consistent estimator of the population mean 𝜇 or not.
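A solution sketch, assuming the 𝑥𝑖 are independent observations with common mean 𝜇 and finite variance 𝜎², and using ∑𝑖 = 𝑛(𝑛 + 1)/2 and ∑𝑖² = 𝑛(𝑛 + 1)(2𝑛 + 1)/6:
\[
E(T_2) = \frac{2}{n(n+1)}\sum_{i=1}^{n} i\,E(x_i)
       = \frac{2\mu}{n(n+1)}\cdot\frac{n(n+1)}{2} = \mu ,
\]
\[
V(T_2) = \frac{4\sigma^2}{n^2(n+1)^2}\sum_{i=1}^{n} i^2
       = \frac{4\sigma^2}{n^2(n+1)^2}\cdot\frac{n(n+1)(2n+1)}{6}
       = \frac{2\sigma^2(2n+1)}{3n(n+1)} \to 0 ,
\]
so 𝑇2 is both unbiased and consistent for 𝜇.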
Efficient Estimator
Efficiency
If 𝑇1 is the most efficient estimator with variance 𝑉1 and 𝑇2 is any other
estimator with variance 𝑉2 , then the efficiency 𝐸 of 𝑇2 is defined as
𝐸 = 𝑉1/𝑉2. 𝐸 cannot exceed unity.
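As a standard illustration (assuming a large random sample from a normal population, not a worked example from these slides): the sample mean, the most efficient estimator of 𝜇, has variance 𝜎²/𝑛, while the sample median has variance approximately 𝜋𝜎²/2𝑛, so the efficiency of the median is
\[
E = \frac{V_1}{V_2} = \frac{\sigma^2/n}{\pi\sigma^2/2n} = \frac{2}{\pi} \approx 0.637 .
\]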
Properties of M.L.E.
Assumptions or Regularity Conditions:
(i) The first and second derivatives 𝜕 log 𝐿/𝜕𝜃 and 𝜕² log 𝐿/𝜕𝜃² exist and are continuous functions of 𝜃 in a range 𝑅 (including the true value 𝜃0 of the parameter) for almost all 𝑥. For every 𝜃 in 𝑅, |𝜕 log 𝐿/𝜕𝜃| < 𝐹1(𝑥) and |𝜕² log 𝐿/𝜕𝜃²| < 𝐹2(𝑥), where 𝐹1(𝑥) and 𝐹2(𝑥) are integrable functions over (−∞, ∞).
(ii) The third-order derivative 𝜕³ log 𝐿/𝜕𝜃³ exists such that |𝜕³ log 𝐿/𝜕𝜃³| < 𝑀(𝑥), where 𝐸[𝑀(𝑥)] < 𝐾, a positive quantity.
(iii) 𝐸(−𝜕² log 𝐿/𝜕𝜃²) = ∫ (−𝜕² log 𝐿/𝜕𝜃²) 𝐿 𝑑𝑥 (the integral taken over (−∞, ∞)) = 𝐼(𝜃) is finite and non-zero.
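As a quick illustration of these conditions (an assumed example, not taken from the slides), consider a random sample 𝑥1, …, 𝑥𝑛 from 𝑁(𝜇, 1):
\[
\log L = -\frac{n}{2}\log(2\pi) - \frac{1}{2}\sum_{i=1}^{n}(x_i-\mu)^2,
\qquad
\frac{\partial \log L}{\partial \mu} = \sum_{i=1}^{n}(x_i-\mu) = 0 \ \Rightarrow\ \hat{\mu} = \bar{x},
\]
\[
\frac{\partial^2 \log L}{\partial \mu^2} = -n,
\qquad
I(\mu) = E\left(-\frac{\partial^2 \log L}{\partial \mu^2}\right) = n ,
\]
which is finite and non-zero, as condition (iii) requires.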
Important Theorems
Liapounoff’s Form
Lindeberg-Levy’s Form
Corollary
If 𝑋̄ = (𝑋1 + 𝑋2 + ⋯ + 𝑋𝑛)/𝑛, then 𝐸(𝑋̄) = 𝜇 and 𝑉𝑎𝑟(𝑋̄) = 𝜎²/𝑛, and 𝑋̄ follows 𝑁(𝜇, 𝜎/√𝑛) as 𝑛 → ∞.
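In the applications below this corollary is used through the standardized variable
\[
Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} ,
\]
which is approximately 𝑁(0, 1) for large 𝑛.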
Q16. A coin is tossed 200 times. Find the approximate probability that the
number of heads obtained is between 80 and 120.
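A solution sketch, assuming an unbiased coin and using the normal approximation without continuity correction: here 𝑛 = 200 and 𝑝 = 1/2, so 𝐸(𝑋) = 𝑛𝑝 = 100 and the standard deviation is √(𝑛𝑝(1 − 𝑝)) = √50 ≈ 7.07, and
\[
P(80 < X < 120) \approx P\left(\frac{80-100}{7.07} < Z < \frac{120-100}{7.07}\right)
= P(-2.83 < Z < 2.83) = 2\Phi(2.83) - 1 \approx 0.995 .
\]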
Q17. The guaranteed average life of a certain type of electric light bulb is
1000 h with a standard deviation of 125 h. It is decided to sample the
output so as to ensure that 90% of the bulbs do not fall short of guaranteed
average by more than 2.5%. Use CLT to find the minimum sample size.
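A solution sketch, reading the condition as requiring the sample mean 𝑋̄ not to fall short of 1000 h by more than 2.5% (i.e., 25 h) with probability 0.90, and taking 𝑧0.90 ≈ 1.28:
\[
P(\bar{X} \ge 975) = P\left(Z \ge \frac{975 - 1000}{125/\sqrt{n}}\right) \ge 0.90
\ \Rightarrow\ \frac{25\sqrt{n}}{125} \ge 1.28
\ \Rightarrow\ \sqrt{n} \ge 6.4
\ \Rightarrow\ n \ge 40.96 ,
\]
so a minimum sample size of about 41 bulbs is required.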