Basic Statistical Inference (Lecture 3)


Course: Basic Statistical Inference

Topic: Point Estimation


By Yasir Khan
Point estimates:

 When an estimate of an unknown population parameter is expressed by a single value, it
is called a point estimate.
 Example: suppose we wish to estimate the average height of a very large group of
students on the basis of a sample. If we find the sample average height to be 64 inches,
then 64 inches is a point estimate of the unknown population mean.
 It is to be noted that a point estimate will not, in general, be equal to the population
parameter as the random sample used is one of the many possible samples which could be
chosen from the population.
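The height example above can be sketched in a few lines of code. The data below are hypothetical (not from the lecture); the point is simply that the sample mean is a single number serving as a point estimate of the unknown population mean.

```python
# Toy illustration with assumed data: a sample of student heights in inches.
# The sample mean is a point estimate of the unknown population mean height.
heights = [62.5, 65.0, 63.8, 66.1, 64.2, 63.9, 64.7, 65.3]

point_estimate = sum(heights) / len(heights)
print(f"point estimate of mean height: {point_estimate:.1f} inches")
```

A different random sample would give a different point estimate, which is exactly why the estimate will not, in general, equal the population parameter.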
Criteria For Good point estimators:

 A point estimator is considered a good estimator if it satisfies various criteria. Four of
these criteria are:
 1) Unbiasedness: An estimator is defined to be unbiased if the statistic used as an
estimator has its expected value equal to the population parameter being estimated.
 In other words, let 𝜃^ (theta hat) be an estimator of a parameter 𝜃. Then 𝜃^ is
called an unbiased estimator if E(𝜃^) = 𝜃.
 Normally it is preferable to have an unbiased estimator.
 If E(𝜃^) ≠ 𝜃, then we say that 𝜃^ is a biased estimator of 𝜃. The amount of bias may be
positive or negative.
Conti…

 If E(𝜃^) − 𝜃 > 0, the bias is positive and 𝜃^ is said to overestimate 𝜃.
 If E(𝜃^) − 𝜃 < 0, the bias is negative and 𝜃^ is said to underestimate 𝜃.
 Example: The sample mean (X-bar) is an unbiased estimator of the population mean.
Similarly, the sample median is also an unbiased estimator of the population mean when
the population is normally distributed.
 Example (numerical): discussed on the whiteboard.
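Unbiasedness can be checked by simulation. The sketch below (with an assumed normal population, not data from the lecture) draws many samples and verifies that the average of the sample means is close to the population mean μ, i.e. E(X-bar) = μ.

```python
import random

random.seed(42)

# Hypothetical population: heights with mean 64 in and sd 3 in.
mu, sigma = 64.0, 3.0
n, reps = 25, 20000   # sample size, and number of repeated samples

# Draw `reps` samples of size n and record each sample mean.
sample_means = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    sample_means.append(sum(sample) / n)

# If X-bar is unbiased, the average of the sample means should be close to mu.
avg_of_means = sum(sample_means) / reps
print(f"average of sample means: {avg_of_means:.3f}  (mu = {mu})")
```

Any single sample mean may miss μ by a fair amount; unbiasedness says only that the estimates are centered on μ across repeated sampling.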
2) Consistency

 An estimator is said to be consistent if the statistic used as an estimator comes closer
and closer to the population parameter being estimated as the sample size n increases.
 A consistent estimator may or may not be unbiased.
 The sample mean (X-bar), which is an unbiased estimator of the population mean μ, is also
a consistent estimator.
 The median is not a consistent estimator of μ when the population has a skewed
distribution.
Conti….

 To prove that an estimator is consistent:
 Let 𝜃^ be an estimator of 𝜃 based on a sample of size n. Then 𝜃^ is a consistent estimator
of 𝜃 if Var(𝜃^) → 0 as n → ∞.
 Consistency is called a large sample property.
 Example: The sample mean (X-bar) is a consistent estimator of the population mean, because as
we increase the sample size n, the variance of the sample mean approaches zero.
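The criterion Var(𝜃^) → 0 as n → ∞ can be illustrated by Monte Carlo simulation. This sketch (assumed standard-normal population) estimates the variance of the sample mean for increasing n and compares it with the theoretical value σ²/n.

```python
import random

random.seed(1)

mu, sigma, reps = 0.0, 1.0, 5000  # assumed population and repetition count

def var_of_sample_mean(n):
    """Monte Carlo estimate of Var(X-bar) for samples of size n."""
    means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
             for _ in range(reps)]
    m = sum(means) / reps
    return sum((x - m) ** 2 for x in means) / reps

# As n grows, the simulated Var(X-bar) shrinks toward 0 (theory: sigma^2 / n).
variances = {n: var_of_sample_mean(n) for n in (5, 50, 500)}
for n, v in variances.items():
    print(f"n={n:4d}  Var(X-bar) ~ {v:.5f}   (theory: {sigma**2 / n:.5f})")
```

The shrinking variance is the "large sample property" mentioned above: the sampling distribution of X-bar concentrates ever more tightly around μ.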
3) Efficiency:

 To define the efficiency property, suppose we have two unbiased estimators T1 and T2 of the same
parameter 𝜃. Then T1 is said to be a more efficient estimator than T2 if Var(T1) < Var(T2).
 The sample mean is more efficient than the sample median as an estimator of μ. The sample mean may
therefore be preferred.
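The efficiency comparison between the mean and the median can be demonstrated by simulation. The sketch below (assumed standard-normal population) estimates the sampling variance of both estimators; for normal data both are unbiased for μ, but the mean has the smaller variance.

```python
import random
import statistics

random.seed(7)

mu, sigma, n, reps = 0.0, 1.0, 51, 10000  # assumed population and sample size

# For each repetition, compute both estimators on the same sample.
means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.fmean(sample))
    medians.append(statistics.median(sample))

# The estimator with the smaller sampling variance is the more efficient one.
var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)
print(f"Var(mean)   ~ {var_mean:.5f}")
print(f"Var(median) ~ {var_median:.5f}")
```

For large normal samples the ratio Var(mean)/Var(median) approaches 2/π ≈ 0.64, which is why the mean is preferred when the population is normal.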
4) Sufficiency:

 An estimator is defined to be sufficient if the statistic used as an estimator uses all the
information that is contained in the sample.
 Any statistic that is not computed from all values in the sample is not a sufficient
estimator. The sample mean is a sufficient estimator of μ.
 This implies that X-bar contains all the information in the sample relative to the estimation
of the population parameter μ, and no other estimator, such as the sample median,
calculated from the same sample can add any useful information concerning μ.
