
Statistical Modelling

General Parametric Inference


Parametric Models
There are a number of standard parametric models which can be
used to model data of various kinds.
Usually individual observations, Y, are modelled by a distribution of a
given form, but with unknown parameter(s) β.
Distributions may be either

Discrete, with probability function

f(y|β) = Pr(Y = y) with ∑f(y) = 1

E(Y) = ∑ y f(y) = μ

Var(Y) = ∑ y² f(y) − μ² = σ²

Both μ and σ² are functions of β.

Or

Continuous, with density function f, such that

P(Y ≤ a) = F(a) = ∫_{−∞}^{a} f(y) dy, with

E(Y) = ∫_{−∞}^{∞} y f(y) dy = μ

Var(Y) = ∫_{−∞}^{∞} y² f(y) dy − μ² = σ²
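As a quick numerical check of the discrete formulas above, the following sketch (in Python, using an illustrative Binomial(3, 0.4) choice that is not taken from these notes) computes μ and σ² directly from the probability function:

```python
# A minimal sketch of the discrete formulas: mu = sum y*f(y) and
# sigma^2 = sum y^2*f(y) - mu^2, for a hypothetical Binomial(3, 0.4).
from math import comb

n, p = 3, 0.4                     # illustrative parameter values
f = {y: comb(n, y) * p**y * (1 - p)**(n - y) for y in range(n + 1)}

assert abs(sum(f.values()) - 1) < 1e-12           # probabilities sum to 1
mu = sum(y * fy for y, fy in f.items())           # E(Y) = sum y f(y)
var = sum(y**2 * fy for y, fy in f.items()) - mu**2   # Var(Y)

print(mu, var)    # 1.2 and 0.72, matching np and np(1-p)
```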

Some Examples
1. Uniform (a,b) 2 parameters

2. Binomial(n,p) 1 parameter

3. Poisson(λ) 1 parameter

4. Exponential(λ) 1 parameter

5. Gamma(ν, λ) 2 parameters

6. Normal(μ,σ2) 2 parameters

Some more specialised distributions, such as

7. Double Exponential

Also called “Laplacian distribution”


2
8. Logistic
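The sketch below draws samples from each of the families listed above using NumPy's random generator; all parameter values are illustrative only:

```python
# A sketch sampling from each family above; parameter values are
# illustrative, not taken from these notes.
import numpy as np

rng = np.random.default_rng(0)
m = 10_000
samples = {
    "Uniform(0, 1)":      rng.uniform(0, 1, m),
    "Binomial(10, 0.3)":  rng.binomial(10, 0.3, m),
    "Poisson(4)":         rng.poisson(4, m),
    "Exponential(2)":     rng.exponential(1 / 2, m),  # NumPy uses scale = 1/lambda
    "Gamma(3, 2)":        rng.gamma(3, 1 / 2, m),     # shape nu, scale 1/lambda
    "Normal(0, 1)":       rng.normal(0, 1, m),
    "Double Exponential": rng.laplace(0, 1, m),
    "Logistic":           rng.logistic(0, 1, m),
}
for name, x in samples.items():
    print(f"{name:22s} mean={x.mean():7.3f} var={x.var():7.3f}")
```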

Random Sample and Likelihood


A random sample is a set of n independent observations
Y1, Y2, …, Yn from the same distribution (or "population") f(y|β).
Also referred to as "iid data".
The joint probability of a set of values

[i.e. Y1 = y1, Y2 = y2, …, Yn = yn]

occurring is given, by independence, by

f(y1, y2, …, yn) = f(y1) f(y2) ⋯ f(yn)

or

∏ᵢ f(yᵢ)

or, to emphasize the dependence on the parameters,

∏ᵢ f(yᵢ|β)

This quantity, for given observed values, as a function of β is
called the likelihood function L(β) = ∏ᵢ f(yᵢ|β).

Example:
If n observations are given from a Poisson distribution with mean μ,
f(y|μ) = e^(−μ) μ^y / y!, then

L(μ) = ∏ᵢ e^(−μ) μ^(yᵢ) / yᵢ! = e^(−nμ) μ^(∑yᵢ) / ∏ᵢ yᵢ!

Ex: If n = 3 with y1 = 2, y2 = 3, y3 = 2, then ∑yᵢ = 7 and

therefore

L(μ) = e^(−3μ) μ⁷ / (2! 3! 2!) = e^(−3μ) μ⁷ / 24

as a function of μ.
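A short sketch evaluating this likelihood on a grid; it confirms that the maximum falls near the sample mean ∑yᵢ/n = 7/3:

```python
# A sketch evaluating L(mu) = exp(-3*mu) * mu**7 / 24 on a grid;
# the maximizing value lands near 7/3.
import numpy as np

mu = np.linspace(0.1, 6, 500)
L = np.exp(-3 * mu) * mu**7 / 24   # 24 = 2! * 3! * 2!

print(mu[np.argmax(L)])            # ~2.33, i.e. sum(y_i)/n = 7/3
```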

Definition: Statistics

A statistic t(Y1, …, Yn) is any quantity that can be calculated from the
data alone, i.e. it does not involve any unknown parameters.

Eg: Ȳ = (1/n)∑Yᵢ is a statistic, but (Ȳ − μ)/σ is not, as it involves the unknown parameters μ and σ.

We normally construct statistics to estimate parameters, test
hypotheses, etc.

Estimation
Here we look at the problem of estimating the parameters of a
distribution. Given data y1, …, yn, how do we obtain "good" estimates of
the parameter β that generated them?

An estimator is a statistic whose values are hopefully close to
the true parameter β.

Definition: An estimator T is called an unbiased
estimator of the parameter β if

E(T) = β    (averaged over all samples)

[Note that T = t(Y1, …, Yn) is itself a random variable.]

Eg1: Given two observations Y1, Y2 from N(μ, σ²), the following are
all unbiased estimators of μ:

Y1, as E(Y1) = μ
Y2, as E(Y2) = μ
0.3Y1 + 0.7Y2, as E(0.3Y1 + 0.7Y2) = 0.3μ + 0.7μ = μ
0.5Y1 + 0.5Y2

Eg2: Given n observations Y1, Y2, …, Yn from a Poisson distribution
with mean μ, the estimators

Ȳ = (1/n)∑Yᵢ and S² = (1/(n−1))∑(Yᵢ − Ȳ)²

are both unbiased estimators of μ (since for the Poisson distribution
the mean and the variance both equal μ).
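A simulation sketch of Eg2 (assuming the two estimators are Ȳ and S², as reconstructed above): averaged over many samples, both should land close to μ:

```python
# Simulation check: for Poisson data, Ybar and S^2 both average to mu.
import numpy as np

rng = np.random.default_rng(1)
mu_true, n, reps = 4.0, 20, 50_000

y = rng.poisson(mu_true, size=(reps, n))
ybar = y.mean(axis=1)                # Ybar for each sample
s2 = y.var(axis=1, ddof=1)           # S^2 with divisor n-1

print(ybar.mean(), s2.mean())        # both close to mu_true = 4
```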


Exercise: Let Y1, Y2, Y3 be three independent observations from a
Poisson distribution with unknown parameter µ. Show that

T1 = (Y1 + Y2 + Y3)/3

is an unbiased estimator of µ.
Solution:
Y ~ Poisson(µ), therefore E(Y) = µ and V(Y) = µ. Hence

E(T1) = (1/3)(E(Y1) + E(Y2) + E(Y3)) = (1/3)(µ + µ + µ) = µ

i.e. E(T1) = µ, and hence T1 is an unbiased estimator of µ.

Ex: Find the variance of T1.
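A simulation sketch for this exercise (again assuming T1 = (Y1 + Y2 + Y3)/3, as reconstructed above): it checks both E(T1) ≈ µ and, for the follow-up Ex, Var(T1) = 3µ/9 = µ/3:

```python
# Simulation check of E(T1) = mu and Var(T1) = mu/3 for T1 = mean of 3 obs.
import numpy as np

rng = np.random.default_rng(2)
mu_true, reps = 5.0, 200_000

y = rng.poisson(mu_true, size=(reps, 3))
t1 = y.mean(axis=1)                  # T1 for each simulated sample

print(t1.mean(), t1.var())           # ~5.0 and ~5/3, i.e. mu and mu/3
```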

Distribution of Estimator
Different samples from the same population, for the same estimator,
give different estimates of the parameter β (as the estimator is a random
variable).
Eg:

[Sketch: estimates T(y1), T(y2), T(y3) from different samples, scattered around the true value β.]

We want estimators which give good estimates a large proportion of
the time.

We look at the distribution of the estimator for characteristics
which make it a "good" estimator or a "bad" one.

We use certain criteria to choose between estimators when we have
more than one estimator of a parameter.
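The following sketch illustrates this idea: many samples are drawn (here from a hypothetical N(β, 2²) population), T = Ȳ is computed on each, and the spread of the resulting estimates around β is examined:

```python
# Looking at the distribution of an estimator: repeat the sampling many
# times and inspect how the estimates scatter around the true beta.
import numpy as np

rng = np.random.default_rng(3)
beta, n, reps = 10.0, 25, 10_000

estimates = rng.normal(beta, 2.0, size=(reps, n)).mean(axis=1)

print(estimates.mean())                        # centred near beta
print(np.quantile(estimates, [0.025, 0.975]))  # spread of the estimator
```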

Criteria for choosing an estimator

1) Unbiasedness

2) Variability (MSE, Var etc)

3) Consistency

Unbiasedness

For an estimator T of β
(i) T is unbiased if E(T) = β
(ii) bias(T) = E(T) − β

E(T) < β : negative bias
E(T) = β : zero bias (unbiased)
E(T) > β : positive bias
Note:
1) We may have several unbiased estimators of β.
2) Is an unbiased estimator always better than a biased one? Not necessarily:

Eg:

E(T1) < β (biased)
E(T2) = β (unbiased)
but V(T1) < V(T2)

Variability

Definition: Mean Square Error: MSE(T) = E[(T − β)²]

Relation to variance:
(a) MSE(T) = V(T) + [bias(T)]²

since, writing T − β = (T − E(T)) + (E(T) − β), the cross term
vanishes: E(T − E(T)) = E(T) − E(T) = 0.

(b) If T is unbiased then MSE(T) = V(T).

Using MSE:
1) Prefer T1 to T2 if T1 has the smaller MSE.
2) Can't in general find a T whose MSE is always smallest.
3) If only unbiased estimators are considered, we may find a T whose
MSE (variance) is smallest.
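A simulation sketch of the identity MSE = V + bias², comparing the unbiased S² (divisor n − 1) with the biased divisor-n variance estimator for normal data (illustrative values). It also bears on Note 2 above: here the biased version has the smaller MSE:

```python
# MSE decomposition check: MSE = Var + bias^2, for two variance estimators.
import numpy as np

rng = np.random.default_rng(4)
sigma2, n, reps = 4.0, 10, 100_000

y = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
for name, t in [("S^2 (n-1)", y.var(axis=1, ddof=1)),
                ("divisor n", y.var(axis=1, ddof=0))]:
    bias = t.mean() - sigma2
    mse = ((t - sigma2) ** 2).mean()
    print(name, bias, t.var(), mse)   # MSE ~ Var + bias^2; divisor n wins here
```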

Consistency

Definition: An estimator Tn of the parameter θ is said to be

consistent iff for any ε > 0

Pr(|Tn − θ| > ε) → 0 as n → ∞.

(chance of a bad estimate is small for large n)

To show consistency:
If MSE(Tn) → 0 as n → ∞, then Tn is consistent
(since Pr(|Tn − θ| > ε) ≤ MSE(Tn)/ε²).

Example:
A random sample of X values, X1, …, Xn, is obtained to estimate the parameter µ using X̄.
MSE(X̄) = Var(X̄) = σ²/n → 0 as n → ∞.
Therefore X̄ is a consistent estimator of µ.

Note:
If T is unbiased and V(T) → 0 as n → ∞,
then T is consistent.
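A simulation sketch of consistency in action: the proportion of samples with |X̄ − µ| > ε shrinks as n grows (illustrative values: exponential data with mean µ = 2, ε = 0.5):

```python
# Consistency check: Pr(|Xbar - mu| > eps) -> 0 as n grows.
import numpy as np

rng = np.random.default_rng(5)
mu, eps, reps = 2.0, 0.5, 20_000

for n in [10, 100, 1000]:
    xbar = rng.exponential(mu, size=(reps, n)).mean(axis=1)
    print(n, np.mean(np.abs(xbar - mu) > eps))  # proportion of bad estimates
```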
