MULTICOLLINEARITY
Amity Institute of Applied Sciences
Multicollinearity
• One of the important assumptions of the classical linear regression model (CLRM) is that there is no correlation among the explanatory variables included in the model. For example, consider the linear model

yi = β0 + β1X1i + β2X2i + ui ,  i = 1, 2, …, n

Then this assumption of the CLRM says that

Cor(X1i , X2i) = 0

that is, X1i and X2i are not correlated in any way.
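This no-correlation assumption can be checked empirically through the sample correlation of the regressors. A minimal sketch in Python, using hypothetical simulated data (NumPy assumed available):

```python
import numpy as np

# Hypothetical simulated data: two explanatory variables drawn
# independently, so the CLRM no-correlation assumption holds.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = rng.normal(size=50)

# Sample analogue of Cor(X1i, X2i); should be close to zero here.
r = np.corrcoef(x1, x2)[0, 1]
print(f"sample correlation: {r:.3f}")
```

With real data, a sample correlation near ±1 between two regressors is a first warning sign of multicollinearity.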
– The situation in which this assumption is violated is referred to as multicollinearity.
– When the explanatory variables are highly correlated, it becomes very difficult to distinguish the separate effect of each explanatory variable.
• Strictly, the term “multicollinearity” means the existence of a perfect or exact linear relationship among some or all of the explanatory variables of a regression model.
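Perfect multicollinearity can be illustrated numerically: when one regressor is an exact linear function of another, the matrix X′X is singular, so the OLS normal equations have no unique solution. A small sketch with made-up numbers (NumPy assumed):

```python
import numpy as np

# Hypothetical data: x2 is an exact linear function of x1.
x1 = np.array([1.0, 2.0, 3.0, 4.0])
x2 = 2.0 * x1 + 5.0               # perfect (exact) linear relationship

# Design matrix with an intercept column.
X = np.column_stack([np.ones(len(x1)), x1, x2])

# X'X is singular: its rank (2) is below its number of columns (3),
# so the OLS estimator (X'X)^{-1} X'y does not exist uniquely.
rank = np.linalg.matrix_rank(X.T @ X)
print(rank)
```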
Consequences of Multicollinearity
In cases of near or high multicollinearity, the following consequences are likely:
• Although the ordinary least squares (OLS) estimators are still BLUE, they have large variances and covariances, making precise estimation difficult.
• Because of this, the confidence intervals tend to be much wider, leading to readier acceptance of the null hypothesis that the true population coefficient is zero.
• The t-ratios of one or more coefficients tend to be statistically insignificant.
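The inflation of variances can be seen in a small simulation: the same model is estimated once with weakly correlated regressors and once with nearly collinear ones. This is only an illustrative sketch with simulated data (NumPy assumed; the coefficients and noise scales are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100

def slope_se(x1, x2, y):
    """Estimated standard errors of the two slope coefficients in OLS."""
    X = np.column_stack([np.ones(n), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 3)          # unbiased error variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # Var(beta_hat) under CLRM
    return np.sqrt(np.diag(cov))[1:]

x1 = rng.normal(size=n)
u = rng.normal(size=n)

# Weak collinearity: x2 shares some variation with x1 but has its own noise.
x2_low = x1 + rng.normal(size=n)
se_low = slope_se(x1, x2_low, 1.0 + 2.0 * x1 + 3.0 * x2_low + u)

# Near multicollinearity: x2 is almost an exact copy of x1.
x2_high = x1 + 0.01 * rng.normal(size=n)
se_high = slope_se(x1, x2_high, 1.0 + 2.0 * x1 + 3.0 * x2_high + u)

print("slope SEs, weak collinearity:", se_low)
print("slope SEs, near collinearity:", se_high)   # far larger standard errors
```

The wider standard errors in the second case translate directly into wider confidence intervals and smaller t-ratios, matching the consequences listed above.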
• The coefficient of determination R², i.e. the overall measure of goodness of fit of the model, can be very high, giving a misleading impression of how well the model fits.
• The OLS estimators and their standard errors can be sensitive to small changes in the data. Ideally, OLS estimates should not change much with small changes in the data.
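The first point can be illustrated with two nearly identical regressors: the regression as a whole fits well (high R²), yet the individual slope standard errors are enormous, so neither coefficient is estimated precisely. A sketch with simulated data (NumPy assumed; all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 30
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)     # nearly identical regressors
y = 1.0 + x1 + x2 + rng.normal(size=n)  # true slopes are both 1

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Overall fit is good ...
r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# ... but the individual slope standard errors are huge,
# so neither slope is estimated with any precision.
sigma2 = resid @ resid / (n - 3)
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

print(f"R^2 = {r2:.3f}")
print(f"slope standard errors = {se[1]:.1f}, {se[2]:.1f}")
```

A high R² combined with individually insignificant coefficients is a classic symptom of multicollinearity.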