Econometrics Presentation


Welcome to Our

Presentation Program

1
MULTICOLLINEARITY:
What Happens if
Explanatory Variables are
Correlated?
Submitted to

Fahad Zeya
Assistant Professor
Department of Finance & Banking
Comilla University

3
Submitted by
Name ID
Choiti Chakma (Leader) 22217049
Jahidul Islam 22217013
Momtaj Jahan 22217003
Lovely Rani Nath 22217006
Anika Sharmila 22217021
Mst. Fahmida Akter 22217029
Priya Saha 22217015
Sumaiya Akter 22217004
4
Now The Presenter is

Choiti Chakma
ID - 22217049
5
The OLS regression model:
Y = β₀ + β₁X₁ + β₂X₂ + ... + βₖXₖ + ε

According to the OLS regression assumptions, there are no exact linear relationships (no perfect correlations) among the independent variables.

What happens if the independent variables are correlated?
6
Example

SC = β₀ + β₁PM + β₂FI

• If PM increases by 1 unit, SC increases by 1 unit, keeping FI constant.
• If FI increases by 2 units, SC increases by 2 units, keeping PM constant.
• But the problem is that PM depends on FI.
• FI increases → PM increases → SC increases (a simulated sketch of this follows below).
7
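A minimal simulated sketch of this example in Python (the variable names PM, FI and SC follow the slide; the data, coefficient values, and the statsmodels library are illustrative assumptions, not part of the original example):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 100
FI = rng.normal(size=n)
PM = FI + 0.1 * rng.normal(size=n)            # PM depends closely on FI
SC = 1.0 + 1.0 * PM + 1.0 * FI + rng.normal(size=n)

X = sm.add_constant(np.column_stack([PM, FI]))
print(sm.OLS(SC, X).fit().summary())          # note the inflated standard errors on PM and FI

Because PM and FI move almost together, OLS struggles to attribute changes in SC to either of them separately.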
MULTICOLLINEARITY

When the independent variables are correlated with each other, the resulting problem is called multicollinearity.

8
Now The Presenter is

Mst. Fahmida Akter
ID - 22217029
9
Nature of Multicollinearity

1) Perfect Multicollinearity

2) Imperfect Multicollinearity

10
Perfect Multicollinearity

If two or more independent variables have an exact linear relationship between them, then we have perfect multicollinearity.

11
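As a small illustrative sketch (simulated data, not from the slides), an exact linear relationship such as X3 = 2·X2 makes the X'X matrix singular, so it cannot be inverted and the OLS estimates do not exist:

import numpy as np

rng = np.random.default_rng(0)
x2 = rng.normal(size=50)
x3 = 2 * x2                                   # exact linear relationship: perfect multicollinearity
X = np.column_stack([np.ones(50), x2, x3])

print(np.linalg.matrix_rank(X))               # 2 rather than 3: the columns are linearly dependent
try:
    np.linalg.inv(X.T @ X)                    # OLS needs this inverse
except np.linalg.LinAlgError as err:
    print("X'X is singular:", err)            # inverting a singular matrix fails (or is meaningless)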
Imperfect Multicollinearity

Imperfect multicollinearity occurs when two or more explanatory variables in a statistical model are correlated with each other, but not perfectly.

12
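A quick simulated sketch of the imperfect case (illustrative only): X3 follows X2 closely but with noise, so their correlation is high yet below 1.

import numpy as np

rng = np.random.default_rng(1)
x2 = rng.normal(size=200)
x3 = 0.9 * x2 + 0.3 * rng.normal(size=200)    # strongly related to x2, but not exactly
print(np.corrcoef(x2, x3)[0, 1])              # roughly 0.95: high, but not perfect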
Now The Presenter is

Anika Sharmila
ID - 22217021
13
Theoretical Consequences of Multicollinearity

• Under perfect multicollinearity, the OLS estimators simply do not exist.
• If you try to estimate such an equation in EViews, it will not give the estimates; it reports a singular matrix error.
• OLS estimators remain unbiased: there is no violation of the BLUE property.
• Interpretation of the coefficients is not independent: the simple interpretation is no longer valid.
• Large variances and standard errors of the coefficients.
• Insignificant t-ratios.
• High R² but low t values.
• Wrong signs of the coefficients.
14
Now The Presenter is

Lovely Rani Nath
ID - 22217006
15
Practical Consequences of Multicollinearity

ln Y = 2.0328 + 0.4515 ln X₂ − 0.3722 ln X₃
t = (17.479) (18.284) (−5.8647)
R² = 0.9801; adjusted R² = 0.9781
16
Practical Consequences of Multicollinearity
1. Large variances and standard errors of the OLS estimators.
2. Wider confidence intervals.
3. Insignificant t ratios.
4. A high R² value but few significant t ratios (illustrated in the sketch below).
5. OLS estimators and their standard errors become very sensitive to small changes in the data; that is, they tend to be unstable.
6. Wrong signs for the regression coefficients.
7. Difficulty in assessing the individual contributions of the explanatory variables to the explained sum of squares (ESS) or R².
17
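These consequences can be reproduced with a short simulation (the data and library below are illustrative assumptions, not the regression reported on the previous slide): the overall fit is excellent while the individual coefficients are poorly determined.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 30
x2 = rng.normal(size=n)
x3 = x2 + 0.05 * rng.normal(size=n)           # nearly collinear regressors
y = 1 + x2 + x3 + 0.5 * rng.normal(size=n)

fit = sm.OLS(y, sm.add_constant(np.column_stack([x2, x3]))).fit()
print(fit.rsquared)                           # high R-squared
print(fit.bse)                                # large standard errors on x2 and x3
print(fit.tvalues)                            # individual t ratios tend to be insignificant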
Now The Presenter is

Momtaj Jahan
ID - 22217003
18
Detection of multicollinearity

Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ε

(1) High R² but few significant t ratios: if R² is greater than 0.8, the F test will in most cases reject the null hypothesis that the partial slope coefficients are jointly zero, even though the individual t tests may show few of them to be significant.

(2) Pairwise correlations: if the correlation coefficient between any pair of explanatory variables is high (say r > 0.8), multicollinearity may exist. However, this is a sufficient condition, not a necessary one. (A sketch of both checks follows below.)
19
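A sketch of both checks on simulated data (the DataFrame, the column names X1, X2, Y, and the libraries are assumptions made for illustration):

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
x1 = rng.normal(size=100)
df = pd.DataFrame({"X1": x1, "X2": x1 + 0.1 * rng.normal(size=100)})
df["Y"] = 1 + df["X1"] + df["X2"] + rng.normal(size=100)

print(df[["X1", "X2"]].corr())                # pairwise r > 0.8 suggests multicollinearity
fit = sm.OLS(df["Y"], sm.add_constant(df[["X1", "X2"]])).fit()
print(fit.f_pvalue)                           # the F test rejects the joint null
print(fit.pvalues)                            # yet the individual t tests may be insignificant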
(3) Auxiliary regression: One way of finding out which X variable is closely related to the other X variables is to regress each X variable on the remaining X variables and to compute the corresponding R². Each of these regressions is called an auxiliary regression. For example, if we suspect that X₁ and X₂ are correlated, we find the coefficient of determination between them. According to Klein's rule of thumb, if Rᵢ² > R² (the auxiliary R² exceeds the overall R²), multicollinearity is present (illustrated in the sketch below).
20
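A minimal sketch of Klein's rule of thumb on simulated data (the variable names and data are illustrative assumptions):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.2 * rng.normal(size=n)            # closely related to x1
x3 = rng.normal(size=n)
y = 1 + x1 + x2 + x3 + rng.normal(size=n)

X = np.column_stack([x1, x2, x3])
main_r2 = sm.OLS(y, sm.add_constant(X)).fit().rsquared

for i in range(X.shape[1]):                   # auxiliary regressions: each X on the remaining X's
    others = np.delete(X, i, axis=1)
    aux_r2 = sm.OLS(X[:, i], sm.add_constant(others)).fit().rsquared
    print(i, aux_r2, aux_r2 > main_r2)        # True flags troublesome collinearity (Klein's rule)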
(4) Large variance inflation factor (VIF): VIF = 1 / (1 − Rᵢ²), where Rᵢ² comes from the auxiliary regression in which each independent variable is, in turn, regressed on the remaining independent variables (Y is excluded).
When Rᵢ² is 0, there is no collinearity and the VIF is 1.
When VIF > 1, the independent variables have begun to correlate; the larger the VIF, the more severe the collinearity.
21
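A sketch using the variance_inflation_factor helper from statsmodels (simulated data; the textbook threshold of VIF above about 10 is mentioned only as an informal rule of thumb):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.3 * rng.normal(size=n)            # correlated with x1
x3 = rng.normal(size=n)                       # essentially uncorrelated with the others

X = sm.add_constant(np.column_stack([x1, x2, x3]))
for i in range(1, X.shape[1]):                # column 0 is the constant, so skip it
    print(i, variance_inflation_factor(X, i)) # VIF = 1 / (1 - R_i^2); near 1 means little collinearity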
Now The Presenter is

Priya Saha
ID - 22217015
22
Problems of Multicollinearity

1. Statistical consequences of multicollinearity include difficulties in testing individual regression coefficients due to inflated standard errors.

2. Numerical consequences of multicollinearity include difficulties in the computer's calculations due to numerical instability.

23
Now The Presenter is

Jahidul Islam
ID - 22217013
24
Multicollinearity Problems: Two Remedial
Measures
1) Dropping a variable(s) from the model: The simplest solution to the multicollinearity problem is to drop one or more of the collinear variables, i.e., simply drop some of the correlated predictors. From a practical point of view, there is no point in keeping two very similar predictors in the model. But a significant problem arises if we drop one or more relevant variables: the estimated parameters may turn out to be biased.

25
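A simulated sketch of this remedy (the variable names and data are illustrative): dropping one of two nearly identical predictors shrinks the standard errors, but the remaining coefficient now absorbs the effect of the dropped variable.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)           # almost a copy of x1
y = 1 + x1 + x2 + rng.normal(size=n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
reduced = sm.OLS(y, sm.add_constant(x1)).fit()   # drop the collinear predictor x2
print(full.bse)                               # large standard errors under collinearity
print(reduced.bse, reduced.params)            # smaller standard errors; x1's coefficient is now about 2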
Multicollinearity Problems: Two Remedial Measures

2) Principal Components Analysis (PCA): Another very popular technique is to perform principal components analysis. PCA is used when we want to reduce the number of variables in our data but are not sure which variable to drop. It is a type of transformation that combines the existing predictors in a way that keeps only the most informative part.
26
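A sketch with scikit-learn's PCA (simulated data, illustrative only): the two correlated predictors are combined into one principal component, which is then used as the regressor.

import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.2 * rng.normal(size=n)
y = 1 + x1 + x2 + rng.normal(size=n)

pc1 = PCA(n_components=1).fit_transform(np.column_stack([x1, x2]))  # the most informative combination
print(sm.OLS(y, sm.add_constant(pc1)).fit().summary())              # regression on the single component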
Now The Presenter is

Sumaiya Akter
ID – 22217004
27
Collect More Data

Collecting more data can be a useful way to reduce the problem of multicollinearity. By increasing the sample size, the standard errors of the coefficients decrease, which lessens the practical consequences of multicollinearity.

28
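A small simulation of this point (purely illustrative): the same pair of correlated predictors is fitted with a small and a much larger sample, and the coefficient standard errors shrink as the sample grows.

import numpy as np
import statsmodels.api as sm

def std_errors(n, seed=0):
    rng = np.random.default_rng(seed)
    x1 = rng.normal(size=n)
    x2 = x1 + 0.2 * rng.normal(size=n)        # same degree of correlation at every sample size
    y = 1 + x1 + x2 + rng.normal(size=n)
    return sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit().bse

print(std_errors(50))                         # larger standard errors
print(std_errors(5000))                       # noticeably smaller standard errors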
Standardize the Variables

Standardizing the variables by converting them to z-scores can help reduce the multicollinearity problem. Standardizing scales each variable to have a mean of zero and a standard deviation of one, which makes the variables comparable and reduces the nonessential collinearity that arises between a variable and terms constructed from it, such as its square or interaction terms.

29
Ridge Regression

Ridge regression is a technique used to handle multicollinearity by adding a penalty term to the regression equation. This penalty term shrinks the magnitude of the coefficients of the correlated variables, which reduces the multicollinearity problem.

30
Thank You So Much

31
