
Software practical training

APPLICATION TO CROSS SECTIONAL DATA IN ECONOMETRICS


- GRAPHICS
- CROSS TABULAR ANALYSIS (for dummy variables)

- If we collect data on one variable (for example, economic growth) from different observation units (Ethiopia, Kenya, Ghana, ...) at a single point in time (in 2012 only), it is cross-sectional data, and because we collect it ourselves it is primary data.
- If we take data on one variable for different observation units at a single point in time from an existing source, it is cross-sectional data but secondary data.
- If we collect data on one variable from a single observation unit at different points in time, it is time series data.

. import excel "C:\Eco exel\consumption.xlsx", sheet("Sheet1") firstrow

(8 vars, 51 obs)

. ed

. sum

    Variable |        Obs        Mean    Std. Dev.       Min        Max
-------------+---------------------------------------------------------
 Consumption |         50      2456.4     1582.302       300       7000
      income |         50        3360     2234.676       200      10000
         age |         50       37.18     15.82002        20         80
      wealth |         50    23001.38     10370.16      2580      46210
         sex |         50         .48      .504672         0          1
   education |         50          .9     .7889544         0          2
           G |          0
           H |          0

. twoway (line Consumption income)
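For cross-sectional data, a scatter plot with a fitted line often reads better than a connected line plot. A minimal alternative sketch using the same two variables (scatter and lfit are standard twoway plot types):

. twoway (scatter Consumption income) (lfit Consumption income)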

. ztest Consumption==5000


One-sample z test
------------------------------------------------------------------------------
Variable |     Obs        Mean    Std. Err.   Std. Dev.   [95% Conf. Interval]
---------+--------------------------------------------------------------------
Consum~n |      50      2456.4     .1414214           1    2456.123    2456.677
------------------------------------------------------------------------------
    mean = mean(Consumption)                                     z = -1.8e+04
    Ho: mean = 5000

    Ha: mean < 5000          Ha: mean != 5000            Ha: mean > 5000
 Pr(Z < z) = 0.0000      Pr(|Z| > |z|) = 0.0000       Pr(Z > z) = 1.0000

. ttest Consumption==5000

One-sample t test
------------------------------------------------------------------------------
Variable |     Obs        Mean    Std. Err.   Std. Dev.   [95% Conf. Interval]
---------+--------------------------------------------------------------------
Consum~n |      50      2456.4    223.7713     1582.302    2006.715    2906.085
------------------------------------------------------------------------------
    mean = mean(Consumption)                                     t = -11.3670
    Ho: mean = 5000                             degrees of freedom =       49

    Ha: mean < 5000          Ha: mean != 5000            Ha: mean > 5000
 Pr(T < t) = 0.0000      Pr(|T| > |t|) = 0.0000       Pr(T > t) = 1.0000
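Note that the z test above reports Std. Dev. = 1 because no population standard deviation was supplied (ztest assumes a known standard deviation, 1 unless specified), which is why its z statistic is so extreme; the t test uses the sample standard deviation and is the appropriate test here. The reported t statistic can be checked by hand, t = (sample mean - 5000) / (s / sqrt(n)); a quick sketch with display:

. display (2456.4 - 5000) / (1582.302/sqrt(50))

which returns approximately -11.367, matching the t reported above.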

. regress Consumption income

Source | SS df MS Number of obs = 50


-------------+---------------------------------- F(1, 48) = 637.96

Model | 114095781 1 114095781 Prob > F = 0.0000

Residual | 8584571.19 48 178845.233 R-squared = 0.9300

-------------+---------------------------------- Adj R-squared = 0.9286

Total | 122680352 49 2503680.65 Root MSE = 422.9

------------------------------------------------------------------------------

Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]

-------------+----------------------------------------------------------------

income | .6828452 .027035 25.26 0.000 .6284877 .7372027

_cons | 162.0402 108.7583 1.49 0.143 -56.63298 380.7134

------------------------------------------------------------------------------

. cor Consumption income
(obs=50)

             | Consum~n   income
-------------+------------------
 Consumption |   1.0000
      income |   0.9644   1.0000
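In a simple regression with one regressor, R-squared equals the squared correlation between the two variables; a quick check with display:

. display 0.9644^2

which gives about 0.930, matching the R-squared reported in the regression above.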

CROSS TABULAR ANALYSIS (for dummy variables)

. tabulate sex education, cell chi2

+-----------------+
| Key             |
|-----------------|
|    frequency    |
| cell percentage |
+-----------------+

           |            education
       sex |         0          1          2 |     Total
-----------+---------------------------------+----------
         0 |         9         10          7 |        26
           |     18.00      20.00      14.00 |     52.00
-----------+---------------------------------+----------
         1 |         9          9          6 |        24
           |     18.00      18.00      12.00 |     48.00
-----------+---------------------------------+----------
     Total |        18         19         13 |        50
           |     36.00      38.00      26.00 |    100.00

          Pearson chi2(2) =   0.0496   Pr = 0.975

Inferential analysis for bivariate analysis

. corr Consumption income age wealth sex education
(obs=50)

             | Consum~n   income      age   wealth      sex educat~n
-------------+------------------------------------------------------
 Consumption |   1.0000
      income |   0.9644   1.0000
         age |  -0.1254  -0.2058   1.0000
      wealth |   0.5628   0.5607  -0.2003   1.0000
         sex |  -0.0532  -0.0595   0.0605   0.0594   1.0000
   education |   0.3108   0.4167  -0.1915   0.0683  -0.0308   1.0000


. reg Consumption income

Source | SS df MS Number of obs = 50

-------------+---------------------------------- F(1, 48) = 637.96

Model | 114095781 1 114095781 Prob > F = 0.0000

Residual | 8584571.19 48 178845.233 R-squared = 0.9300

-------------+---------------------------------- Adj R-squared = 0.9286

Total | 122680352 49 2503680.65 Root MSE = 422.9

------------------------------------------------------------------------------

Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]

-------------+----------------------------------------------------------------

income | .6828452 .027035 25.26 0.000 .6284877 .7372027

_cons | 162.0402 108.7583 1.49 0.143 -56.63298 380.7134

Logit is used when the dependent variable is a dummy; the overall model test it reports is the LR chi-square.

. logit education sex age

Iteration 0:   log likelihood =  -32.67091
Iteration 1:   log likelihood = -31.479414
Iteration 2:   log likelihood = -31.476572
Iteration 3:   log likelihood = -31.476572

Logistic regression                             Number of obs =         50
                                                LR chi2(2)    =       2.39
                                                Prob > chi2   =     0.3029
Log likelihood = -31.476572                     Pseudo R2     =     0.0366


------------------------------------------------------------------------------

education | Coef. Std. Err. z P>|z| [95% Conf. Interval]

-------------+----------------------------------------------------------------

sex | -.0693746 .6056513 -0.11 0.909 -1.256429 1.11768

age | -.028662 .019044 -1.51 0.132 -.0659875 .0086635

_cons | 1.695121 .8266621 2.05 0.040 .0748935 3.315349

. mfx
Marginal effects after logit

y = Pr(education) (predict)

= .64477923

------------------------------------------------------------------------------

variable | dy/dx Std. Err. z P>|z| [ 95% C.I. ] X

---------+--------------------------------------------------------------------

sex*| -.0158947 .1388 -0.11 0.909 -.287929 .25614 .48

age | -.0065647 .00436 -1.51 0.132 -.015108 .001978 37.18

------------------------------------------------------------------------------

(*) dy/dx is for discrete change of dummy variable from 0 to 1
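mfx is an older command. In current Stata versions the same marginal effects at the means can be obtained with margins after the logit fit; a minimal sketch:

. margins, dydx(*) atmeans

Dropping the atmeans option gives average marginal effects instead of effects at the means.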

. regress Consumption income age wealth sex education

Source | SS df MS Number of obs = 50

-------------+---------------------------------- F(5, 44) = 149.03

Model | 115840318 5 23168063.7 Prob > F = 0.0000

Residual | 6840033.62 44 155455.31 R-squared = 0.9442

-------------+---------------------------------- Adj R-squared = 0.9379

Total | 122680352 49 2503680.65 Root MSE = 394.28


------------------------------------------------------------------------------

Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]

-------------+----------------------------------------------------------------

income | .7147582 .0344614 20.74 0.000 .6453057 .7842107

age | 6.708642 3.70608 1.81 0.077 -.7604719 14.17776

wealth | .0026088 .0068365 0.38 0.705 -.0111692 .0163868

sex | -3.911833 112.778 -0.03 0.972 -231.201 223.3773

education | -196.966 81.40955 -2.42 0.020 -361.0362 -32.89581

_cons | -75.47325 230.9427 -0.33 0.745 -540.9077 389.9612

------------------------------------------------------------------------------

Heteroskedasticity

. hettest        (the regression must be run first)

Breusch-Pagan / Cook-Weisberg test for heteroskedasticity

Ho: Constant variance

Variables: fitted values of Consumption

chi2(1) = 4.73

Prob > chi2 = 0.0296
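Since Prob > chi2 = 0.0296 < 0.05, the null of constant variance is rejected. Besides transforming the variables (below) or re-estimating by robust regression, a common remedy is to keep OLS but use heteroskedasticity-robust standard errors; a minimal sketch with the same specification:

. regress Consumption income age wealth sex education, vce(robust)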

Create new variables


. gen lnConsumption =ln( Consumption)
(1 missing value generated)

. gen lnincome =ln( income )

(1 missing value generated)

. gen lnwealth =ln(wealth)

(1 missing value generated)
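The logged variables created above can be used to estimate a log-log specification, which often reduces heteroskedasticity; a sketch that logs only the positive continuous variables (an illustrative choice, not part of the original session):

. regress lnConsumption lnincome age lnwealth sex education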


. regress Consumption income age wealth sex education

Source | SS df MS Number of obs = 50

-------------+---------------------------------- F(5, 44) = 149.03

Model | 115840318 5 23168063.7 Prob > F = 0.0000

Residual | 6840033.62 44 155455.31 R-squared = 0.9442

-------------+---------------------------------- Adj R-squared = 0.9379

Total | 122680352 49 2503680.65 Root MSE = 394.28

------------------------------------------------------------------------------

Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]

-------------+----------------------------------------------------------------

income | .7147582 .0344614 20.74 0.000 .6453057 .7842107

age | 6.708642 3.70608 1.81 0.077 -.7604719 14.17776

wealth | .0026088 .0068365 0.38 0.705 -.0111692 .0163868

sex | -3.911833 112.778 -0.03 0.972 -231.201 223.3773

education | -196.966 81.40955 -2.42 0.020 -361.0362 -32.89581

_cons | -75.47325 230.9427 -0.33 0.745 -540.9077 389.9612

------------------------------------------------------------------------------

Robust regression (this is the last option)

. rreg Consumption income age wealth sex education

Huber iteration 1: maximum difference in weights = .48894981

Huber iteration 2: maximum difference in weights = .09834988

Huber iteration 3: maximum difference in weights = .04202991

Biweight iteration 4: maximum difference in weights = .15259292

Biweight iteration 5: maximum difference in weights = .02087195

Biweight iteration 6: maximum difference in weights = .00609413


Robust regression Number of obs = 50

F( 5, 44) = 139.22

Prob > F = 0.0000

------------------------------------------------------------------------------

Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]

-------------+----------------------------------------------------------------

income | .7144283 .0355928 20.07 0.000 .6426957 .7861609

age | 5.169524 3.827752 1.35 0.184 -2.544804 12.88385

wealth | .0004722 .0070609 0.07 0.947 -.0137582 .0147025

sex | -17.48285 116.4806 -0.15 0.881 -252.234 217.2683

education | -169.7503 84.08226 -2.02 0.050 -339.2069 -.2936307

_cons | 21.27041 238.5246 0.09 0.929 -459.4444 501.9852

. reg Consumption income age wealth sex education

Source | SS df MS Number of obs = 50

-------------+---------------------------------- F(5, 44) = 149.03

Model | 115840318 5 23168063.7 Prob > F = 0.0000

Residual | 6840033.62 44 155455.31 R-squared = 0.9442

-------------+---------------------------------- Adj R-squared = 0.9379

Total | 122680352 49 2503680.65 Root MSE = 394.28

------------------------------------------------------------------------------

Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]

-------------+----------------------------------------------------------------

income | .7147582 .0344614 20.74 0.000 .6453057 .7842107


age | 6.708642 3.70608 1.81 0.077 -.7604719 14.17776

wealth | .0026088 .0068365 0.38 0.705 -.0111692 .0163868

sex | -3.911833 112.778 -0.03 0.972 -231.201 223.3773

education | -196.966 81.40955 -2.42 0.020 -361.0362 -32.89581

_cons | -75.47325 230.9427 -0.33 0.745 -540.9077 389.9612

------------------------------------------------------------------------------

. predict reside, residual        (reside is the estimated error term)
(1 missing value generated)

. histogram reside
(bin=7, start=-1018.8731, width=260.80632)

. qnorm reside
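The histogram and normal quantile plot give a visual check of residual normality; a formal check can be added with the Shapiro-Wilk test, a minimal sketch:

. swilk reside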

The variable G is filled in observation by observation so that each row gets its own index number (a few mistyped values, such as G = 45 in 4, G = 151 in 15, and G = 312 in 31, were immediately corrected with a second replace):

. replace G = 1 in 1
(1 real change made)

. replace G = 2 in 2
(1 real change made)

. replace G = 3 in 3
(1 real change made)

  (the same pattern, replace G = # in #, continues through)

. replace G = 50 in 50
(1 real change made)
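Filling the index row by row works, but since G already exists in the dataset, the whole column can be set in one command using the built-in observation number _n; a minimal sketch:

. replace G = _n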

. rename G year

. tsset year
        time variable:  year, 1 to 50
                delta:  1 unit

. estat bgodfrey

Breusch-Godfrey LM test for autocorrelation


---------------------------------------------------------------------------
lags(p) | chi2 df Prob > chi2
-------------+-------------------------------------------------------------
1 | 0.654 1 0.4186
---------------------------------------------------------------------------
H0: no serial correlation
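With Prob > chi2 = 0.4186 > 0.05, we fail to reject the null of no serial correlation. Because the data are now tsset, the Durbin-Watson statistic is also available after regress; a minimal sketch:

. estat dwatson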

. linktest

Source | SS df MS Number of obs = 50


-------------+---------------------------------- F(2, 47) = 398.41
Model | 115847243 2 57923621.3 Prob > F = 0.0000
Residual | 6833109.32 47 145385.305 R-squared = 0.9443
-------------+---------------------------------- Adj R-squared = 0.9419
Total | 122680352 49 2503680.65 Root MSE = 381.29


------------------------------------------------------------------------------
Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
_hat | .9738604 .1249117 7.80 0.000 .7225704 1.22515
_hatsq | 4.03e-06 .0000185 0.22 0.828 -.0000331 .0000412
_cons | 30.54092 173.3982 0.18 0.861 -318.2912 379.373
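In the linktest output, _hatsq is insignificant (P>|t| = 0.828), which suggests no specification error. The Ramsey RESET test is another post-regression specification check; a minimal sketch (run it right after the regression of interest):

. estat ovtest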

. reg Consumption income age wealth sex education

Source | SS df MS Number of obs = 50


-------------+---------------------------------- F(5, 44) = 149.03
Model | 115840318 5 23168063.7 Prob > F = 0.0000
Residual | 6840033.62 44 155455.31 R-squared = 0.9442
-------------+---------------------------------- Adj R-squared = 0.9379
Total | 122680352 49 2503680.65 Root MSE = 394.28

------------------------------------------------------------------------------
Consumption | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
income | .7147582 .0344614 20.74 0.000 .6453057 .7842107
age | 6.708642 3.70608 1.81 0.077 -.7604719 14.17776


wealth | .0026088 .0068365 0.38 0.705 -.0111692 .0163868


sex | -3.911833 112.778 -0.03 0.972 -231.201 223.3773
education | -196.966 81.40955 -2.42 0.020 -361.0362 -32.89581
_cons | -75.47325 230.9427 -0.33 0.745 -540.9077 389.9612
------------------------------------------------------------------------------

. vif        (test for multicollinearity)

Variable | VIF 1/VIF


-------------+----------------------
income | 1.87 0.534951
wealth | 1.58 0.631211
education | 1.30 0.769051
age | 1.08 0.922924
sex | 1.02 0.979359
-------------+----------------------
Mean VIF | 1.37

When the dependent variable is a dummy variable

. import excel "C:\Eco exel\Ass. Data.xlsx", sheet("Sheet1") firstrow


(7 vars, 1,452 obs)


. reg poor hhsize food ageh sexh

Source | SS df MS Number of obs = 1,449


-------------+---------------------------------- F(4, 1444) = 214.04
Model | 123.453236 4 30.863309 Prob > F = 0.0000
Residual | 208.216881 1,444 .144194516 R-squared = 0.3722
-------------+---------------------------------- Adj R-squared = 0.3705
Total | 331.670117 1,448 .229053948 Root MSE = .37973

------------------------------------------------------------------------------
poor | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
hhsize | .0732773 .0039137 18.72 0.000 .0656001 .0809546
food | -.000682 .0000265 -25.78 0.000 -.0007339 -.0006301
ageh | .0008127 .0006396 1.27 0.204 -.0004419 .0020673
sexh | -.0112448 .0234336 -0.48 0.631 -.0572123 .0347227
_cons | .1966898 .0410728 4.79 0.000 .1161211 .2772585
------------------------------------------------------------------------------

. hettest

Breusch-Pagan / Cook-Weisberg test for heteroskedasticity


Ho: Constant variance
Variables: fitted values of poor


chi2(1) = 4.56
Prob > chi2 = 0.0327

. logit poor hhsize food ageh sexh

Iteration 0: log likelihood = -942.31957


Iteration 1: log likelihood = -529.48515
Iteration 2: log likelihood = -302.0363
Iteration 3: log likelihood = -251.87142
Iteration 4: log likelihood = -249.83261
Iteration 5: log likelihood = -249.82872
Iteration 6: log likelihood = -249.82872

Logistic regression Number of obs = 1,449


LR chi2(4) = 1384.98
Prob > chi2 = 0.0000
Log likelihood = -249.82872 Pseudo R2 = 0.7349

------------------------------------------------------------------------------
poor | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
hhsize | 1.719588 .1220925 14.08 0.000 1.480292 1.958885


food | -.0338838 .0024035 -14.10 0.000 -.0385946 -.0291731


ageh | .0226271 .0068405 3.31 0.001 .0092199 .0360342
sexh | -.277187 .2532055 -1.09 0.274 -.7734606 .2190866
_cons | -1.814925 .4459077 -4.07 0.000 -2.688888 -.9409622
------------------------------------------------------------------------------
Note: 187 failures and 2 successes completely determined.

. mfx        (marginal effects)

Marginal effects after logit


y = Pr(poor) (predict)
= .00313972
------------------------------------------------------------------------------
variable | dy/dx Std. Err. z P>|z| [ 95% C.I. ] X
---------+--------------------------------------------------------------------
hhsize | .0053821 .00195 2.77 0.006 .00157 .009194 5.79158
food | -.0001061 .00004 -2.82 0.005 -.00018 -.000032 437.37
ageh | .0000708 .00003 2.16 0.030 6.7e-06 .000135 49.3271
sexh*| -.000926 .00095 -0.98 0.329 -.002785 .000933 .725328
------------------------------------------------------------------------------
(*) dy/dx is for discrete change of dummy variable from 0 to 1


. logistic poor hhsize food ageh sexh        (logistic reports odds ratios)

Logistic regression Number of obs = 1,449


LR chi2(4) = 1384.98
Prob > chi2 = 0.0000
Log likelihood = -249.82872 Pseudo R2 = 0.7349

------------------------------------------------------------------------------
poor | Odds Ratio Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
hhsize | 5.582231 .6815485 14.08 0.000 4.394227 7.091418
food | .9666838 .0023234 -14.10 0.000 .9621407 .9712483
ageh | 1.022885 .006997 3.31 0.001 1.009263 1.036691
sexh | .7579128 .1919077 -1.09 0.274 .4614135 1.244939
_cons | .1628501 .0726161 -4.07 0.000 .0679564 .3902522
------------------------------------------------------------------------------
Note: _cons estimates baseline odds.
Note: 187 failures and 2 successes completely determined.

. probit poor hhsize food ageh sexh

Iteration 0: log likelihood = -942.31957


Iteration 1: log likelihood = -484.86014


Iteration 2: log likelihood = -298.06067
Iteration 3: log likelihood = -255.15329
Iteration 4: log likelihood = -254.59291
Iteration 5: log likelihood = -254.59268
Iteration 6: log likelihood = -254.59268

Probit regression Number of obs = 1,449


LR chi2(4) = 1375.45
Prob > chi2 = 0.0000
Log likelihood = -254.59268 Pseudo R2 = 0.7298

------------------------------------------------------------------------------
poor | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
hhsize | .9063801 .0561507 16.14 0.000 .7963268 1.016433
food | -.0178587 .0010972 -16.28 0.000 -.0200092 -.0157082
ageh | .0126743 .0039009 3.25 0.001 .0050286 .0203199
sexh | -.1681744 .1442627 -1.17 0.244 -.4509242 .1145754
_cons | -1.020861 .2536082 -4.03 0.000 -1.517924 -.523798
------------------------------------------------------------------------------
Note: 325 failures and 24 successes completely determined.

. mfx
Marginal effects after probit


y = Pr(poor) (predict)
= .00103803
------------------------------------------------------------------------------
variable | dy/dx Std. Err. z P>|z| [ 95% C.I. ] X
---------+--------------------------------------------------------------------
hhsize | .0031582 .0018 1.76 0.079 -.000369 .006685 5.79158
food | -.0000622 .00003 -1.78 0.075 -.000131 6.3e-06 437.37
ageh | .0000442 .00003 1.57 0.116 -.000011 .000099 49.3271
sexh*| -.0006644 .00073 -0.91 0.362 -.002093 .000764 .725328
------------------------------------------------------------------------------
(*) dy/dx is for discrete change of dummy variable from 0 to 1
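After fitting a binary-outcome model, its predictive accuracy can be summarized with a classification table; a minimal sketch, run for example right after the logit model above:

. estat classification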

APPLICATION TO TIME SERIES DATA IN ECONOMETRICS

Time series data is secondary data.

. import excel "C:\Eco exel\time series data for bostwana.xlsx", sheet("Sheet1") firstrow
(12 vars, 63 obs)

. tsset year
time variable: year, 1990 to 2020
delta: 1 unit


. varsoc rgdp fd ge gcf fdi

We choose the lag with the most stars (*); in this example lag 4 has the most stars.

Selection-order criteria
Sample: 1994 - 2020                                        Number of obs = 27
+---------------------------------------------------------------------------+
|lag |    LL       LR      df    p      FPE       AIC       HQIC      SBIC  |
|----+----------------------------------------------------------------------|
|  0 | -2379.53                        3.5e+70   176.632   176.703  176.872 |
|  1 | -2293.06   172.94   25  0.000   3.9e+68   172.079   172.507  173.518 |
|  2 | -2259.44   67.244   25  0.000   2.5e+68   171.44    172.225  174.08  |
|  3 | -2227.76   63.361   25  0.000   2.9e+68   170.945   172.087  174.785 |
|  4 | -2143.33   168.85*  25  0.000   2.0e+67*  166.543*  168.042* 171.583*|
+---------------------------------------------------------------------------+
Endogenous:  rgdp fd ge gcf fdi
Exogenous:  _cons
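The selected lag length is then passed to the subsequent VAR/VEC commands; a minimal sketch using the 4 lags chosen above:

. var rgdp fd ge gcf fdi, lags(1/4)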

. dfuller rgdp        (compare the test statistic with the critical values in absolute value)

Dickey-Fuller test for unit root Number of obs = 30

---------- Interpolated Dickey-Fuller ---------


Test 1% Critical 5% Critical 10% Critical


Statistic Value Value Value
------------------------------------------------------------------------------
Z(t) -0.433 -3.716 -2.986 -2.624
------------------------------------------------------------------------------
MacKinnon approximate p-value for Z(t) = 0.9045

. dfuller d.rgdp        (d. denotes the first difference)

Dickey-Fuller test for unit root Number of obs = 29

---------- Interpolated Dickey-Fuller ---------


Test 1% Critical 5% Critical 10% Critical
Statistic Value Value Value
------------------------------------------------------------------------------
Z(t) -5.025 -3.723 -2.989 -2.625
------------------------------------------------------------------------------
MacKinnon approximate p-value for Z(t) = 0.0000

. dfuller fd

Dickey-Fuller test for unit root Number of obs = 30


---------- Interpolated Dickey-Fuller ---------


Test 1% Critical 5% Critical 10% Critical
Statistic Value Value Value
------------------------------------------------------------------------------
Z(t) -1.457 -3.716 -2.986 -2.624
------------------------------------------------------------------------------
MacKinnon approximate p-value for Z(t) = 0.5548

. dfuller d.fd

Dickey-Fuller test for unit root Number of obs = 29

---------- Interpolated Dickey-Fuller ---------


Test 1% Critical 5% Critical 10% Critical
Statistic Value Value Value
------------------------------------------------------------------------------
Z(t) -7.315 -3.723 -2.989 -2.625
------------------------------------------------------------------------------
MacKinnon approximate p-value for Z(t) = 0.0000
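rgdp and fd are non-stationary in levels but stationary after first differencing, i.e. integrated of order one; the remaining variables (ge, gcf, fdi) should be tested the same way. dfuller also accepts lags() and trend options for an augmented test; a minimal sketch:

. dfuller ge, lags(1)
. dfuller gcf, trend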

. vecrank rgdp fd ge gcf fdi, trend (constant) lags (4)

Johansen tests for cointegration


Trend: constant Number of obs = 27


Sample: 1994 - 2020 Lags = 4
-------------------------------------------------------------------------------
5%
maximum trace critical
rank parms LL eigenvalue statistic value
0 80 -2248.5164 . 202.1177 68.52
1 89 -2205.9837 0.95717 117.0522 47.21
2 96 -2172.6542 0.91532 50.3933 29.68
3 101 -2157.0417 0.68541 19.1682 15.41
4 104 -2151.7289 0.32533 8.5427 3.76
5 105 -2147.4576 0.27123
-------------------------------------------------------------------------------

. var rgdp fd ge gcf fdi

Vector autoregression

Sample: 1992 - 2020 Number of obs = 29


Log likelihood = -2432.864 AIC = 171.5769
FPE = 2.75e+68 HQIC = 172.389
Det(Sigma_ml) = 5.07e+66 SBIC = 174.17

Equation Parms RMSE R-sq chi2 P>chi2


----------------------------------------------------------------
rgdp 11 4.8e+08 0.9879 2374.699 0.0000
fd 11 1.08661 0.9446 99.85629 0.0000
gcf 11 6.9e+08 0.8969 252.3678 0.0000
fdi 11 2.6e+08 0.8755 203.9881 0.0000
ge 11 1.3e+08 0.9475 523.8991 0.0000
----------------------------------------------------------------

------------------------------------------------------------------------------
| Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
rgdp |
rgdp |
L1. | 1.010628 .2050682 4.93 0.000 .6087019 1.412555
L2. | -.0381898 .2221547 -0.17 0.864 -.473605 .3972254
|
fd |
L1. | 1.07e+08 6.13e+07 1.74 0.081 -1.33e+07 2.27e+08
L2. | -6.92e+07 6.67e+07 -1.04 0.300 -2.00e+08 6.16e+07
|
gcf |
L1. | -.1774543 .1701311 -1.04 0.297 -.510905 .1559965
L2. | -.2096107 .2247472 -0.93 0.351 -.650107 .2308857
|


fdi |
L1. | -.3050753 .3459094 -0.88 0.378 -.9830453 .3728948
L2. | .416203 .3061941 1.36 0.174 -.1839265 1.016332
|
ge |
L1. | .2327599 .6129942 0.38 0.704 -.9686866 1.434206
L2. | .5692975 .7752281 0.73 0.463 -.9501217 2.088717
|
_cons | -1.66e+08 9.42e+08 -0.18 0.860 -2.01e+09 1.68e+09
-------------+----------------------------------------------------------------
fd |
rgdp |
L1. | -7.93e-10 4.60e-10 -1.72 0.085 -1.70e-09 1.09e-10
L2. | -2.40e-10 4.99e-10 -0.48 0.631 -1.22e-09 7.37e-10
|
fd |
L1. | -.1427297 .1374723 -1.04 0.299 -.4121705 .1267111
L2. | .1022119 .1496937 0.68 0.495 -.1911823 .3956062
|
gcf |
L1. | 1.29e-09 3.82e-10 3.37 0.001 5.38e-10 2.03e-09
L2. | -1.12e-09 5.04e-10 -2.23 0.026 -2.11e-09 -1.34e-10
|
fdi |


L1. | 6.36e-10 7.76e-10 0.82 0.412 -8.85e-10 2.16e-09


L2. | 3.51e-09 6.87e-10 5.10 0.000 2.16e-09 4.85e-09
|
ge |
L1. | 6.51e-09 1.38e-09 4.73 0.000 3.82e-09 9.21e-09
L2. | 4.13e-09 1.74e-09 2.37 0.018 7.17e-10 7.54e-09
|
_cons | 11.95576 2.113646 5.66 0.000 7.813092 16.09843
-------------+----------------------------------------------------------------
gcf |
rgdp |
L1. | .2508526 .2942851 0.85 0.394 -.3259356 .8276409
L2. | -.1864142 .3188052 -0.58 0.559 -.8112609 .4384326
|
fd |
L1. | -1.01e+08 8.79e+07 -1.14 0.253 -2.73e+08 7.18e+07
L2. | 8.90e+07 9.57e+07 0.93 0.352 -9.86e+07 2.77e+08
|
gcf |
L1. | .9209463 .2441482 3.77 0.000 .4424246 1.399468
L2. | -.5577192 .3225256 -1.73 0.084 -1.189858 .0744193
|
fdi |
L1. | -.1098597 .4964007 -0.22 0.825 -1.082787 .8630677


L2. | 1.104968 .4394068 2.51 0.012 .2437468 1.96619


|
ge |
L1. | -.8189481 .8796832 -0.93 0.352 -2.543095 .9051992
L2. | 1.338642 1.112499 1.20 0.229 -.8418152 3.519099
|
_cons | 3.85e+08 1.35e+09 0.29 0.776 -2.26e+09 3.03e+09
-------------+----------------------------------------------------------------
fdi |
rgdp |
L1. | .496477 .1084424 4.58 0.000 .2839338 .7090202
L2. | -.1593828 .1174779 -1.36 0.175 -.3896354 .0708697
|
fd |
L1. | 1.01e+08 3.24e+07 3.12 0.002 3.76e+07 1.65e+08
L2. | 1872828 3.53e+07 0.05 0.958 -6.73e+07 7.10e+07
|
gcf |
L1. | -.2419659 .0899672 -2.69 0.007 -.4182985 -.0656334
L2. | .1251382 .1188489 1.05 0.292 -.1078013 .3580777
|
fdi |
L1. | -.3088848 .1829208 -1.69 0.091 -.6674031 .0496335
L2. | .1846009 .1619189 1.14 0.254 -.1327544 .5019562


|
ge |
L1. | -.9186361 .3241583 -2.83 0.005 -1.553975 -.2832975
L2. | -.7909533 .4099494 -1.93 0.054 -1.594439 .0125328
|
_cons | -1.94e+09 4.98e+08 -3.90 0.000 -2.92e+09 -9.65e+08
-------------+----------------------------------------------------------------
ge |
rgdp |
L1. | .0759575 .0568896 1.34 0.182 -.035544 .1874591
L2. | -.0559748 .0616297 -0.91 0.364 -.1767667 .0648172
|
fd |
L1. | 2.97e+07 1.70e+07 1.75 0.080 -3604538 6.30e+07
L2. | -3.01e+07 1.85e+07 -1.63 0.103 -6.64e+07 6122276
|
gcf |
L1. | .0196805 .0471974 0.42 0.677 -.0728247 .1121857
L2. | -.0516715 .0623489 -0.83 0.407 -.173873 .0705301
|
fdi |
L1. | -.0428689 .0959615 -0.45 0.655 -.2309499 .1452121
L2. | .0494793 .0849437 0.58 0.560 -.1170073 .2159659
|


ge |
L1. | .9866718 .1700555 5.80 0.000 .6533692 1.319975
L2. | -.1452466 .2150621 -0.68 0.499 -.5667606 .2762674
|
_cons | 1.19e+08 2.61e+08 0.46 0.649 -3.93e+08 6.31e+08

VEC (vector error-correction model)

VEC shows both the short-run and the long-run relationships, whereas VAR shows only the short-run relationship.

. vec rgdp fd gcf fdi ge

Vector error-correction model

Sample: 1992 - 2020 Number of obs = 29


AIC = 172.4909
Log likelihood = -2462.119 HQIC = 173.0668
Det(Sigma_ml) = 3.81e+67 SBIC = 174.3297

Equation Parms RMSE R-sq chi2 P>chi2


----------------------------------------------------------------
D_rgdp 7 4.5e+08 0.6381 38.78262 0.0000
D_fd 7 1.5355 0.5194 19.97174 0.0056
D_gcf 7 6.7e+08 0.4504 18.02589 0.0119
D_fdi 7 2.9e+08 0.6321 37.80211 0.0000
D_ge 7 1.3e+08 0.3156 10.14688 0.1804
----------------------------------------------------------------


------------------------------------------------------------------------------
| Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
D_rgdp |
_ce1 |
L1. | -.0887958 .0302234 -2.94 0.003 -.1480326 -.029559
|
rgdp |
LD. | .1500726 .2199029 0.68 0.495 -.2809292 .5810743
|
fd |
LD. | 1.14e+08 5.47e+07 2.08 0.038 6434668 2.21e+08
|
gcf |
LD. | .0647085 .1985564 0.33 0.745 -.3244549 .453872
|
fdi |
LD. | -.6281035 .2546112 -2.47 0.014 -1.127132 -.1290747
|
ge |
LD. | -.6623541 .6698702 -0.99 0.323 -1.975276 .6505674
|
_cons | 1.93e+08 1.24e+08 1.56 0.119 -4.95e+07 4.36e+08


-------------+----------------------------------------------------------------
D_fd |
_ce1 |
L1. | -3.72e-10 1.04e-10 -3.59 0.000 -5.75e-10 -1.69e-10
|
rgdp |
LD. | -3.16e-10 7.54e-10 -0.42 0.675 -1.79e-09 1.16e-09
|
fd |
LD. | -.6780372 .1874461 -3.62 0.000 -1.045425 -.3106495
|
gcf |
LD. | 2.37e-09 6.80e-10 3.48 0.001 1.03e-09 3.70e-09
|
fdi |
LD. | -2.97e-09 8.73e-10 -3.40 0.001 -4.68e-09 -1.26e-09
|
ge |
LD. | 5.53e-10 2.30e-09 0.24 0.810 -3.95e-09 5.05e-09
|
_cons | -.463695 .4240684 -1.09 0.274 -1.294854 .3674639
-------------+----------------------------------------------------------------
D_gcf |
_ce1 |


L1. | -.1481328 .0449276 -3.30 0.001 -.2361893 -.0600763


|
rgdp |
LD. | .2694785 .3268891 0.82 0.410 -.3712124 .9101695
|
fd |
LD. | -1.22e+08 8.13e+07 -1.50 0.133 -2.81e+08 3.72e+07
|
gcf |
LD. | .680435 .2951573 2.31 0.021 .1019373 1.258933
|
fdi |
LD. | -1.092978 .3784836 -2.89 0.004 -1.834792 -.351164
|
ge |
LD. | -2.107858 .9957728 -2.12 0.034 -4.059537 -.1561794
|
_cons | -1.03e+08 1.84e+08 -0.56 0.575 -4.64e+08 2.57e+08
-------------+----------------------------------------------------------------
D_fdi |
_ce1 |
L1. | .044336 .0198611 2.23 0.026 .0054089 .0832631
|
rgdp |


LD. | .4069414 .1445078 2.82 0.005 .1237113 .6901714


|
fd |
LD. | 7.03e+07 3.59e+07 1.96 0.050 -97099.14 1.41e+08
|
gcf |
LD. | -.33498 .1304801 -2.57 0.010 -.5907163 -.0792437
|
fdi |
LD. | -.5302059 .1673161 -3.17 0.002 -.8581395 -.2022724
|
ge |
LD. | -.1903711 .440201 -0.43 0.665 -1.053149 .672407
|
_cons | 4.18e+07 8.13e+07 0.51 0.607 -1.18e+08 2.01e+08
-------------+----------------------------------------------------------------
D_ge |
_ce1 |
L1. | -.0024277 .0089648 -0.27 0.787 -.0199985 .015143
|
rgdp |
LD. | .043245 .0652273 0.66 0.507 -.0845982 .1710882
|
fd |


LD. | 2.69e+07 1.62e+07 1.66 0.098 -4916648 5.87e+07


|
gcf |
LD. | .0399349 .0588956 0.68 0.498 -.0754983 .1553681
|
fdi |
LD. | -.0522914 .0755225 -0.69 0.489 -.2003127 .0957299
|
ge |
LD. | .0642072 .1986961 0.32 0.747 -.32523 .4536443
|
_cons | -28153.7 3.67e+07 -0.00 0.999 -7.20e+07 7.19e+07
------------------------------------------------------------------------------

Cointegrating equations

Equation Parms chi2 P>chi2


-------------------------------------------
_ce1 4 91.20135 0.0000
-------------------------------------------

Identification: beta is exactly identified

Johansen normalization restriction imposed


------------------------------------------------------------------------------
beta | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
_ce1 |
rgdp | 1 . . . . .
fd | -8.80e-08 3.88e+08 -0.00 1.000 -7.61e+08 7.61e+08
gcf | 3.578237 .8072873 4.43 0.000 1.995983 5.160491
fdi | -6.68777 2.195395 -3.05 0.002 -10.99067 -2.384874
ge | -9.876562 2.626303 -3.76 0.000 -15.02402 -4.729103
_cons | -4.61e+09 . . . . .
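After fitting the VECM, its adequacy can be checked with the standard post-estimation diagnostics: vecstable checks the eigenvalue stability condition, veclmar tests for residual autocorrelation, and vecnorm tests residual normality; a minimal sketch:

. vecstable
. veclmar
. vecnorm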

See the PowerPoint for the rest.

Prepared by yitbarek
