
ECO 6352-701 Applied Econometrics EXERCISE 19

Prof. T. Fomby Spring 2001

Purpose: To learn how to estimate models by the Heckit Method when they are subject to sample selectivity bias. Such bias arises when the sample is truncated based on a selection rule that is correlated with the value of the dependent variable. More formally, if the error in the selection rule is correlated with the error in the regression equation of primary interest, there is selectivity bias, and least squares estimates of the original regression equation will be biased. One way to test and correct for such selectivity bias is to include the inverse Mills ratio in your regression equation. The inverse Mills ratio is a "correction term" for the bias that arises from the sample selection problem.

Here we are going to examine Examples 17.1 (Married Women's Labor Force Participation) and 17.5 (Wage Offer Equation for Married Women) in your Wooldridge textbook. The data we will be using are provided with the Wooldridge textbook as MROZ.RAW. Go to the website for this course and download the SAS program Heckit.sas. Be sure to read the comments that are interspersed throughout the program. Use the output from this program to answer the following questions.

(a) In words, describe the selectivity that can potentially occur when analyzing the wage offer equation for married women. See the Wooldridge textbook discussion in Section 17.5 entitled "Sample Selection Corrections."

(b) Using the Heckit Method, test for sample selectivity bias in the Wage Offer equation. What is the null hypothesis of the test? What is the alternative hypothesis of the test? When is it appropriate to use the least squares estimates and their standard errors obtained from the original regression equation? When is least squares inconsistent, and how do you get consistent estimates of the coefficients when selectivity bias is present? (Hint: Think Heckit equation; see the sketch following the questions.)

(c) In the Heckit equation, when the inverse Mills ratio is statistically significant (and hence there is selectivity bias), the least squares coefficient estimates are consistent but their standard errors are too small and thus overstate significance. Why is this the case? All I want is an intuitive answer. (Hint: See the comments in the Heckit.sas computer program and read Wooldridge, p. 562.)
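For concreteness, here is a minimal sketch of the two-step Heckit procedure in SAS. It is not the course's Heckit.sas program; it only illustrates the logic behind parts (b) and (c). The dataset name mroz and the variable names (inlf, nwifeinc, educ, exper, expersq, age, kidslt6, kidsge6, lwage) are assumed to follow the standard Wooldridge MROZ.RAW conventions and may differ from the course program.

   /* Step 1: probit model for labor force participation (the selection rule). */
   /* DESCENDING makes PROC LOGISTIC model P(inlf = 1).                        */
   proc logistic data=mroz descending;
      model inlf = nwifeinc educ exper expersq age kidslt6 kidsge6 / link=probit;
      output out=step1 xbeta=xb;   /* save the estimated probit index */
   run;

   /* Inverse Mills ratio evaluated at the probit index:
      lambda = phi(xb) / PHI(xb). */
   data step2;
      set step1;
      lambda = pdf('NORMAL', xb) / cdf('NORMAL', xb);
   run;

   /* Step 2: wage offer equation with lambda added. PROC REG uses only the
      workers, since lwage is missing for nonparticipants. The t test on
      lambda is the test for selectivity bias: under the null of no selection
      bias its coefficient is zero and least squares on the original equation
      is appropriate; under the alternative the coefficient is nonzero. */
   proc reg data=step2;
      model lwage = educ exper expersq lambda;
   run;

Note that when the coefficient on lambda is significant, the usual standard errors reported by PROC REG in the second stage are no longer appropriate, because lambda is itself an estimated ("generated") regressor; that is the issue you are asked to think about in part (c) and in the discussion on p. 562 of Wooldridge.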
