1. Define Discriminant Function Analysis.
Ans: Discriminant function analysis is a statistical analysis used to predict a categorical dependent variable (called a grouping variable) by one
or more continuous or binary independent variables (called predictor variables). The original
dichotomous discriminant analysis was developed by Sir Ronald Fisher in 1936.[1] It is
different from an ANOVA or MANOVA, which is used to predict one (ANOVA) or multiple
(MANOVA) continuous dependent variables by one or more independent categorical
variables. Discriminant function analysis is useful in determining whether a set of variables is
effective in predicting category membership.[2]
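Fisher's original two-group discriminant can be sketched in a few lines of Python. This is a minimal illustration on synthetic data (the group means, sample sizes, and random seed are assumptions made for the example), not a full implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two groups (the categorical grouping variable), two continuous predictors.
X_a = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
X_b = rng.normal([3.0, 3.0], 1.0, size=(50, 2))

# Fisher's linear discriminant: w = Sw^-1 (m_b - m_a),
# where Sw is the pooled within-group scatter matrix.
m_a, m_b = X_a.mean(axis=0), X_b.mean(axis=0)
S_w = (np.cov(X_a, rowvar=False) * (len(X_a) - 1)
       + np.cov(X_b, rowvar=False) * (len(X_b) - 1))
w = np.linalg.solve(S_w, m_b - m_a)

# Classify a case by projecting it onto w and comparing with the
# midpoint of the projected group means.
threshold = w @ (m_a + m_b) / 2

def classify(x):
    return int(w @ np.asarray(x) > threshold)

print(classify([0.5, 0.5]), classify([2.5, 3.5]))
```

The discriminant function here is the projection onto `w`; cases falling on either side of the midpoint are assigned to the corresponding group.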
2. Purpose of the method.
Ans: There are several purposes for DA:
To classify cases into groups using a discriminant prediction equation.
To investigate independent variable mean differences between groups formed by the dependent
variable.
To determine the percent of variance in the dependent variable explained by the independents.
To determine the percent of variance in the dependent variable explained by the independents
over and above the variance accounted for by control variables, using sequential discriminant
analysis.
To assess the relative importance of the independent variables in classifying the dependent
variable.
To discard variables that are only weakly related to group distinctions.
To test theory by observing whether cases are classified as predicted.
In DA, the independent variables are the predictors and the dependent variables
are the groups.
Essentially, discriminant analysis is interested in exactly how the groups are differentiated,
not just in whether they differ significantly (as in MANOVA).
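The classification purposes listed above (fitting a discriminant prediction equation and reporting the percentage of cases correctly classified) can be sketched with scikit-learn's LinearDiscriminantAnalysis. The data below are synthetic and the availability of scikit-learn is assumed:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Two groups of 60 cases each, three continuous predictors.
X = np.vstack([rng.normal(0.0, 1.0, size=(60, 3)),
               rng.normal(2.0, 1.0, size=(60, 3))])
y = np.repeat([0, 1], 60)  # group membership (the dependent variable)

lda = LinearDiscriminantAnalysis().fit(X, y)
accuracy = lda.score(X, y)  # fraction of cases classified correctly
print(f"{accuracy:.0%} of cases classified into the correct group")
```

`score` here is resubstitution accuracy (the same cases used for fitting), so it is an optimistic estimate of classification performance.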
In stepwise DA, the procedure first selects the independent variable that most correlates with
the dependent, then selects the second independent that most correlates with the remaining
variance in the dependent, and so on, until selecting an additional independent does not
increase the R-squared (in DA, the canonical R-squared) by a significant amount (usually
at the .05 level). As in multiple regression, there are both forward (adding variables) and
backward (removing variables) stepwise versions.
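A toy sketch of the forward stepwise idea follows. For simplicity it uses improvement in cross-validated classification accuracy as the entry criterion rather than a formal significance test on the canonical R-squared; the data, the 0.01 gain cutoff, and scikit-learn itself are assumptions of the example:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
y = np.repeat([0, 1], 60)
# Two predictors related to the groups, two pure-noise predictors.
informative = rng.normal(0.0, 1.0, size=(120, 2)) + y[:, None] * 2.0
noise = rng.normal(size=(120, 2))
X = np.hstack([informative, noise])

selected, remaining, best = [], list(range(X.shape[1])), 0.0
while remaining:
    # Try each candidate variable; keep the one giving the biggest gain.
    scores = {j: cross_val_score(LinearDiscriminantAnalysis(),
                                 X[:, selected + [j]], y, cv=5).mean()
              for j in remaining}
    j_best = max(scores, key=scores.get)
    if scores[j_best] <= best + 0.01:  # no meaningful gain: stop
        break
    best = scores[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print("selected predictors:", selected)
```

A backward version would start from all four predictors and drop the one whose removal costs the least, stopping when every removal produces a meaningful loss.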
Stepwise procedures are sometimes said to eliminate the problem of multicollinearity, but this
is misleading. The stepwise procedure uses an intelligent criterion to set order, but it certainly
does not eliminate the problem of multicollinearity. To the extent that independents are highly
intercorrelated, the standard errors of their standardized discriminant coefficients will be
inflated and it will be difficult to assess the relative importance of the independent variables.
The researcher should keep in mind that the stepwise method capitalizes on chance
associations, so the true significance levels are worse (that is, numerically higher) than the
nominal alpha reported. Thus a reported significance level of .05 may correspond to a
true alpha rate of .10 or worse. For this reason, if stepwise discriminant analysis is employed,
cross-validation is recommended.
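The value of cross-validation here can be illustrated on predictors that are pure noise: resubstitution accuracy looks better than chance because the model has capitalized on chance associations, while cross-validated accuracy does not. The data and scikit-learn are assumptions of this sketch:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 5))  # five predictors unrelated to the groups

lda = LinearDiscriminantAnalysis()
resub = lda.fit(X, y).score(X, y)             # resubstitution (optimistic)
cv = cross_val_score(lda, X, y, cv=5).mean()  # held-out folds (honest)
print(f"resubstitution accuracy {resub:.2f} vs cross-validated {cv:.2f}")
```

The gap between the two estimates is exactly the over-optimism that the text warns about; after a stepwise selection the gap is typically larger still.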