BSNL Research
Factor analysis is a method for investigating whether a number of variables of interest are
linearly related to a smaller number of unobservable factors. In the special vocabulary of
factor analysis, the parameters of these linear functions are referred to as loadings. Under
certain conditions (A1 and A2 in the text), the theoretical variance of each variable and
the covariance of each pair of variables can be expressed in terms of the loadings and the
variance of the error terms.
Factor analysis usually proceeds in two stages. In the first, one set of loadings is
calculated that yields theoretical variances and covariances that fit the observed ones as
closely as possible according to a certain criterion. These loadings, however, may not
agree with prior expectations, or may not lend themselves to a reasonable
interpretation. Thus, in the second stage, the first loadings are "rotated" in an effort to
arrive at another set of loadings that fit the observed variances and covariances equally
well, but are more consistent with prior expectations or more easily interpreted.
The communality of a variable is the part of its variance that is explained by the common
factors. The specific variance is the part of the variance of the variable that is not
accounted for by the common factors.
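The split between communality and specific variance can be sketched numerically. Assuming a standardized variable (unit variance) and a hypothetical row of loadings on two common factors, the communality is the sum of squared loadings and the specific variance is the remainder:

```python
# Hypothetical loadings of one standardized variable on two common factors.
loadings = [0.8, 0.3]

# Communality: the part of the (unit) variance explained by the common factors.
communality = sum(l ** 2 for l in loadings)  # 0.64 + 0.09 = 0.73

# Specific variance: the part not accounted for by the common factors.
specific_variance = 1.0 - communality  # 0.27
```

The two parts always add back up to the variable's total variance; here, 0.73 + 0.27 = 1.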
The correlation coefficient, a concept from statistics, is a measure of how well trends in the
predicted values follow trends in past actual values. It is a measure of how well the
predicted values from a forecast model "fit" the real-life data.
In general the correlation coefficient is a number between -1 and +1; for a forecast
model, the values of interest lie between 0 and 1. If there is no relationship
between the predicted values and the actual values, the correlation coefficient is 0 or very
low (the predicted values are no better than random numbers). As the strength of the
relationship between the predicted values and actual values increases, so does the
correlation coefficient. A perfect fit gives a coefficient of 1.0. Thus, the higher the
correlation coefficient, the better the fit.
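As a minimal sketch of this fit measure, the Pearson correlation coefficient between a model's predicted values and the actual values can be computed directly from its definition (the four data points below are made up for illustration):

```python
import math

def pearson_r(predicted, actual):
    """Pearson correlation between predicted and actual values."""
    n = len(predicted)
    mp = sum(predicted) / n
    ma = sum(actual) / n
    # Covariance of the two series (unnormalized), over the product
    # of their standard deviations (also unnormalized; the n's cancel).
    cov = sum((p - mp) * (a - ma) for p, a in zip(predicted, actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    return cov / (sp * sa)

predicted = [10.0, 12.0, 14.0, 16.0]  # hypothetical forecasts
actual = [11.0, 12.5, 13.5, 16.5]     # hypothetical observed values
r = pearson_r(predicted, actual)      # close to 1: forecasts track the data
```

A coefficient near 1 here reflects the close agreement between the two series; shuffling either list toward randomness would drive the value toward 0.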
Introduction to Multiple Regression (1 of 3)
In multiple regression, more than one variable is used to predict the criterion. For
example, a college admissions officer wishing to predict the future grades of college
applicants might use three variables (High School GPA, SAT, and Quality of letters of
recommendation) to predict college GPA. The applicants with the highest predicted
college GPA would be admitted. The prediction method would be developed based on
students already attending college and then used on subsequent classes. Predicted scores
from multiple regression are linear combinations of the predictor variables. Therefore, the
general form of a prediction equation from multiple regression is: