Assessing Readiness For E-Learning
Ryan Watkins, The George Washington University
Doug Leigh, Pepperdine University
Don Triner, United States Coast Guard
St. Louis Community College
http://www.stlcc.cc.mo.us/distance/assessment/
Twenty checkbox questions that are randomly arranged. Ten of these are positive indicators and the other ten are the opposite. Prediction of suitability is based on the number of positive responses as compared to the number of negative responses.

Suffolk County Community College
http://www.sunysuffolk.edu/Web/VirtualCampus/
Twelve multiple-choice questions with three answer options. General feedback and a prediction are given to the respondent without assigning a score. However, there are explanations for twelve of the dimensions or questions asked.

Tallahassee Community College
http://www.tcc.cc.fl.us/courses/selfassess.asp
Introduces seven elements or facets of a successful online distance learner. Ten multiple-choice questions with three answer options follow the survey. No submission is required; the respondent is given guidelines for marking his or her own score based on the number of responses, with option "a" being most suitable for distance learning, "b" somewhat suitable, and "c" not suitable.

University of Phoenix / Petersons.com Distance Learning
http://iiswinprd03.petersons.com/dlwizard/code/default.asp
Six multiple-choice questions, each with four options. Three standard feedback questions advise or forewarn respondents about what distance education entails.

WEBducation.com
http://webducation.com/free/learningquiz.asp
Eleven multiple-choice questions with four options to determine what kind of learner the respondent is by counting the number of R's, K's, V's, and A's. Each letter corresponds to a different mode of learning: R = Reading/Writing, K = Kinesthetic, V = Visual, A = Auditory.

Capella University
http://www.capella.edu/reborn/html/index.aspx
Six multiple-choice questions concerning time schedule, convenience, time commitment, computer, reading, and discipline; six Yes/No questions regarding independence and collaboration; and four Yes/No questions regarding class interactions.

Florida Gulf Coast University
http://www.fgcu.edu/support/techskills.html
Forty-seven multiple-choice questions, most of which are Yes/No, covering mainly technology skills. Skills assessed include basic computer operation and concepts, word processing, Internet/Web/Web Board, email, library skills, and computer and Internet accessibility.
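The WEBducation.com scoring scheme described above amounts to a simple letter tally: each answer maps to R, K, V, or A, and the most frequent letter names the dominant learning mode. A minimal sketch, in which the example answer string is invented for illustration:

```python
from collections import Counter

MODES = {"R": "Reading/Writing", "K": "Kinesthetic",
         "V": "Visual", "A": "Auditory"}

def dominant_mode(answers):
    """Return the learning mode whose letter occurs most often.
    answers: one letter (R, K, V, or A) per question answered."""
    letter, _ = Counter(answers).most_common(1)[0]
    return MODES[letter]

# Eleven invented answers, one per quiz question:
print(dominant_mode(list("RKVVRAVVKAV")))  # -> Visual
```

Ties between letters are not addressed in the description above; this sketch simply returns whichever tied letter was seen first.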
Table 2
Cronbach’s Alpha Coefficients for Each Subscale Within the
Initial Self-assessment (i.e., sample one)
Table 3
Eigenvalues After Varimax Rotation
No.   Eigenvalue   Individual Percent   Cumulative Percent   Scree Plot
1     3.178536     16.11                16.11                ||||
2     2.929239     14.84                30.95                |||
3     3.120172     15.81                46.76                ||||
4     5.145354     26.07                72.83                ||||||
5     1.955438      9.91                82.74                ||
6     1.845009      9.35                92.09                ||
Note: Eigenvalues under 1.0 are not reported for the sake of brevity.
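Under the conventional reading of such a table, the Cumulative Percent column is simply a running sum of the Individual Percent column, rounded to two decimals. A minimal sketch using the values from Table 3:

```python
# Individual Percent values as printed in Table 3:
individual_pct = [16.11, 14.84, 15.81, 26.07, 9.91, 9.35]

cumulative = []
running = 0.0
for pct in individual_pct:
    running += pct
    cumulative.append(round(running, 2))

print(cumulative)  # -> [16.11, 30.95, 46.76, 72.83, 82.74, 92.09]
```

The result matches the printed Cumulative Percent column, confirming the table's internal arithmetic.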
…technical problems with the online version of the instruments.

Discussion

After completing analyses using data from the second sample, it was determined that a third version of the instrument could be developed that would demonstrate the desired internal consistency necessary for continuing research. The revised instrument (see Appendix) included the integration of items in the scales for "Technology Skills and Online Relationships." Although items in the subscale of "Internet Discussions" had a Cronbach's Alpha coefficient below the desired .80, the data analysis showed only marginal benefits from deleting individual items, so those items were retained in the revised instrument with a few changes in wording.

Unfortunately, data collected to support the external validity of the instrument could not be analyzed due to technical problems. Continuing efforts to obtain a fourth sample were not within the scope of the study. As a result, the study concluded with evidence supporting only the internal consistency of items within the E-learner Readiness Self-assessment: a necessary (but not sufficient) step in demonstrating the overall validity and utility of the measure. The revised E-learner Self-Assessment demonstrated characteristics of internal consistency that make it an appropriate candidate as an instrument for continued research regarding its external utility (i.e., predictability). The researchers plan to continue their research into the validation of the instrument by evaluating its ability to predict performance in a wide range of e-learning experiences.

This study of the initial and revised instruments does, however, provide evidence that the questions used consistently measure the desired scales that were initially derived from the e-learning literature. Consequently, future versions of the E-learning Readiness Self-assessment may provide practitioners and researchers with a valid and reliable instrument for measuring the readiness of learners for success in the online classroom.
1 = Completely Disagree
2 = Strongly Disagree
3 = Not Sure
4 = Strongly Agree
5 = Completely Agree
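The internal-consistency statistic referenced throughout (Cronbach's alpha, as reported in Table 2) can be computed directly from item responses on the five-point scale above. A minimal sketch, assuming the standard formula alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores); the example responses are invented:

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one list per questionnaire item, each holding the same
    respondents' 1-5 scores in the same respondent order."""
    k = len(items)                     # number of items in the subscale
    n = len(items[0])                  # number of respondents
    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(item[r] for item in items) for r in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Two perfectly correlated items give the maximum alpha of 1.0:
print(cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]))  # -> 1.0
```

A subscale would meet the .80 criterion mentioned in the Discussion when this value is at or above 0.80.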