

Performance Improvement Quarterly, 17(4), pp. 66-79

Assessing Readiness for E-Learning

Ryan Watkins
The George Washington University

Doug Leigh
Pepperdine University

Don Triner
United States Coast Guard

ABSTRACT

Today, e-learning is a common delivery medium for education and training within many organizations. Yet, while both the supply of and demand for e-learning opportunities have risen in recent years, many professionals are beginning to question whether e-learners are prepared to be successful in an online learning environment (e.g., Guglielmino & Guglielmino, 2003; Watkins & Corry, 2005). After all, a learner's demonstrated success in a conventional education and training classroom may not be an adequate predictor of success in an e-learning classroom. One way of gauging a potential online learner's readiness is through self-assessment. As a first step in defining an instrument that measures an e-learner's readiness, this study, conducted with the cooperation of volunteer participants from the U.S. Coast Guard, examined the validity and internal consistency of items within a self-assessment of e-learning readiness that is under development, and provided data for the continuing development and revision of the instrument. Having demonstrated evidence of internal consistency and construct validity, the self-assessment now provides a tool for continuing research into the prediction of e-learning performance.

Introduction

Since the upsurge of Internet use in the mid-1990s, the characteristics of distance education (particularly in economically developed countries) have changed substantially. Today, earlier modes of distance education (e.g., correspondence courses, radio-based courses, videotaped lectures) are being updated or replaced with Internet-based learning opportunities or other computer-based modes of delivery (NCES, 1999). During the Fall semester of 2002, for example, more than 1.6 million college students took at least one online course, and more than half a million of those students were completing their degrees entirely online (The Sloan Consortium, 2004).

The movement toward online learning (or "e-learning") has not been limited to institutions of higher education. Today, many private sector organizations use the Internet to deliver training (American Society for Training and Development, 2003; Stephenson, 2003).


According to the American Society for Training and Development (ASTD), the delivery of training programs via e-learning platforms (for example, DVD, CD-ROM, Internet) increased to rates as high as 29% in 2002 and 31% in 2003 across a variety of business sectors (ASTD, 2003). And while the use of the Internet for e-learning delivery ranged from 32% to 74% across organizations in 2003 (depending on the business sector), questions remain regarding both the instructional soundness of e-learning and the readiness of learners to engage in online learning environments.

Research Problem and Purpose

The predominance of e-learning as a delivery method for instruction is well documented. Today's learners consist of those who have likely had experience in traditional classroom environments, but may not have experience in online learning situations. Despite this predominance of e-learning in both the literature and within instructional settings, measures of learners' readiness for the new environment are rarely assessed for their internal consistency and external utility.

The purpose of this study was to support the development of an instrument to measure an individual's perceived readiness to engage in e-learning. Specifically, the research sought to obtain empirical evidence of the construct validity and internal consistency of the instrument. While the development of questions based on a literature review is a necessary, but not sufficient, step for defining the ingredients of an effective measure, a subsequent step is to determine the internal consistency of the recommended items. This determination is then a precursor to future studies of the predictability (i.e., external validity) of the measure.

In cooperation with participants from the United States Coast Guard (USCG), this study used the statistical relationships among the questions included on the self-assessment to determine the extent to which the measure yields similar results among its different sections as it measures a single phenomenon. Based on these results an updated instrument is proposed, and discussion of an initial effort to determine the predictability of the instrument is included. As a result of technical problems, however, only recommendations for future research on the ability of the instrument to predict learner success can be offered at this time.


Background

From both pedagogical and technical perspectives, the relatively young field of distance education is swiftly changing. While technical advances lead to new teaching strategies, the growing experience of online instructors oftentimes generates the development of new software and hardware to facilitate the demands of the online classroom. This cycle of change in an evolving field of practice, however, continually challenges researchers to validate theories and concepts across a range of media, technologies, and teaching strategies. Meeting these challenges has been especially difficult for those looking to validate a measure for predicting learner success.

In 2002, Lim, Eberstein, and Waint suggested that research related to the readiness of learners to adapt to the online learning environment has not kept pace with the changes in the field of distance education. With new technologies quickly changing the capacity of Internet-based courses to support increased social support, interactivity, and simulation experiences, it is no wonder that researchers have not been able to adequately measure learner readiness. With each new advance in technology, researchers are continually challenged to expand their concepts of what knowledge, skills, and attitudes are necessary for success in e-learning.

Research on e-learner performance has most often focused on relationships between performance in the online environment and specific learner characteristics, including, for example, self-directed learning (Anderson, 1993; Pachnowski & Jurczyk, 2000), learning styles (Aragon, Johnson, & Shaik, 2000), and study habits (Chan, Yum, Fan, Jegede, & Taplin, 1999). And while many of these characteristics have demonstrated ties with success using specific technologies and/or media, predicting more general success continues to be challenging.

The limitations of past research should not suggest, however, that e-learners are without an assortment of surveys and questionnaires available to offer guidance in determining their readiness for online learning. A selection of online assessments of readiness for e-learning is presented in Table 1.

These, and other online assessments of readiness for e-learning, can be a valuable tool for organizations or individuals looking to Internet-based courses as an opportunity to expand knowledge and skills. The development of an assessment instrument of e-learner readiness, with evidence supporting its validity (first internal consistency and then external utility), could influence the retention and success rates of e-learners in a variety of online learning experiences in both higher education and training institutions.

Carr (2000) suggests "no national statistics exist yet about how many students complete distance programs or courses, but anecdotal evidence and studies by individual institutions suggest that course-completion and program-retention rates are generally lower in distance-education courses than in their face-to-face counterparts" (p. A39). And while research has not verified a significant problem with retention rates in online education or training programs, any decline in retention rates stands to strain many organizations' already limited resources.


Table 1
Sample of Online Distance Learning Self-Test Instruments
(derived from Lim, Eberstein, and Waint, 2002)

The Community College of Baltimore County
http://www.bccc.state.md.us/metadot/index.pl?iid=2259&isa=Category&op=show
Quantitative score for each of three response options to ten questions. Yields an estimate of the degree to which distance learning courses fit an individual's circumstances and lifestyle.

St. Louis Community College
http://www.stlcc.cc.mo.us/distance/assessment/
Twenty checkbox questions that are randomly arranged. Ten of these are positive indicators and the other ten are the opposite. Prediction of suitability is based on the number of positive responses as compared to the number of negative responses.

Suffolk County Community College
http://www.sunysuffolk.edu/Web/VirtualCampus/
Twelve multiple-choice questions with three answer options. General feedback and prediction is given to the respondent without assigning a score. However, there are explanations for twelve of the dimensions or questions asked.

Tallahassee Community College
http://www.tcc.cc.fl.us/courses/selfassess.asp
Introduces seven elements or facets of a successful online distance learner. Ten multiple-choice questions with three option-answers follow the survey. No submission is required, as the respondent is given guidelines for marking his score based on the number of responses, with option "a" being the most suitable for distance learning, "b" being somewhat suitable for distance learning, and "c" being not suitable.

University of Phoenix (Petersons.com Distance Learning)
http://iiswinprd03.petersons.com/dlwizard/code/default.asp
Six multiple-choice questions, each having four options. Three standard feedback questions advise or forewarn respondents about what distance education entails.

WEBducation.com
http://webducation.com/free/learningquiz.asp
Eleven multiple-choice questions with four options to determine what kind of learner the respondent is by counting the number of R's, K's, V's, and A's. Each of these letters corresponds to a different mode of learning: R = Reading/Writing, K = Kinesthetic, V = Visual, A = Auditory.

Capella University
http://www.capella.edu/reborn/html/index.aspx
Six multiple-choice questions concerning time schedule, convenience, time commitment, computer, reading, and discipline; six Yes/No response-type questions regarding independence and collaboration; four Yes/No response-type questions regarding class interactions.

Florida Gulf Coast University
http://www.fgcu.edu/support/techskills.html
Forty-seven multiple-choice questions, most of which are Yes/No response-type, covering mainly technology skills. Skills assessed include basic computer operation and concepts, word processing skills, Internet/Web/Web Board, email, library skills, and computer and Internet accessibility.


One of many obstacles in predicting learner success is defining "success." Success for distance education can be viewed from multiple perspectives, each having its own definition and criteria. For example, for supervisors of online learners the definition of success would likely include the learner's ability to improve productivity and time off-the-job; while for the e-learner, success may be defined through positive interactions with peers and the instructor, or the capacity to apply skills in future positions rather than their current job (see Watkins, 2004). These diverse perspectives on success, including those of the instructor, client, and client's clients, are all critical to the success of an e-learning initiative. Nevertheless, as a self-assessment, the proposed instrument focused on individual (i.e., learner) achievements within a broad e-learning context.

The success of e-learning as an alternative or supplement to classroom instruction requires many changes to our currently accepted mindsets regarding education or training (see Kaufman, Watkins, & Guerra, 2002). These changes include, but are not limited to, our perceptions regarding the responsibilities of educators and trainers to ensure that learners are adequately prepared to be successful in the learning environments we create. The validation of an e-learner readiness assessment instrument is one of the first steps required for research in this area to provide practitioners with tools for improving both individual and organizational performance through useful e-learning experiences.

As a result of this context, the Online Learner Self-Assessment was initially developed to provide potential e-learners with a quick, yet comprehensive, analysis of their readiness for success in an online learning environment (Watkins, 2003). The intended audience for the instrument is individuals without previous e-learning experience, and as a result, the instrument requires the self-evaluation of participants on future behaviors. The preliminary assessment instrument provided an initial blend of questions that potential e-learners should ask in determining if they are ready to be successful in online education or training. The instrument was considered a useful guide for assisting individuals in determining their readiness for e-learning, as well as identifying practical study and technical skills that should be developed prior to enrolling in online courses.

In 2003, funding for research into the initial validation of an e-learner readiness self-assessment was provided by the International Society for Performance Improvement.

Method

Participants
To obtain evidence regarding the internal validity of the Online Learner Readiness Self-Assessment instrument, the research team obtained three samples of volunteer participants from enlisted personnel of the U.S. Coast Guard (USCG). Participants were attending either "boot camp" or rate training and were not required to be actively enrolled in an online course.

To maintain anonymity, specific identifiers were not collected during the research.


However, in 2004 the Human Resource Research Organization reported that 84% of the male recruits and 82% of the female recruits of the USCG self-identified as "white" in 2002, with 4% of males and 7% of females self-identifying as "Black," and 8% of males and 5% of females self-identifying as "Hispanic." Of those recruits, 89% had earned a high school diploma and 11% had earned a GED or other alternative high school diploma. USCG recruits were primarily between the ages of 20 and 24 (35%), with 7% being 17 to 19, 21% being 25 to 29, 14% being 30 to 34, and the remainder (23%) being over the age of 34.

The assessment of internal consistency relies on the relationships of answers within the measure and its subscales to determine if similar results among its different sections are measures of a single construct. Nine hundred thirty-six participants were included in the first two samples of the study: 436 and 500 participants, respectively. It is typically recommended that validation studies have 10 participants for each item in the scale (DeVellis, 2003). As the initial measure consisted of 40 items, requiring at least 400 participants by that guideline, the sample sizes were deemed sufficient.

The third sample consisted of 15 volunteer USCG personnel who had recently completed basic training and were receiving their rate (i.e., job specific) training prior to assignment. Participants in this final group received the revised instrument and a measure of self-defined e-learning success. They were required to be enrolled in training that was at least partially delivered using online technologies, but were not required to have e-learning experience. The demographics of USCG personnel in this sample were similar to those of the first two samples, with the only notable difference being the additional experience of having completed basic training.

Self-assessment Instruments
The initial self-assessment instrument (Watkins, 2003) consisted of 40 statements related to readiness for e-learning success, which were grouped into 10 scales (e.g., technology access, technology skills, online readings, Internet chat). For each statement participants completed a 5-point Likert-type scale response ranging from "completely disagree" to "completely agree" with the statement.

Sample statements included: "I can send an email with a file attached"; "I think that I would be able to communicate effectively with others using online technologies (e.g., email, chat)"; "I think that I would be able to take notes when reading papers on the computer"; and "I require more time than most others to prepare responses to a question."


The statements solicited both declarations about participants' abilities to perform common e-learning tasks as well as predictive statements regarding the application of their skills in areas in which they may not have previous experience (e.g., online synchronous chat, use of online audio). Since many participants in the study had no previous experience as e-learners, the readiness assessment required participants to anticipate their success in applying skills in the online environment.

Based on the initial data collection and analyses, a revised instrument was developed. The revised instrument consisted of 27 statements and the same 5-point Likert-type scale response format. Thirteen questions from the initial instrument were removed, and multiple questions were re-written to improve their ability to communicate the constructs they represented. Given the number and characteristics of the changes to the initial instrument, a second sample group of participants was used to provide evidence regarding the internal consistency of the revised instrument.

In addition, a second survey was developed to assess perceptions of performance in actual e-learning experiences. This survey permitted participants to self-define their "success" in completing e-learning experiences, and was delivered completely online. The survey consisted of seven questions: five multiple-choice items and two open-response items. This additional survey was provided only to the third, and smaller, sample of USCG personnel, who were experienced students in the e-learning environment.

Procedure
The first sample of participants was asked to complete the initial online learner readiness instrument using a paper copy of the assessment and a scantron form (see Watkins, 2003). The initial assessment included 40 statements related to learner readiness measured on a 5-point Likert-type agreement scale. The questions were developed based on a review of the literature. The face validity of the items was assessed through reviews by colleagues at three universities.

Data from the completed assessments with the first sample were used to perform both an item and factor analysis. These analyses examined the individual items and groups of items of the initial instrument, providing information for the determination of which subset(s) of questions would best provide a valid assessment of the desired construct.

Following the revisions to the initial instrument based on the item and factor analysis, participants from the second sample completed the revised instrument. The resultant data were used to complete a second item and factor analysis of the revised instrument.

The third, and final, sample completed the second revision to the instrument. Participants in the third sample completed both the instrument used with the second sample and the self-perceived performance survey using an online data collection site. Data from the second assessment (self-perceived performance upon completion of an online experience) were intended to be used to examine the relationship of readiness with self-perceived performance in an actual online learning experience, but had to be discarded due to a technical error that corrupted the data and created inconsistencies that could not be remedied by the researchers.
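The analyses reported below operate on composite scale scores built from these Likert responses. As a rough sketch of that preparatory step (the paper does not publish its analysis code), the fragment below assembles hypothetical scantron data into per-scale composites; the array layout, column positions, and function name are illustrative assumptions, though the item counts per scale follow Table 2.

```python
import numpy as np

# Hypothetical layout (an assumption, not from the paper): `responses` is a
# (participants x 40) array of 1-5 Likert codes keyed from the scantron forms,
# with column order matching the instrument's 40 statements. Item counts per
# scale follow Table 2; the exact column positions are illustrative.
SCALES = {
    "Technology Access": range(0, 4),       # 4 items
    "Technology Skills": range(4, 9),       # 5 items
    "Online Relationships": range(9, 15),   # 6 items
    "Motivation": range(15, 19),            # 4 items
    # ...the remaining six scales are omitted here for brevity
}

def scale_scores(responses: np.ndarray) -> dict:
    """Composite (summed) score per participant for each scale."""
    return {name: responses[:, list(cols)].sum(axis=1)
            for name, cols in SCALES.items()}

# Example with simulated participants in place of the USCG data
demo = np.random.default_rng(1).integers(1, 6, size=(436, 40))
print({name: score[:3] for name, score in scale_scores(demo).items()})
```

Composites of this kind are the "overall composite score of each scale" against which the item-total statistics described in the next section are computed.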


Results

Data Analysis: Sample One
Using data from the first sample, Cronbach's Alpha coefficient for each scale within the instrument was used to determine the strength of the relationship among the items within each scale. The Alpha coefficient is based on the average correlation of items within the scale and/or subscales of the measure. Based on the coefficients, determinations were made regarding which scales should be combined or removed, as well as which scales could benefit from the addition or subtraction of items. An Alpha value of at least 0.8 was selected as the minimal value for accepting the items in a category (Kehoe, 1995). The Cronbach Alpha coefficients for the instrument's scales are presented in Table 2.

Based on this data analysis, items were selected for inclusion, modification, or deletion from the next version of the instrument. As part of this analysis, item-total statistics were calculated, which provided indications of the relationship of individual items to the overall composite score of each scale within the instrument. These item-total statistics included the corrected item-total correlation, the squared multiple correlation, and the Alpha if a respective item were deleted. Items for which the "Alpha if item deleted" was greater than the overall Alpha of the scale were earmarked for possible modification or deletion from the instrument.

Several scales and individual items of the instrument were revised based on this analysis. Evidence of increases in Cronbach's Alpha within a scale when an item was removed was used to justify reducing the instrument from 40 questions to 27 questions.
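Cronbach's Alpha and the "Alpha if item deleted" statistic described above can be computed directly from the item variances and the variance of the composite score. The sketch below, with simulated responses standing in for the USCG data, is a minimal illustration of the standard formula, not the authors' actual code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's Alpha for a (respondents x items) matrix of Likert scores."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the composite score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(items: np.ndarray) -> list:
    """Alpha recomputed with each item removed in turn; a value above the
    full-scale Alpha flags an item that weakens the scale."""
    return [cronbach_alpha(np.delete(items, i, axis=1))
            for i in range(items.shape[1])]

# Demonstration on simulated 5-point responses for a 4-item scale
rng = np.random.default_rng(0)
trait = rng.normal(size=(500, 1))  # a single shared construct
scores = np.clip(np.rint(3 + trait + rng.normal(scale=0.8, size=(500, 4))), 1, 5)
full = cronbach_alpha(scores)
print(f"scale alpha = {full:.2f}")
for i, a in enumerate(alpha_if_deleted(scores), start=1):
    print(f"item {i}: alpha if deleted = {a:.2f} ({'review' if a > full else 'keep'})")
```

Items whose "Alpha if deleted" exceeds the full-scale Alpha are exactly those the authors earmarked for modification or deletion; repeated application of that rule is what reduced the instrument from 40 to 27 questions.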

Table 2
Cronbach's Alpha Coefficients for Each Subscale Within the
Initial Self-assessment (i.e., sample one)

Self-assessment category         Number of Items    Cronbach's Alpha
Technology Access                       4                 0.88
Technology Skills                       5                 0.88
Online Relationships                    6                 0.86
Motivation                              4                 0.82
Online Readings                         3                 0.72
Online Video/Audio                      3                 0.85
Internet Chat                           3                 0.82
Discussion Boards                       3                 0.05
Online Groups                           3                 0.73
Importance to your success              6                 0.79



Data Analysis: Sample Two
An exploratory factor analysis of the data from the second sample was then conducted to discern the underlying factors of the relationships found among the items included in the revised instrument. After analysis, the consistency (i.e., communality, as represented statistically by Eigenvalues or coefficients) of the constructs was used to identify how well each variable was predicted by the remaining items in the assessment¹. Eigenvalues over 1.00 customarily suggest the number of factors to analyze (Gorsuch, 1983). Table 3 presents the Eigenvalues along with a scree plot of the data. Both suggest that the revised instrument was best considered a measure of six discrete constructs.

Based on the results of the exploratory factor analysis, the results of the "Technology Skills" and "Online Relationships" scales were considered representative of a single construct. After reviewing the items within each scale of the instrument, the researchers concluded that items related to technical skills (e.g., using email) could be combined with items related to the content of the communications using the technology (e.g., building online working relationships). As a result, in the revised instrument these questions are grouped into a single category of Online Skills and Relationships.

In addition to the exploratory factor analysis, Cronbach's Alpha coefficients were again calculated for each item of the revised instrument completed by the second sample. This process resulted in reducing the interpretation of the 27-statement instrument to just six factors (cited in Table 4).

Data Analysis: Sample Three
The final data set collected and reviewed in the study came from the third sample of USCG participants. These participants were experienced in online training and education, and enrolled in USCG rate training. The participants in the third sample completed both the revised e-learning readiness self-assessment used with the second sample, as well as a seven-item survey of their self-perceived performance in e-learning. Data from the third instrument were not analyzed, however, due to technical problems with the online version of the instruments.

Table 3
Eigenvalues After Varimax Rotation

No.   Eigenvalue   Individual Percent   Cumulative Percent   Scree Plot
1     3.178536          16.11                 16.11          ||||
2     2.929239          14.84                 30.95          |||
3     3.120172          15.81                 46.76          ||||
4     5.145354          26.07                 72.83          ||||||
5     1.955438           9.91                 82.74          ||
6     1.845009           9.35                 92.09          ||
Note: Eigenvalues under 1.0 are not reported for the sake of brevity.



Table 4
Cronbach's Alpha Coefficients for Each Subscale Within the
Revised Instrument

Self-assessment category          Number of Items    Cronbach's Alpha
Technology Access                        3                 0.95
Online Skills and Relationships          9                 0.95
Motivation                               3                 0.88
Online Audio/Video                       3                 0.90
Internet Discussions                     4                 0.74
Importance to your success               5                 0.86

Discussion

After completing analyses using data from the second sample, it was determined that a third version of the instrument could be developed that would demonstrate the internal consistency necessary for continuing research. The revised instrument (see Appendix) included the integration of items in the scales for "Technology Skills" and "Online Relationships." Although the items in the subscale of "Internet Discussions" were identified with a Cronbach Alpha coefficient less than the desired .8, since only marginal benefits from deleting individual items were evident from the data analysis, those items were included in the revised instrument with a few changes in wording.

Unfortunately, data collected to support the external validity of the instrument could not be analyzed due to technical problems, and continuing efforts to obtain a fourth sample were not within the scope of the study. As a result, the study concluded with evidence supporting only the internal consistency of items within the E-learner Readiness Self-assessment: a necessary (but not sufficient) step in demonstrating the overall validity and utility of the measure. The revised E-learner Self-Assessment demonstrated characteristics of internal consistency that make it an appropriate candidate for continued research regarding its external utility (i.e., predictability). The researchers plan to continue the validation of the instrument by evaluating its ability to predict performance in a wide range of e-learning experiences.

This study of the initial and revised instrument does, however, provide evidence that the questions used consistently measure the desired scales that were initially derived from the e-learning literature. Consequently, future versions of the E-learning Readiness Self-assessment may provide practitioners and researchers with a valid and reliable instrument for measuring the readiness of learners for success in the online classroom.



Continued research in several areas is necessary for the continued refinement of an effective self-assessment of readiness. For example, the distinction between "Technology Skills" and "Online Relationships" is worthy of further exploration. Likewise, additional data on the relationship of scale scores with both perceived and actual success from multiple perspectives (e.g., supervisor, learner, and instructor) could be essential in supporting the broad use of the instrument as a predictive and prescriptive tool for those considering e-learning opportunities. Lastly, as research into e-learning continues, other potential constructs related to new technologies and online teaching strategies should be pursued as prospective items for the instrument.

Funding for this study was provided by the International Society for Performance Improvement.

Notes
1. Communality is similar to the R-squared value that would be achieved if a variable were regressed on the retained factors, though calculations are based on an adjusted correlation matrix.

References
American Society for Training and Development (2003). ASTD 2003 state of the industry report executive summary. Retrieved March 31, 2004, from http://www.astd.org/NR/rdonlyres/6EBE2E82-1D29-48A7-8A3A-357649BB6DB6/0/SOIR_2003_Executive_Summary.pdf

Anderson, M.R. (1993). Success in distance education courses versus traditional classroom education courses. Unpublished doctoral dissertation, Oregon State University.

Aragon, S.R., Johnson, S.D., & Shaik, N. (2000). The influence of learning style preferences on student success in online vs. face-to-face environments. In G. Davies & C. Owen (Eds.), Proceedings of WebNet 2000—World Conference on WWW and Internet (pp. 17-22). San Antonio, TX: AACE.

Carr, S. (2000). As distance education comes of age, the challenge is keeping the students. Chronicle of Higher Education, 46(23). Retrieved May 10, 2004, from http://chronicle.com/free/v46/i23/23a00101.htm

Chan, M., Yum, J., Fan, R., Jegede, O., & Taplin, M. (1999). A comparison of the study habits and preferences of high achieving and low achieving Open University students. Proceedings from the Conference of the Asian Association of Open Universities. Retrieved May 10, 2004, from http://www.ouhk.edu.hk/cridal/papers/chanmsc.pdf

DeVellis, R.F. (2003). Scale development: Theory and applications. Thousand Oaks, CA: Sage.

Gorsuch, R.L. (1983). Factor analysis (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Guglielmino, P., & Guglielmino, L. (2003). Are your learners ready for e-learning? In G. Piskurich (Ed.), The AMA handbook of e-learning. New York: American Management Association.

Human Resource Research Organization (2004). Population representation in military services. Retrieved May 15, 2004, from http://www.humrro.org/poprep2002/contents/contents.htm

Kaufman, R., Watkins, R., & Guerra, I. (2002). Getting valid and useful educational results and payoffs: We are what we say, do, and deliver. International Journal of Educational Reform, 11(1), 77-92.

Kehoe, J. (1995). Basic item analysis for multiple-choice tests. Practical Assessment, Research & Evaluation, 4(10). Retrieved from http://ericae.net/pare/getvn.asp?v=4&n=10


Lim, S., Eberstein, A., & Waint, B. (2002). Unpublished student paper. Florida State University.

NCES (1999). Distance education technologies. Retrieved April 1, 2004, from http://nces.ed.gov/surveys/peqis/publications/2000013/6.asp

Pachnowski, L.M., & Jurczyk, J.P. (2000, February). Correlating self-directed learning with distance learning success. Proceedings from the Eastern Educational Research Association, Clearwater, FL.

Stephenson, J. (2003, April). A review of research and practice in e-learning in the workplace and proposals for its effective use. Proceedings from the Annual Meeting of the American Educational Research Association, April 21-25, 2003, Chicago, IL.

The Sloan Consortium (2004). Sizing the opportunity: The quality and extent of online education in the United States, 2002 and 2003. Retrieved April 1, 2004, from http://www.aln.org/resources/overview.asp

Watkins, R. (2003). Readiness for online learning self-assessment. In E. Biech (Ed.), The 2003 Pfeiffer annual: Training. San Francisco: Jossey-Bass-Pfeiffer.

Watkins, R. (2004). Ends and means: Defining success. Distance Learning Magazine, 1(2), 85-86.

Watkins, R., & Corry, M. (2005). E-learning companion: A student's guide to online success. New York: Houghton Mifflin.

RYAN WATKINS, Ph.D., is an associate professor at the George Washington University in Washington, DC. Ryan teaches courses in instructional design, distance education, needs assessment, and technology management. He is an author of the best-selling books E-learning Companion (2005) and 75 E-learning Activities (2005). Web: http://www.ryanrwatkins.com. E-mail: rwatkins@gwu.edu

DOUG LEIGH, Ph.D., is an assistant professor with Pepperdine University's Graduate School of Education in Los Angeles, CA, and previously served as technical director of projects for Florida State University. Coauthor of two books (Strategic Planning for Success, 2003, and Useful Educational Results, 2001), his ongoing research, publication, and lecture interests concern needs assessment, evaluation, change creation, and related topics. Web: http://www.dougleigh.com. E-mail: dleigh@pepperdine.edu

DON TRINER, Commander, USCG, has over 20 years of experience in applied leadership and learning. He is the prospective commanding officer of the MACKINAW, which will be the most technologically advanced ship in the Coast Guard when delivered. E-mail: DTriner@aol.com


Appendix

Respondents rate each statement using the following scale:
1 = Completely Disagree
2 = Strongly Disagree
3 = Not Sure
4 = Strongly Agree
5 = Completely Agree

Technology Access
1. I have access to a computer with an Internet connection.
2. I have access to a fairly new computer (e.g., enough RAM, speakers, CD-ROM).
3. I have access to a computer with adequate software (e.g., Microsoft Word, Adobe Acrobat).

Online Skills and Relationships
4. I have the basic skills to operate a computer (e.g., saving files, creating folders).
5. I have the basic skills for finding my way around the Internet (e.g., using search engines, entering passwords).
6. I can send an email with a file attached.
7. I think that I would be comfortable using a computer several times a week to participate in a course.
8. I think that I would be able to communicate effectively with others using online technologies (e.g., email, chat).
9. I think that I would be able to express myself clearly through my writing (e.g., mood, emotions, and humor).
10. I think that I would be able to use online tools (e.g., email, chat) to work on assignments with students who are in different time zones.
11. I think that I would be able to schedule time to provide timely responses to other students and/or the instructor.
12. I think that I would be able to ask questions and make comments in clear writing.

Motivation
13. I think that I would be able to remain motivated even though the instructor is not online at all times.
14. I think that I would be able to complete my work even when there are online distractions (e.g., friends sending emails or Websites to surf).
15. I think that I would be able to complete my work even when there are distractions in my home (e.g., television, children, and such).

Online Audio/Video
16. I think that I would be able to relate the content of short video clips (1-3 minutes typically) to the information I have read online or in books.
17. I think that I would be able to take notes while watching a video on the computer.
18. I think that I would be able to understand course-related information when it's presented in video formats.

Internet Discussions
19. I think that I would be able to carry on a conversation with others using the Internet (e.g., Internet chat, instant messenger).
20. I think that I would be comfortable having several discussions taking place in the same online chat even though I may not be participating in all of them.
21. I think that I would be able to follow along with an online conversation (e.g., Internet chat, instant messenger) while typing.
22. I sometimes prefer to have more time to prepare responses to a question.

Importance to your success
23. Regular contact with the instructor is important to my success in online coursework.
24. Quick technical and administrative support is important to my success in online coursework.
25. Frequent participation throughout the learning process is important to my success in online coursework.
26. I feel that prior experiences with online technologies (e.g., email, Internet chat, online readings) are important to my success with online courses.
27. The ability to immediately apply course materials is important to my success with online courses.
