
Marketing Education Review

ISSN: 1052-8008 (Print) 2153-9987 (Online) Journal homepage: https://www.tandfonline.com/loi/mmer20

Are Face-to-Face Classes More Effective Than Online Classes? An Empirical Examination

Gopala Ganesh, Audhesh Paswan & Qin Sun

To cite this article: Gopala Ganesh, Audhesh Paswan & Qin Sun (2015) Are Face-to-Face
Classes More Effective Than Online Classes? An Empirical Examination, Marketing Education
Review, 25:2, 67-81, DOI: 10.1080/10528008.2015.1029851

To link to this article: https://doi.org/10.1080/10528008.2015.1029851

Published online: 15 Jun 2015.

ARE FACE-TO-FACE CLASSES MORE EFFECTIVE THAN ONLINE CLASSES?
AN EMPIRICAL EXAMINATION
Gopala Ganesh, Audhesh Paswan, and Qin Sun

Using data from a unique undergraduate marketing math course offered in both traditional and online formats, this study looks at four dimensions of course evaluation: overall evaluation, perceived competence, perceived communication, and perceived challenge. Results indicate that students rate traditional classes better on all four dimensions. However, the relationships between the outcome variables (overall evaluation and perceived competence) and the controllable variables (perceived communication and challenge) remain consistent across both formats. Perceived competence and perceived communication are positively associated with overall evaluation of the course. While perceived challenge has no significant relationship with overall evaluation, it is positively associated with perceived competence.

Online classes and degree programs, as a component of distance-learning initiatives, have attracted a lot of attention from educational institutions, administrators, policymakers, and society at large. According to the results of the Pew Research Center's 2011 surveys, about 23 percent of college graduates have taken online courses, and of those, about 15 percent have earned a degree completely online. Viewed from the supply side, 91 percent of two-year colleges, 89 percent of four-year public colleges and universities, and 60 percent of four-year private schools offer online classes (Pew Research Center, 2011). The tremendous growth in online courses and programs could be attributed to the flexibility and convenience provided by this delivery format (Bristow, Humphreys, & Ziebell, 2011; Mintu-Wimsatt, 2001; Peltier, Drago, & Schibrowsky, 2003; Roberts, 1996), the advancement of information technology (Berger & Topol, 2001; Close, Dixit, & Malhotra, 2005; Hatfield & Taylor, 1998; Malhotra, Dixit, & Uslay, 2002), and the unique characteristics of adult and millennial-generation learners (Eastman & Swift, 2001; Wood, Solomon, & Allan, 2008).

Earlier studies on distance learning show that distance-learning classes (e.g., those using intranet, Internet, and one-way and two-way audio and video delivery methods) and programs were perceived as somewhat ineffective by both students and faculty (Bergmann & Dobie, 1999; Ponzurick, France, & Logar, 2000; Wayland, Swift, & Wilson, 1996), while more recent research indicates an increasing confidence in computer-mediated distance-learning formats (Peltier, Schibrowsky, & Drago, 2007; Wood, Solomon, & Allan, 2008). For example, Clarke, Flaherty, and Mottner (2001) found no significant difference between the effectiveness of online and offline courses. In addition, while 29 percent of the public believes that the value of online courses is equivalent to that provided by face-to-face courses, a much higher percentage of college presidents (51 percent) believes so (Pew Research Center, 2011). Some researchers even assert that online education could provide a superior learning experience for students (Peltier, Schibrowsky, & Drago, 2007; Sherif & Khan, 2005). At the very least, the online delivery of courses provides a viable and flexible option for college students, especially those with time, distance, family, career, or language constraints (Liu, Gomez, Khan, & Yen, 2007; Steiner & Hyman, 2010; Stewart, 2004). Notwithstanding such a rich literature base, the debate on the capability and effectiveness of the online format for delivering education relative to the traditional face-to-face format is still open. Empirical studies are needed to examine the potential influence of mixed teaching methods on learning effectiveness for marketing students (Ganesh, Sun, & Barat, 2010). Against this backdrop, in this paper we compare the face-to-face and online course delivery formats on various outcome and input dimensions, and explore the relationships among these dimensions.

Gopala Ganesh (Ph.D., University of Houston), University Distinguished Teaching Professor, College of Business, University of North Texas, Denton, TX, ganesh@unt.edu.
Audhesh Paswan (Ph.D., University of Mississippi), Professor of Marketing, College of Business, University of North Texas, Denton, TX, paswana@unt.edu.
Qin Sun (Ph.D., University of North Texas), Assistant Professor, College of Business Administration, Trident University International, Cypress, CA, qin.sun@trident.edu.

Marketing Education Review, vol. 25, no. 2 (summer 2015), pp. 67–81.
Copyright © 2015 Society for Marketing Advances
ISSN: 1052-8008 (print) / ISSN: 2153-9987 (online)
DOI: 10.1080/10528008.2015.1029851

Specifically, we focus on two perceived outcome dimensions (perceived competence and overall evaluation of the course) and two input dimensions (perceived challenge and perceived communication in the course).

The findings have important implications not just for the education sector, but also for public policymakers and society at large. Most states and educational institutions are facing an ever-increasing shortage of resources, and everyone in the education sector (e.g., educators, administrators, policymakers, and even students) is constantly looking for ways to increase efficiency without sacrificing quality or effectiveness (Clayson, 2009; Mintu-Wimsatt, 2001; Peltier, Drago, & Schibrowsky, 2003; Peltier, Schibrowsky, & Drago, 2007; Roberts, 1996; Wood, Solomon, & Allan, 2008). One solution that is attracting almost everyone's attention is the use of increasingly popular distance-learning technologies, especially given the rapid advancements in the Internet and information technologies (Berger & Topol, 2001; Clayson, 2009; Clarke, Flaherty, & Mottner, 2001; Close, Dixit, & Malhotra, 2005; Hatfield & Taylor, 1998; Malhotra, Dixit, & Uslay, 2002). Despite opposition from various quarters, it seems that distance-learning classes and programs are here to stay and flourish. One argument put forth by opponents is that simply delivering courses and programs in an online-only environment might compromise the quality of education. Hence, there is a need to better understand the differences and similarities between distance learning and face-to-face teaching. In addition, there is limited research on the relationships among the outcome and input variables (in this study, the overall evaluation of the course, perceived competence, perceived communication, and perceived challenge) from the students' perspective (Parayitam, Desai, & Phelps, 2007). We hope that this study contributes to that effort and helps us make more informed evaluations and decisions.

LITERATURE REVIEW

The literature on education in general, and business and marketing education in particular, is rich with studies focusing on the effectiveness of teaching effort and its antecedents. The effectiveness of face-to-face teaching has been found to be related to instructional design, instructors' subject knowledge, and instructors' capability to communicate clearly, create rapport with the students, and deliver the course enthusiastically (Paswan & Young, 2002; Sweeney, Morrison, Jarratt, & Heffernan, 2009). In comparison, the effectiveness of online courses is seen to be associated with factors such as student-to-student interactions, student-to-instructor interactions, instructor support and mentoring, lecture delivery quality, course content, and course structure (Clayson, 2009; Peltier, Drago, & Schibrowsky, 2003; Peltier, Schibrowsky, & Drago, 2007; Ponzurick, France, & Logar, 2000). In a face-to-face class, traditional instructors serve as the key information source for the students, whereas those who teach online courses are mainly seen as facilitators (Berger & Topol, 2001).

Teaching effectiveness refers to the extent to which students learn and gain knowledge in a specific course (Cohen, 1981). However, there seems to be some divergence in opinion about how to measure teaching effectiveness. One perspective focuses on perceived learning, perceived gain in competence, and the affective dimensions. In business schools, it is common to see educators and administrators using some version of the Student Evaluation of Teaching Effectiveness (SET) measurement scales (Clayson, 2009; Ganesh & Paswan, 2010; Paswan & Young, 2002) and student self-assessment of knowledge (Sitzmann, Ely, Brown, & Bauer, 2010). These measures of teaching effectiveness and learning seek to capture students' perceived competence, perceptions about the antecedent dimensions, and overall evaluation (i.e., the affective dimension) of the course (Sitzmann, Brown, Casper, Ely, & Zimmerman, 2008; Sitzmann, Ely, Brown, & Bauer, 2010). A second perspective focuses on objective indicators of teaching effectiveness and argues that measures such as test scores and GPA are valid, reliable, and hence better indicators of cognitive learning outcomes (Bacon & Bean, 2006; Flanagan, 2012; Kraiger, Ford, & Salas, 1993). Some even argue that the real test of teaching effectiveness is the extent to which students are able to use what they learn in a classroom at work or in life (Finch, Nadeau, & O'Reilly, 2013; Wymbs, 2011). Finally, a few have tried to reconcile these divergent perspectives by suggesting that teaching effectiveness is closely related to student learning outcomes (Cohen, 1981).

The biggest hurdle faced by these bridge builders is the inability to link the perceived or affective measures of teaching effectiveness, the objective measures of learning, and the subsequent use of knowledge to a particular student or a particular instructor.

The reasons for this difficulty include, to name a few, legal matters, issues of political correctness, and the nonlinear nature of how people acquire and use knowledge. We, too, faced the problem of being unable to link students' names, and hence their objective assignment scores, exam performance, and GPA, to their responses on the perceived and affective measures of teaching effectiveness. Having acknowledged this shortcoming, we next discuss the key variables and the various hypothesized relationships among them.

Overall Evaluation and Perceived Teaching Competence

Overall evaluation of a course is designed to test whether the students feel that they got what they should be getting from the course and whether they liked the course or not. Literature on technology adoption (Davis, 1989, 1993; Davis, Bagozzi, & Warshaw, 1989) and marketing (Darke, Chattopadhyay, & Ashworth, 2006; Ding, Eliashberg, Huber, & Saini, 2005) has amply demonstrated that consumers' overall or affective evaluation of a stimulus helps in technology or product adoption and is associated with their satisfaction with the stimulus. In the teaching evaluation field, most SET scales tend to include some measure of overall satisfaction with the course and/or the instructor (Wachtel, 1998). In this study, we look at this overall evaluation as a subjective measure of teaching effectiveness; it is defined as the learning value as perceived by students.

Our second focal construct capturing teaching effectiveness is perceived competence. It is defined as the students' feeling of mastering their class work (Harter, 1981; Pintrich & De Groot, 1990). Students with higher perceived competence have been found to be more intrinsically motivated and more likely to use self-regulated learning strategies than those with lower perceived competence (Young, 2005), leading to higher test performance in the former group (Bicen & Laverie, 2009). Furthermore, when students anticipate higher test grades, they tend to evaluate the course better (Clayson, 2009; Sweeney, Morrison, Jarratt, & Heffernan, 2009). In addition, based on the Technology Acceptance Model (TAM) literature that examines the relationship between level of competence and attitude (Davis, 1989, 1993; Davis, Bagozzi, & Warshaw, 1989), we argue that when students feel that they are becoming more competent in the subject matter, they are more likely to feel positive about the course and the teacher. As a result, we hypothesize a positive relationship between these two dimensions capturing perceived teaching effectiveness:

Hypothesis 1: Perceived competence will be positively associated with overall evaluation of the course.

We next discuss those important aspects of a classroom that the instructor can control in both online and face-to-face environments, and which influence either perceived competence or the overall evaluation of the course: perceived communication in the classroom and the perceived challenge of successfully completing the course. Identification of these constructs is in sync with the existing literature on self-regulated learning, intrinsic versus extrinsic motivation, and teaching effectiveness (Koon & Murray, 1995; Paswan & Young, 2002; Tang, 1997; Young, 2005).

Perceived Communication

Perceived communication refers to students' perceptions of a teacher's communication skills. Several studies in the education literature have established that perceived communication is a key determinant of effective teaching (Andrew, Cobb, & Giampietro, 2005; Clayson, 2009; Neves & Sanyal, 1991; Paswan & Young, 2002; Roberts & Becker, 1976; Sweeney, Morrison, Jarratt, & Heffernan, 2009). Open communication between professors and students could enhance students' perceived ease of use of a course and perceived usefulness of the course (Barat, Rajamma, Zolfagharian, & Ganesh, 2009). In addition, the interactions between students and professors enhance the students' knowledge acquisition, thus increasing their perceived competence in accomplishing the course tasks (Jeltova et al., 2007). Although some students have difficulty decoding the feedback from professors, which undermines the effectiveness of communication (Higgins, 2000) and thereby impedes students' perceived competence, most researchers have found positive relationships between student perceptions of classroom communication and perceived teaching effectiveness, as well as satisfaction with the instructor and the course (Parayitam, Desai, & Phelps, 2007; Paswan & Young, 2002). Thus, we hypothesize that the perceived communication level in the class will be positively associated with the overall evaluation of the course and the perceived competence gained through the course.

Hypothesis 2a: Perceived communication will be positively associated with overall evaluation of the course.

Hypothesis 2b: Perceived communication will be positively associated with perceived competence.

Perceived Challenge

Based on the extant literature, perceived challenge is defined as the students' assessment of the amount of work in the class and the extent to which the class is difficult. Literature on education effectiveness and perceived course challenge (Clayson, 2009; Parayitam, Desai, & Phelps, 2007; Paswan & Young, 2002) suggests that it may be directly associated with the perceived competence gained through the class but inversely associated with the overall evaluation of the course and the instructor. Literature on student evaluations of faculty suggests that more demanding courses have a negative effect on students' evaluation of the course and the faculty (Paswan & Young, 2002). Some have started questioning the notion of making classes easy in order to boost students' self-confidence and perceptions of competence (Clifford, 1990). Others have found that positive feedback provided by instructors results in a higher level of perceived competence, which increases students' comfort level and lowers their perception of challenge (Boggiano, Main, & Katz, 1988; Boggiano, Ruble, & Pittman, 1982). Taking these findings into account, we would expect students to evaluate the course less favorably if it is perceived to be more challenging, but at the same time, they may feel that they have become more competent.

Hypothesis 3a: Perceived challenge will be negatively associated with overall evaluation of the course.

Hypothesis 3b: Perceived challenge will be positively associated with perceived competence.

Course Delivery Format

Literature on online versus face-to-face teaching (Bergmann & Dobie, 1999; Clayson, 2009; Mintu-Wimsatt, 2001; Ponzurick, France, & Logar, 2000; Wayland, Swift, & Wilson, 1996) suggests that despite the popularity of the online format and technological developments in online education, most students still perceive face-to-face classes more positively than online classes, except perhaps on the convenience dimension (Mintu-Wimsatt, 2001; Peltier, Schibrowsky, & Drago, 2007; Ponzurick, France, & Logar, 2000). In addition, researchers have found that face-to-face classes are normally associated with a higher level of communication and rapport in the classroom (Sweeney, Morrison, Jarratt, & Heffernan, 2009).

By comparison, such associations are weak in an online, typically asynchronous, environment because of the relatively rare active classroom interaction between the instructor and students. However, complemented by technological tools such as discussion forums, electronic conferences, wikis, messaging, and e-mail, online courses can create effective interactive communities that facilitate and encourage experiential learning among students (Park, Crocker, Nussey, Springate, & Hutchings, 2010; Wood, Solomon, & Allan, 2008). In addition, online learning is enhanced by students having more time and flexibility to think, search, and then ask or respond to questions via resources such as a discussion board or online chat, while also being able to repeatedly review all the previous discussions and course materials, which are typically recorded (Eastman & Swift, 2001). Thus, experiential learning can be promoted by interactive online teaching, which conforms to the requirement of the Association to Advance Collegiate Schools of Business International (AACSB) that its members increase flexibility and responsiveness to student needs (AACSB, 2003; Hatfield & Taylor, 1998). Furthermore, the effectiveness of online courses has been linked with the level of interaction, instructor support and mentoring, lecture delivery quality, course content, and course structure (Clayson, 2009; Peltier, Drago, & Schibrowsky, 2003; Peltier, Schibrowsky, & Drago, 2007; Ponzurick, France, & Logar, 2000).

Summarizing the findings from these studies, we find that while the online medium is becoming very popular among many students, faculty members, and administrators, there are still some skeptics about its effectiveness in comparison with the tried and tested face-to-face teaching medium. It is possible that some of these negative feelings may be attributed to the initially weak online course delivery technologies. Recent developments have made great strides in making the online medium more attractive.

This is particularly relevant for a student body that literally lives with electronic gadgets, such as cell phones and tablet computers, and Internet resources such as text messages, Facebook, and Twitter, to name a few (Malhotra, Dixit, & Uslay, 2002; McGorry, 2006; Wymbs, 2011; Zahay & Fredricks, 2009). Given this somewhat inconclusive and conflicting evidence, we believe that while face-to-face classes may still be seen in a more positive light, the relationships between the teaching effectiveness dimensions (i.e., overall evaluation and perceived competence) and the controllable dimensions (i.e., perceived communication and perceived challenge) will be similar in both formats.

Hypothesis 4: The course delivery format (face-to-face versus online) will have no moderating effect on the relationship between the overall evaluation of the course and perceived competence.

Hypothesis 5a: The course delivery format (face-to-face versus online) will have no moderating effect on the relationship between overall evaluation of the course and perceived communication.

Hypothesis 5b: The course delivery format (face-to-face versus online) will have no moderating effect on the relationship between overall evaluation of the course and perceived challenge.

Hypothesis 6a: The course delivery format (face-to-face versus online) will have no moderating effect on the relationship between perceived competence and perceived communication.

Hypothesis 6b: The course delivery format (face-to-face versus online) will have no moderating effect on the relationship between perceived competence and perceived challenge.

The following section discusses the research method used to test the hypothesized relationships developed above.

RESEARCH METHOD

The research context for this study was an undergraduate marketing math course offered at a major public university in the southwestern United States. The course is designed to improve the students' quantitative skills and analytic thinking. It is a required course for undergraduate marketing majors and is usually taken after the mandatory Principles of Marketing core course. This highly structured course is offered in both face-to-face and online instructional formats that do not differ in cost to the student or in compensation to the professor, and enrollment is self-selected (although the room capacity–constrained face-to-face class typically fills up faster).

Description of the Two Instructional Formats

Data collection for this study was done over a period of four years during which both instructional formats were taught using an identical resource pool: a collection of 50 to 60 one-page minicases on various marketing math topics. In the face-to-face class, each topic was discussed by the professor while solving several relevant minicases on an overhead projector. A series of step-by-step discussion questions guided and encouraged wide participation by the students. In the online format, each topic was implemented as an enhanced PDF PowerPoint presentation. Several relevant minicases, the accompanying blank, professor-designed Excel worksheets, and their solution PDFs were hyperlinked within the presentation. One unique resource created for most of the cases, primarily for the benefit of the online class, was a Camtasia Audio+Video (A+V) recording in which the professor solved the case by hand on a writing tablet while explaining the steps. Some of the cases were also solved using Excel, and the process was recorded as an A+V. Since the Web site and A+Vs were available anyway, the face-to-face students were allowed access to a mirrored Web site, with content activated on a delayed basis, after coverage in class. This "after class, electronic tutor" was a special and unusual resource for the face-to-face class. The course and its implementation are described in detail elsewhere (Ganesh, Sun, & Barat, 2010). The online class was closely monitored by the professor to compensate for the absence of the typical face-to-face interaction, and it therefore consumed a lot of time to answer student clarification questions, which were unpredictable in terms of number, timing, and complexity. These had to be handled using online resources such as a voluntary-participation discussion board, which was quite busy during the semester.

All graded work that made up the 1,000-point semester total was identical in both instructional formats, required individual effort, and consisted of several components, namely, Excel homework assignments, comprehensive cases, and multiple-choice midterm and final exams, described briefly as follows:

1. Three Excel homework assignments, each with five cases and worth 100 points, in which students worked on professor-provided blank Excel worksheets to exactly match the professor's accessible PDF solution and submitted their hard-copy solution and formula sheets. Typically, three of the five cases were already solved by hand in class; professor-recorded Audio+Videos (A+Vs) were also made available for these. The remaining two were "challenge cases" not solved in class, for which the professor's blank Excel worksheet and its solution PDF were both provided but there was no accompanying A+V.

2. Comprehensive Case 1 (CC1, 100 points), in which students first designed and then solved their own Excel worksheet for a single case not solved in class, with no professor's PDF solution and only a few hint answers to guide them.

3. Comprehensive Case 2 (CC2, 100 points), in which students designed a PowerPoint presentation for the CC1 case, to detailed specifications, using the completed CC1 worksheet.

4. A multiple-choice Midterm Exam (200 points).

5. A multiple-choice Final Exam (300 points).

For all non-exam assignments, students were allowed to help each other through study teams or the discussion board, with the requirement that they use the current semester's materials and declare in writing that they did not copy someone else's work. In CC1, they were allowed to consult each other in solving the case by hand, but after that, they were required to design their own worksheet. Likewise, they could discuss the specifics of the CC2 charts with other students, but their finished presentations had to look sufficiently unique. The online students came to campus for face-to-face exams on Saturdays and were required to submit their graded assignments either to a drop-box on campus or by mail or courier to the professor.

Data Collection

Data for this research were collected using an online survey between 2003 and 2007, when both instructional formats were in their early years. Students were offered a 0.5 percent course grade incentive for their voluntary participation (self-identified through their voluntary response at a different Web site). This made little difference in the grade outcome. Of the 1,104 students who responded (a response rate of 90 percent), 1,038 identified the instructional format of their class. About 40 percent, or 428 respondents, identified themselves as being from the face-to-face class, while the remaining 60 percent, or 610, were from the online class. We would like the reader to note that both face-to-face and online sections were taught using identical materials, assignments, and multiple-choice, computer-graded exams, all prepared by the professor who designed the course. While that professor taught all the face-to-face sections and some of the online sections, a doctoral student TA assigned to the professor was the instructor of record for the remaining online classes. Even for those online classes, however, the professor took care of responses to discussion board posts by students, set up the grading criteria, and carefully supervised the TA's grading of the assignments to ensure maximum consistency between the two instructional formats.

ANALYSIS AND FINDINGS

Tables 1 and 2 present a comparison of the students who took the class in the face-to-face and online formats. Other than gender and self-reported cumulative GPA (CGPA), which were similar across the two class formats, the significant differences are expected and make sense. For instance, a greater proportion of the face-to-face students lived in the university town. A much higher percentage of marketing students, for whom this was a mandatory class, took it face-to-face. Online students were much more dependent on the Web site for the class and hence checked e-mail and the discussion board much more frequently than the face-to-face students. The online students were also older, lived farther from the university town, and had greater experience with online classes than the face-to-face students. While self-reported CGPA did not differ across the two classes, the face-to-face students ended the semester with a marginally higher weighted test score percentage and semester grade percentage than the online students. However, the Cohen's d statistic shows that the effect sizes for the metric-scaled data are at best modest ("small to medium"), with the exception of experience with online classes (medium).

Table 1
Sample Characteristics of the Two Instructional Format Groups

                                               Instructional format
                                               f2f                 Online
                                               n         %         n         %
Gender
  Female                                       204       49%       330       55%
  Male                                         210       51%       270       45%
  Total                                        414       100%      600       100%
Home zip code
  University town                              246       61%       260       44%
  Outside university town                      160       39%       336       56%
  Total                                        406       100%      596       100%
Grouping by major
  Marketing                                    322       80%       326       56%
  Other                                        80        20%       258       44%
  Total                                        402       100%      584       100%
How students decided on class format
  Department Web page                          27        8%        25        5%
  Another student recommended                  57        17%       58        11%
  None of the listed reasons                   196       58%       201       39%
  Other                                        61        18%       236       45%
  Total                                        341       100%      520       100%
Frequency of checking e-mail from professor
  Once a day or more                           149       36%       295       49%
  Less than once a day                         262       64%       312       51%
  Total                                        411       100%      607       100%
Frequency of checking discussion board
  Once a day or more                           114       28%       256       42%
  Less than once a day                         296       72%       352       58%
  Total                                        410       100%      608       100%

Notes. f2f = face-to-face. Except for gender (p = 0.073), all differences are significant at p = 0.0005.

The authors acknowledge that the main reason for the data collection was to help fine-tune the online and face-to-face classes to make them as similar as possible in terms of course content. Therefore, the instrument included all relevant scale items that were motivated by the constructs and their specific measurement items typically found in the most commonly used faculty evaluation instruments, such as SET.

Data for these scale items were first subjected to an iterative exploratory factor analysis procedure. This resulted in a satisfactory and clean factor structure containing 22 items loading on 5 factors (i.e., a significant Bartlett's test, a KMO [Kaiser-Meyer-Olkin] measure of 0.94, and a cumulative explained variance of 74.76 percent). These five factors were labeled perceived competence, perceived communication, overall evaluation, perceived challenge, and satisfaction with the face-to-face or Internet format (see Table 3 for the rotated factor structure, percentage of variance explained, and the α scores). A very similar structure resulted when the data were analyzed separately by instructional format. The five factors exhibited satisfactory internal consistency, with most of the Cronbach's αs (also in Table 3) being well above 0.6 (Churchill, 1979). Furthermore, the scale items generally had higher inter-item correlations within factors than across factors, indicating acceptable levels of convergent and discriminant validity (Churchill, 1995). In addition, a look at the correlations (computed using composite factor scores; see Table 4) indicates that, in general, the interconstruct correlations were less than the corresponding α scores. This suggests that the internal consistency among items measuring a construct is greater than the association between constructs, providing further evidence of convergent and discriminant validity. This encouraged us to proceed further using the composite scores for each construct.

Table 5 compares the two instructional formats on the five factors. For all five factors, the face-to-face group has significantly higher means than the online group, indicating that the traditional class is seen in a more positive light. However, the Cohen's d scores capturing the effect sizes were at best small to medium.

To test the hypothesized relationships, six multiple regression analyses were conducted using the composite scores for the five factors, divided into two sets of three regressions each. The first set specified overall evaluation as the dependent variable and perceived competence, perceived communication, and perceived challenge as independent variables (Table 6, Models 1a, 1b, and 1c), while the second specified perceived competence as the dependent variable and perceived communication and perceived challenge as independent variables (Table 7, Models 2a, 2b, and 2c). All six regressions included expected grade in the class, CGPA, and satisfaction with format as control variables.
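The article does not publish its data or analysis scripts, so the following is only a minimal sketch of how the two regression specifications described above could be set up. The column names (overall_eval, competence, communication, challenge, expected_grade, cgpa, format_satisfaction, fmt) are hypothetical stand-ins for the composite factor scores and control variables, not the authors' variable names.

```python
# Illustrative sketch only; assumes a hypothetical DataFrame `df` with one row
# per respondent and the columns named in the lead-in above.
import pandas as pd
import statsmodels.formula.api as smf

def fit_models(df: pd.DataFrame):
    """Fit the two regression specifications described in the text."""
    # Model 1: overall evaluation regressed on the three perception factors
    # plus the three controls.
    m1 = smf.ols(
        "overall_eval ~ competence + communication + challenge"
        " + expected_grade + cgpa + format_satisfaction",
        data=df,
    ).fit()
    # Model 2: perceived competence regressed on the two input dimensions
    # plus the three controls.
    m2 = smf.ols(
        "competence ~ communication + challenge"
        " + expected_grade + cgpa + format_satisfaction",
        data=df,
    ).fit()
    return m1, m2
```

Fitting each model on the pooled sample and again on the two format subsamples (for example, df[df.fmt == "f2f"] and df[df.fmt == "online"]) yields the residual sums of squares (the .ssr attribute of each fitted model) that feed the Chow tests discussed below.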
74 Marketing Education Review

Table 2
Demographics and Academic Performance of the Two Instructional Format Groups

                                               f2f                Online             Cohen's d
                                               Mean      n        Mean      n        Value    Explanation
Age                                            24.58     412      25.68     597      0.27     small to medium
Miles home from university town                12.15     394      20.94     595      0.37     small to medium
Number of WebCT classes taken                  1.81      394      3.30      595      0.51     medium
Self-reported cumulative GPA (CGPA)            3.12      405      3.11      596      0.02     very small
Earned semester % grade (max = 100)            83.1      443      80        615      0.25     small to medium
Weighted test score %: Midterm + Final Exam    70.61     337      68.75     548      0.14     small

Notes. f2f = face-to-face. Except for CGPA, all differences are significant at p = 0.0005.
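The Cohen's d values in Tables 2 and 5 can only be reproduced from the raw data, which are not published, and the article does not state which variant of the statistic it used. As a point of reference, the sketch below implements the common pooled-standard-deviation formulation; the simulated arrays exist purely to make the function runnable and are not the study's data.

```python
# Minimal sketch of a pooled-standard-deviation Cohen's d (one common variant;
# the article does not specify which formulation it used).
import numpy as np

def cohens_d(x: np.ndarray, y: np.ndarray) -> float:
    """Difference in means divided by the pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

# Simulated illustration: the group means mirror the Age row of Table 2, but the
# standard deviations are invented because the article does not report them.
rng = np.random.default_rng(0)
f2f_age = rng.normal(24.58, 4.0, 412)
online_age = rng.normal(25.68, 4.0, 597)
print(round(abs(cohens_d(f2f_age, online_age)), 2))
```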

Table 6 shows the results of set 1 (overall evaluation as the dependent variable). The β weights (Model 1a) provide support for H1 and H2a but not H3a. Perceived competence and perceived communication are significantly and positively associated with overall evaluation, while perceived challenge is not. Of the three control variables, expected grade and satisfaction with the format of the class are significant, while CGPA is not. Based on the β weights, perceived competence has a stronger impact on overall evaluation than perceived communication.

Examination of the regression of perceived competence (Model 2a) provides support for H2b and H3b. Perceived communication and perceived challenge both significantly and positively influence competence. Of these, perceived communication has the stronger effect on perceived competence gained in the class compared with perceived challenge. Of the three control variables, expected grade and satisfaction with format are significant, whereas CGPA is not.

To test H4 and H5, the regression model using overall evaluation as the dependent variable was run two more times, once for each instructional format (results shown in Table 6, Models 1b and 1c). The resulting three estimates of the residual sum of squares (total plus each instructional format) were used to compute the Chow statistic (1.41, df = 6 and 822, p value > 0.05), showing that the relationships between perceived competence, perceived communication, perceived challenge, and overall evaluation are similar between the two samples representing the face-to-face and online class formats. A closer look at the β weights from Models 1a, 1b, and 1c indicates that the pattern is consistent across all three regressions. This provides clear support for H4, H5a, and H5b.

Finally, H6 was tested in a similar manner. The regression model using perceived competence as the dependent variable was run two more times, once for each instructional format (results shown in Table 7, Models 2b and 2c), and the Chow statistic was computed using the residual sums of squares for the combined sample (Model 2a), the face-to-face sample (Model 2b), and the online sample (Model 2c). The Chow statistic (1.99, df = 5 and 827, p value > 0.05) indicates that the relationships between perceived communication, perceived challenge, and perceived competence are consistent across the two samples representing the face-to-face and online class formats. Thus, H6a and H6b were supported.

Again, the β weights for all three control variables in their relationships with overall evaluation as the dependent variable were consistent across the two subsamples, that is, the face-to-face and online class formats (Models 1b and 1c in Table 6). However, when we used perceived competence as the dependent variable, the β weights associated with expected grade were not consistent across these two samples, whereas the β weights associated with CGPA and satisfaction with the format were consistent (Models 2b and 2c in Table 7).
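Because Tables 6 and 7 report the residual sums of squares for the pooled and subsample regressions, the two Chow statistics quoted above can be recomputed directly. The sketch below does so, following the degrees of freedom as reported in the article (k taken as the number of predictors, and n1 + n2 - 2k for the denominator).

```python
# Recomputing the reported Chow statistics from the residual sums of squares
# (RSS) in Tables 6 and 7. Degrees of freedom follow the article's reported
# values: numerator df = k (number of predictors), denominator df = n1 + n2 - 2k.
from scipy.stats import f as f_dist

def chow_f(rss_pooled, rss_1, rss_2, k, n1, n2):
    """Chow test F statistic for coefficient equality across two subsamples."""
    numerator = (rss_pooled - (rss_1 + rss_2)) / k
    denominator = (rss_1 + rss_2) / (n1 + n2 - 2 * k)
    return numerator / denominator

# Model 1 (DV = overall evaluation), Table 6 values:
f1 = chow_f(245.496, 92.451, 150.538, k=6, n1=333, n2=501)
# Model 2 (DV = perceived competence), Table 7 values:
f2 = chow_f(1914.138, 670.764, 1220.569, k=5, n1=335, n2=502)

print(round(f1, 2), round(f_dist.sf(f1, 6, 822), 3))  # about 1.41, p > 0.05
print(round(f2, 2), round(f_dist.sf(f2, 5, 827), 3))  # about 1.99, p > 0.05
```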
Table 3
Rotated Factor Structure (Varimax-Rotated Principal Components)

Construct and scale item (loading)

Perceived communication
  WebCT e-mail satisfactory communication method with professor (0.84)
  Professor responded to e-mail and Discussion Area postings in timely fashion (0.81)
  Content and materials on WebCT easy to access, review (0.80)
  WebCT Discussion Area effective for clarifications (0.77)
  Course content and materials on WebCT easy to follow, understand (0.74)
  Requirements for graded assignments explained well (0.71)
  Opportunity for clarification of exams, assignments adequate (0.62)

Perceived competence
  More confident "working case numbers" (0.83)
  More confident using math in marketing (0.82)
  Improved ability to approach methodically (0.82)
  Taught tools for marketing decisions (0.79)
  Skills would be useful for life (0.79)
  More confident using spreadsheets (0.74)

Overall evaluation
  M&M learning value vs. other marketing classes (0.78)
  M&M learning value vs. other UNT classes (0.75)
  Take M&M if not elective (0.70)
  M&M pushed me to peak performance compared to other classes (0.65)

Perceived challenge
  More work than other marketing classes (0.91)
  More challenging than other business classes (0.88)
  More work than other business classes (0.87)

Satisfaction with the format
  Will take another class from professor in same format (0.92)
  No regrets about format chosen (0.54)

Percentage of variance explained (post-rotation), components 1-5: 45.89, 10.46, 9.35, 4.93, 4.13
Coefficient α score, components 1-5: 0.84, 0.94, 0.84, 0.88, 0.60

Note. M&M = marketing math course.
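For readers who want to produce this kind of output themselves, one open-source route is sketched below. It is illustrative only and not the software the authors used: items is a hypothetical data frame holding the 22 scale items as columns, and the package's default extraction method is used, so the loadings would not match Table 3 exactly.

```python
# Illustrative sketch using the open-source factor_analyzer package: Bartlett's
# test, the KMO measure, and a varimax-rotated five-factor solution of the kind
# summarized in Table 3. `items` is a hypothetical DataFrame of the 22 scale items.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

def rotated_solution(items: pd.DataFrame, n_factors: int = 5):
    chi_square, p_value = calculate_bartlett_sphericity(items)  # should be significant
    _, kmo_overall = calculate_kmo(items)                       # article reports 0.94
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    _, _, cumulative = fa.get_factor_variance()                 # cumulative variance explained
    return chi_square, p_value, kmo_overall, loadings, cumulative[-1]
```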

Table 4
Correlation Between Factors

                              Overall      Perceived    Perceived       Perceived   Satisfaction
                              evaluation   competence   communication   challenge   with the format
Overall evaluation            0.84
Perceived competence          0.683        0.94
Perceived communication       0.508        0.596        0.84
Perceived challenge           0.175        0.278        0.302           0.88
Satisfaction with the format  0.492        0.512        0.512           0.154       0.66

Notes. All correlations are significant at the 0.01 level (two-tailed). Diagonal elements are coefficient α scores.
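The diagonal entries of Table 4 (and the α row of Table 3) are coefficient alpha values. A minimal sketch of the standard item-variance formula is shown below; items is a hypothetical respondents-by-items array for one construct, since the study's raw item data are not published.

```python
# Minimal sketch of coefficient (Cronbach's) alpha using the standard
# item-variance formula: alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array with one row per respondent and one column per item."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_score_variance)
```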

Table 5
Comparison of Face-to-Face and Online Class Formats on Key Variables

                              f2f                 Online              Cohen's d
                              Mean      n         Mean      n         Value    Explanation
Overall evaluation            3.17      412       2.90      600       0.35     small to medium
Perceived competence          7.52      413       6.99      599       0.27     small to medium
Perceived communication       7.92      414       7.34      601       0.33     small to medium
Perceived challenge           7.57      412       7.15      601       0.22     small
Satisfaction with the format  7.13      347       5.95      516       0.47     medium

Notes. f2f = face-to-face. Means for all five constructs differ between the two groups at p ≤ 0.01.

DISCUSSION

The goal of this study was to compare the two class formats, face-to-face and online, on two outcome dimensions and two controllable input dimensions, as well as the relationships between these two sets of constructs. Data were collected from a required marketing math class offered in both instructional formats, and the results indicate that the traditional class is still perceived more favorably than the online class on all four dimensions. In addition, the results indicate that perceived competence and perceived communication are positively associated with the overall evaluation of the class. Likewise, perceived communication and perceived challenge are positively associated with perceived competence. Surprisingly, perceived challenge did not have any influence on the overall evaluation, which implies that some students may not really dislike being challenged.

These seemingly contradictory results are really not at odds with one another. The more favorable perception of face-to-face classes on various dimensions may be due to the fact that we are still social animals and like meeting other people. This may explain why we do not all shop for everything in our pajamas from our bedrooms and instead like to visit stores and malls where we can interact with human beings. However, this does not take away from the fact that as retailers enhance the level of interaction, the feeling of security, and other factors typically associated with brick-and-mortar stores on their online retailing Web sites, consumers are increasingly moving to online retailing environments. Extending this analogy to a classroom setting, the results of this study suggest that the relationships between the instructor-controllable variables (perceived communication and perceived challenge) and the outcome variables (perceived competence and overall evaluation) remain consistent across both formats. In other words, if the instructor uses the technology to ensure similar levels of perceived communication and perceived challenge across face-to-face and online classes, it is likely to result in a similar enhancement of perceived competence. In turn, positive student perceptions of these three variables (perceived communication and challenge, and the resultant perceived competence) will lead to an enhanced evaluation of an online class, as they would for a face-to-face class.
Table 6
Multiple Regression Results with Overall Evaluation as the Dependent Variable (DV)
Model 1: DV = Overall evaluation

(a) Combined sample (n = 834)
Independent variable        Unstd. β    Std. β     t         Sig.
(Constant)                  0.530                  2.97      0.00
Perceived competence        0.187       0.489      15.04     0.00
Perceived communication     0.055       0.129      4.01      0.00
Perceived challenge         0.007       0.018      0.70      0.48
Expected M&M grade          0.149       0.150      5.72      0.00
Cumulative GPA              –0.029      –0.015     –0.60     0.55
Satisfaction with format    0.043       0.143      4.81      0.00
R = 0.72; R² = 0.52; adjusted R² = 0.52; sum of squared residuals = 245.496

(b) f2f only (n = 333)
Independent variable        Unstd. β    Std. β     t         Sig.
(Constant)                  0.365                  1.265     0.207
Perceived competence        0.206       0.527      9.999     0.000
Perceived communication     0.048       0.105      2.062     0.040
Perceived challenge         0.019       0.045      1.084     0.279
Expected M&M grade          0.172       0.169      4.185     0.000
Cumulative GPA              –0.029      –0.015     –0.387    0.699
Satisfaction with format    0.039       0.116      2.492     0.013
R = 0.732; R² = 0.536; adjusted R² = 0.527; sum of squared residuals = 92.451

(c) Online only (n = 501)
Independent variable        Unstd. β    Std. β     t         Sig.
(Constant)                  0.702                  3.062     0.002
Perceived competence        0.177       0.475      11.228    0.000
Perceived communication     0.057       0.142      3.358     0.001
Perceived challenge         –0.002      –0.005     –0.154    0.878
Expected M&M grade          0.133       0.141      3.954     0.000
Cumulative GPA              –0.034      –0.018     –0.545    0.586
Satisfaction with format    0.039       0.138      3.519     0.000
R = 0.701; R² = 0.492; adjusted R² = 0.486; sum of squared residuals = 150.538

Chow test statistic: 1.41, F distributed with 6 and 822 degrees of freedom; not significant at α = 0.01 or 0.05.
Notes. f2f = face-to-face; M&M = marketing math.

Table 7
Multiple Regression Results with Perceived Competence as the Dependent Variable (DV)
Model 2: DV = Perceived competence

(a) Combined sample (n = 837)
Independent variable        Unstd. β    Std. β     t         Sig.
(Constant)                  0.327                  0.659     0.510
Perceived communication     0.453       0.41       12.957    0.000
Perceived challenge         0.166       0.16       5.749     0.000
Expected M&M grade          0.385       0.15       5.412     0.000
Cumulative GPA              –0.093      –0.02      –0.696    0.486
Satisfaction with format    0.194       0.25       8.129     0.000
R = 0.673; R² = 0.453; adjusted R² = 0.450; sum of squared residuals = 1914.138

(b) f2f only (n = 335)
Independent variable        Unstd. β    Std. β     t         Sig.
(Constant)                  –0.110                 –0.142    0.887
Perceived communication     0.506       0.430      9.013     0.000
Perceived challenge         0.153       0.145      3.366     0.001
Expected M&M grade          0.173       0.066      1.572     0.117
Cumulative GPA              0.034       0.007      0.166     0.868
Satisfaction with format    0.245       0.288      6.256     0.000
R = 0.697; R² = 0.486; adjusted R² = 0.478; sum of squared residuals = 670.764

(c) Online only (n = 502)
Independent variable        Unstd. β    Std. β     t         Sig.
(Constant)                  0.551                  0.846     0.398
Perceived communication     0.422       0.388      9.433     0.000
Perceived challenge         0.172       0.165      4.586     0.000
Expected M&M grade          0.510       0.201      5.491     0.000
Cumulative GPA              –0.183      –0.036     –1.034    0.302
Satisfaction with format    0.172       0.225      5.576     0.000
R = 0.654; R² = 0.428; adjusted R² = 0.422; sum of squared residuals = 1220.569

Chow test statistic: 1.99, F distributed with 5 and 827 degrees of freedom; not significant at α = 0.01 or 0.05.
Notes. f2f = face-to-face; M&M = marketing math.

Clearly, the findings suggest that, without exception, these relationships hold regardless of the instructional format of the class: the format (face-to-face versus online) has no significant moderating impact on the relationship between overall evaluation and factors such as perceived competence, perceived communication, and perceived challenge. Furthermore, it also has no moderating influence on the relationships between perceived competence and factors such as perceived communication and perceived challenge.

The nonsignificant relationship between perceived challenge and overall evaluation of the course (irrespective of the delivery format) could be explained as follows. First, the focal course is a math-based marketing course, and the incoming students taking this class know that it will be a "hot potato" (through formal and informal modes of communication); hence the challenge perception may not have any effect on their overall evaluation of the course. A second explanation could be that students expect a challenging course to make them more competent and may not let the level of difficulty and challenge influence their overall evaluation of the course.

These findings are consistent with existing studies (e.g., Clayson, 2009; Peltier, Drago, & Schibrowsky, 2003; Peltier, Schibrowsky, & Drago, 2007; Ponzurick, France, & Logar, 2000) and have important implications for educators and administrators. Irrespective of the course delivery format, the level of perceived communication and interaction needs to be high enough to obtain a favorable overall evaluation or to develop a perception of competence among students. Furthermore, while making a course perceptibly more challenging apparently has no direct effect on the overall evaluation of the course, it does have a significant positive effect on perceived competence, which in turn has a strong positive effect on the overall evaluation of the class. Therefore, making the class less challenging may be detrimental to everyone concerned. The results also suggest that while face-to-face classes tend to be perceived as outperforming online classes on various dimensions, it is possible to achieve similar levels of perceived competence among students and obtain similar evaluations in both face-to-face and online class formats, as long as the instructor keeps the perceived communication and perceived challenge similar in both formats. Consequently, the findings in this study extend the previous work comparing traditional and online course formats and, especially, help fill the gap in evaluating student perceptions of the two learning formats (Bristow, Humphreys, & Ziebell, 2011).

LIMITATIONS AND IMPLICATIONS

A key limitation of this study is its focus on a single course at a single university. Further research may be needed to apply the findings of this study to other settings. For instance, would similar findings result if the course were not a rigidly structured undergraduate marketing math class? Would a graduate-level class taught using the two formats produce similar results? In addition, in this study we focused only on three key determinants of the overall course evaluation (perceived competence, perceived communication, and perceived challenge) while controlling for variables such as satisfaction with the course format, cumulative GPA, and expected grade in the focal course. It would be naive for us to presume that these constructs and variables form an exhaustive set of variables affecting the overall course evaluation. Future studies should explore other related variables, such as student experience with both the online and face-to-face formats, and investigate other boundary conditions. We did not collect information on many other background variables, such as marital status, children in the home, or employment status, and we recognize that these could influence attitude toward the format of the class, especially for online students. There is also a need to define and examine teaching efficiency (as opposed to teaching effectiveness), which was not included as a variable in this research.

There is, in general, a lack of research attention on teaching efficiency in the extant literature. Moreover, additional research is needed with a focus on more objective and cognitive measures of teaching effectiveness, such as GPA, given the possible self-reporting bias of students relative to learning outcomes. With the advancement of information technology, online universities increasingly utilize multimedia and live conferences to emulate the traditional classroom. There is a need to investigate how these technological developments could improve the effectiveness and efficiency of online teaching and learning. As a result, additional research is necessary to keep track of the changing perceptions of students and faculty with respect to the two learning platforms. We hope that these limitations provide the impetus for future investigations in this area, which is probably going to change the face of academia.

REFERENCES

Andrew, M. D., Cobb, C. D., & Giampietro, P. J. (2005). Verbal ability and teacher effectiveness. Journal of Teacher Education, 56(4), 343–354.
Association to Advance Collegiate Schools of Business (AACSB) International. (2003). Eligibility procedures and standards for business accreditation. St. Louis, MO: AACSB International.
Bacon, D. R., & Bean, B. (2006). GPA in research studies: An invaluable but neglected opportunity. Journal of Marketing Education, 28(1), 35–42.
Barat, S., Rajamma, R. K., Zolfagharian, M. A., & Ganesh, G. (2009). Student course perceptions: A perceived-ease-of-use-perceived-usefulness framework. Marketing Education Review, 15(Winter), 35–45.
Berger, K. A., & Topol, M. T. (2001). Technology to enhance learning: Use of a Web site platform in traditional classes and distance learning. Marketing Education Review, 11(3), 15–26.
Bergmann, M., & Dobie, K. (1999). The interactive video learning environment and teacher evaluation: Teachers may never have to leave their campus but they must "go the extra mile." Marketing Education Review, 9(1), 21–28.
Bicen, P., & Laverie, D. A. (2009). Group-based assessment as a dynamic approach to marketing education. Journal of Marketing Education, 31(2), 96–108.
Boggiano, A. K., Main, D. S., & Katz, P. A. (1988). Children's preference for challenge: The role of perceived competence and control. Journal of Personality and Social Psychology, 54(1), 134–141.
Boggiano, A. K., Ruble, D. N., & Pittman, T. S. (1982). The mastery hypothesis and the overjustification effect. Social Cognition, 1, 38–49.
Bristow, D., Shepherd, C. D., Humphreys, M., & Ziebell, M. (2011). To be or not to be: That isn't the question! An empirical look at online versus traditional brick-and-mortar courses at the university level. Marketing Education Review, 21(3), 241–250.
Churchill, G. A. (1979). A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16(February), 64–73.
Churchill, G. A. (1995). Marketing research: Methodological foundations. Hinsdale, IL: Dryden Press.
Clarke, I., III, Flaherty, T. B., & Mottner, S. (2001). Student perceptions of educational technology tools. Journal of Marketing Education, 23(December), 169–177.
Clayson, D. E. (2009). Student evaluations of teaching: Are they related to what students learn? A meta-analysis and review of the literature. Journal of Marketing Education, 31(1), 16–30.
Clifford, M. M. (1990). Students need challenge, not easy success. Educational Leadership, 48(1), 22–26.
Close, A. G., Dixit, A., & Malhotra, N. K. (2005). Chalkboards to cybercourses: The Internet and marketing education. Marketing Education Review, 15(2), 81–94.
Cohen, P. A. (1981). Student ratings of instruction and student achievement: A meta-analysis of multi-section validity studies. Review of Educational Research, 51, 281–309.
Darke, P. R., Chattopadhyay, A., & Ashworth, L. (2006). The importance and functional significance of affective cues in consumer choice. Journal of Consumer Research, 33(3), 322–328.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Quarterly, 13, 319–340.
Davis, F. D. (1993). User acceptance of information technology: System characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies, 38, 475–487.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35, 982–1003.
Ding, M., Eliashberg, J., Huber, J., & Saini, R. (2005). Emotional bidders: An analytical and experimental examination of consumers' behavior in a priceline-like reverse auction. Management Science, 51(3), 352–364.
Eastman, J. K., & Swift, C. O. (2001). New horizons in distance education: The online learner-centered marketing class. Journal of Marketing Education, 23(1), 25–34.
Finch, D., Nadeau, J., & O'Reilly, N. (2013). The future of marketing education: A practitioner's perspective. Journal of Marketing Education, 35(1), 54–67.
Flanagan, J. L. (2012). Online versus face-to-face instruction: Analysis of gender and course format in undergraduate business statistics courses. Academy of Business Journal, 2(4), 89–98.
Ganesh, G., & Paswan, A. K. (2010). Teaching basic marketing accountability using spreadsheets: An exploratory perspective. Journal of Business Research, 63(2), 182–190.
Ganesh, G., Sun, Q., & Barat, S. (2010). Improving the marketing math skills of marketing undergraduate students through a unique undergraduate marketing math course. Marketing Education Review, 20(1), 47–63.
Harter, S. (1981). A new self-report scale of intrinsic versus extrinsic orientation in the classroom: Motivational and informational components. Developmental Psychology, 17, 300–312.
Hatfield, L., & Taylor, R. K. (1998). Making business schools responsive to customers: Lessons learned and actions. Marketing Education Review, 8(2), 1–8.
Higgins, R. (2000). Be more critical: Rethinking assessment feedback. In Proceedings of the Conference of the British Educational Research Association, Cardiff University, Wales, September.
Jeltova, I., Birney, D., Fredine, N., Jarvin, L., Sternberg, R. J., & Grigorenko, E. L. (2007). Dynamic assessment as a process-oriented assessment in educational settings. Advances in Speech-Language Pathology, 9, 271–285.
Koon, J., & Murray, H. G. (1995). Using multiple outcomes to validate student ratings of overall teacher effectiveness. The Journal of Higher Education, 66(1), 61–81.
Kraiger, K. J., Ford, K., & Salas, E. (1993). Application of cognitive, skill-based and affective theories of learning outcomes to new methods of training evaluation. Journal of Applied Psychology, 78, 311–328.

Liu, S., Gomez, J., Khan, B., & Yen, C.-J. (2007). Toward a learner-oriented community college online course dropout framework. International Journal on E-Learning, 6(December), 519–542.
Malhotra, N. K., Dixit, A., & Uslay, C. (2002). Integrating Internet technology in marketing research education. Marketing Education Review, 12(3), 25–34.
McGorry, S. Y. (2006). Data in the palm of your hand. Marketing Education Review, 16(3), 83–90.
Mintu-Wimsatt, A. (2001). Traditional vs. technology-mediated learning: A comparison of students' course evaluations. Marketing Education Review, 11(2), 63–73.
Neves, J. S., & Sanyal, R. N. (1991). Classroom communication and teaching effectiveness: The foreign-born instructor. Journal of Education for Business, 66(5), 304–308.
Parayitam, S., Desai, K., & Phelps, L. (2007). The effect of teacher communication and course content on student satisfaction and effectiveness. Academy of Educational Leadership Journal, 11(3), 91–105.
Park, C. L., Crocker, C., Nussey, J., Springate, J., & Hutchings, D. (2010). Evaluation of a teaching tool-wiki in online graduate education. Journal of Information Systems Education, 21(3), 313–321.
Paswan, A., & Young, J. A. (2002). A causal analysis of process variables in student evaluation of instructors. Journal of Marketing Education, 24(2), 193–202.
Peltier, J. W., Drago, W., & Schibrowsky, J. A. (2003). Virtual communities and the assessment of online marketing education. Journal of Marketing Education, 25(3), 260–276.
Peltier, J. W., Schibrowsky, J. A., & Drago, W. (2007). The interdependence of the factors influencing the perceived quality of the online learning experience: A causal model. Journal of Marketing Education, 29(2), 140–153.
Pew Research Center. (2011). The digital revolution and higher education. Retrieved from http://pewsocialtrends.org/files/2011/08/online-learning.pdf
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40.
Ponzurick, T. G., Russo France, K., & Logar, C. M. (2000). Delivering graduate marketing education: An analysis of face-to-face versus distance education. Journal of Marketing Education, 22(3), 180–187.
Roberts, C. L., & Becker, S. L. (1976). Communication and teaching effectiveness in industrial education. American Educational Research Journal, 13(3), 181–197.
Roberts, J. (1996). The story of distance education: A practitioner's perspective. Journal of the American Society for Information Science, 47, 811–816.
Sherif, J., & Khan, R. (2005). Role and relevance of Web supported learning for first generation learners. Journal of American Academy of Business, 6, 123–129.
Sitzmann, T., Brown, K. G., Casper, W. J., Ely, K., & Zimmerman, R. D. (2008). A review and meta-analysis of the nomological network of trainee reactions. Journal of Applied Psychology, 93, 280–295.
Sitzmann, T., Ely, K., Brown, K. G., & Bauer, K. N. (2010). Self-assessment of knowledge: A cognitive learning or affective measure? Academy of Management Learning and Education, 9(2), 169–191.
Steiner, S. D., & Hyman, M. R. (2010). Improving the student experience: Allowing students enrolled in a required course to select online or face-to-face instruction. Marketing Education Review, 20(1), 29–33.
Stewart, B. L. (2004). Online learning: A strategy for social responsibility in educational access. Internet and Higher Education, 7(4), 299–310.
Sweeney, A. D. P., Morrison, M. D., Jarratt, D., & Heffernan, T. (2009). Modeling the constructs contributing to the effectiveness of marketing lecturers. Journal of Marketing Education, 31(3), 190–202.
Tang, T. L. (1997). Teaching evaluation at a public institution of higher education: Factors related to the overall teaching effectiveness. Public Personnel Management, 26(3), 379–389.
Wachtel, H. K. (1998). Student evaluation of college teaching effectiveness: A brief review. Assessment and Evaluation in Higher Education, 23(2), 191–211.
Wayland, J., Owen Swift, C., & Wilson, J. (1996). Distance education in the principles of marketing class: Are teaching evaluations affected? In E. W. Stuart, D. J. Ortinau, & E. M. Moore (Eds.), Proceedings of the Annual Meeting of the Southern Marketing Association (pp. 234–238). New Orleans, LA.
Wood, N. T., Solomon, M. R., & Allan, D. (2008). Welcome to the Matrix: E-learning gets a second life. Marketing Education Review, 18(2), 47–53.
Wymbs, C. (2011). Digital marketing: The time for a new "academic major" has arrived. Journal of Marketing Education, 33(1), 93–106.
Young, M. R. (2005). The motivational effects of the classroom environment in facilitating self-regulated learning. Journal of Marketing Education, 27(1), 25–40.
Zahay, D., & Fredricks, E. (2009). Podcasting to improve delivery of a project-based Internet marketing course. Marketing Education Review, 19(1), 57–63.
