
European Journal of Engineering Education
ISSN: 0304-3797 (Print) 1469-5898 (Online) Journal homepage: https://www.tandfonline.com/loi/ceee20

Assessing the effectiveness of a hybrid-flipped model of learning on fluid mechanics instruction: overall course performance, homework, and far- and near-transfer of learning

David J. Harrison, Laurel Saito, Nancy Markee & Serge Herzog

To cite this article: David J. Harrison, Laurel Saito, Nancy Markee & Serge Herzog (2017) Assessing the effectiveness of a hybrid-flipped model of learning on fluid mechanics instruction: overall course performance, homework, and far- and near-transfer of learning, European Journal of Engineering Education, 42:6, 712-728, DOI: 10.1080/03043797.2016.1218826

To link to this article: https://doi.org/10.1080/03043797.2016.1218826

Published online: 17 Aug 2016.

EUROPEAN JOURNAL OF ENGINEERING EDUCATION, 2017
VOL. 42, NO. 6, 712–728
https://doi.org/10.1080/03043797.2016.1218826

Assessing the effectiveness of a hybrid-flipped model of learning on fluid mechanics instruction: overall course performance, homework, and far- and near-transfer of learning

David J. Harrison (a), Laurel Saito (b), Nancy Markee (b) and Serge Herzog (c)

(a) Instructional Design Team, University of Nevada, Reno, NV, USA; (b) Department of Natural Resources and Environmental Science, University of Nevada, Reno, NV, USA; (c) Institutional Analysis, University of Nevada, Reno, NV, USA

ABSTRACT
To examine the impact of a hybrid-flipped model utilising active learning techniques, the researchers inverted one section of an undergraduate fluid mechanics course, reduced seat time, and engaged in active learning sessions in the classroom. We compared this model to the traditional section on four performance measures. We employed a propensity score method entailing a two-stage regression analysis that considered eight covariates to address the potential bias of treatment selection. First, we estimated the probability score based on the eight covariates, and second, we used the inverse of the probability score as a regression weight on the performance of learners who did not select into the hybrid course. Results suggest that enrolment in the hybrid-flipped section had a marginally significant negative impact on the total course score and a significant negative impact on homework performance, possibly because of poor video usage by the hybrid-flipped learners. Suggested considerations are also discussed.

ARTICLE HISTORY
Received 4 September 2015
Accepted 23 July 2016

KEYWORDS
Flipped learning; educational videos; fluid mechanics; engineering education; propensity score method

1. Introduction
Active learning is recognised as being superior to passive learning (e.g. Burr 1932; Chi 2009; Freeman
et al. 2014), for the simple fact that the human brain is naturally active (Burr 1932; Wittrock 1978).
Active learning activities allow learning to occur through experience (Burr 1932) that fosters
learner involvement in, and influence of, the learning process (Weinstein and Mayer 1986). Chi
(2009) suggests a taxonomy of active learning that divides it into three activity types with increasing
learning effectiveness: active, constructive, and interactive. Active learning activities include learner-
control of pacing, such as the ability to rewind or pause videos (Schwan and Riempp 2004); construc-
tive tasks include learner-explanations (Chi et al. 1994) and summaries (e.g. Linden and Wittrock 1981;
Wittrock and Alesandrini 1990); and interactive learning includes digital pedagogical agents such as
on-screen tutors (e.g. Moreno et al. 2001), interactive multimedia (e.g. Rieber 1996; Scaife et al. 1997),
and learners working together as in problem-based learning (e.g. Barrows and Tamblyn 1980; Wilkie
2004). Problem-based learning, in which small groups of learners wrestle with a problem under the
guidance of an instructor or tutor (Wilkie 2004), appears to be an ideal fit with engineering education,
specifically fluid mechanics as addressed in the current study. In order to allow for such learning
activities to occur during class time, however, the time normally devoted to lectures needs to be
offset.

CONTACT David J. Harrison davidharrison@unr.edu


© 2016 SEFI

The flipped model allows for such active learning to occur within the classroom (Gannod, Burge,
and Helmick 2008; Strayer 2012; Gaughan 2014; Kim, Kim, et al. 2014) where the instructor can
immediately gauge learners’ understandings, clarify points, and redirect misunderstandings (Camp-
bell et al. 2014; Fraga and Harmon 2014; Love et al. 2014; Touchton 2015). The model also allows
for a more engaging environment (Enfield 2013) that leverages learner-centered activities (Mason,
Shuman, and Cook 2013) and regular group-work conducted under the direct guidance of the
instructor (Ferreri and O’Connor 2013; McLaughlin et al. 2013). The model leverages instructional
technologies to place ‘passive’ content consumption outside of the classroom (e.g. within a tra-
ditional textbook, lecture notes, or an online course management system) while ‘active’ knowledge
assimilation tasks (e.g. problem solving, labs, or other creative work) occur during classroom
contact hours (Lockwood and Esselstein 2013). Despite potential advantages of the flipped
model, researchers note that the nascent body of research examining the model is quite limited
in terms of measuring objective learning outcomes (Fraga and Harmon 2014; Moffett and Mill
2014; Blair, Maharaj, and Primus 2015), as studies relying on learners’ perceptions of learning
and quality of instruction dominate the current body of literature (Giannakos, Krogstie, and Chriso-
choides 2014).
For the study presented here, the flipped model was chosen as an appropriate base-model for
the teaching of fluid mechanics, a domain of knowledge that includes extensive problem solving.
The impetus to bring active learning into the classroom by utilising the flipped model to create
‘recitation sections’ (discussed below) was primarily driven by course evaluations completed by
previous learners. Learners indicated that they thought that they could complete the course
without live lectures, and in the place of lectures, they wanted more time to engage in in-class
problem solving. The approach used in this study, in essence, combined both flipped and
hybrid approaches rather than being a purely flipped model research study. Some researchers
would define this format as essentially a hybrid course (e.g. Utts et al. 2003; Baird and Dupin-
Bryant 2014; Stone and Clark 2015). Other researchers, however, would reject the application of
‘hybrid’ to the current study, because they define hybrid as combining digital delivery of
content while maintaining the same face-to-face classroom time (e.g. Dowling, Godfrey, and
Gyles 2003). Such discussion could be a matter of semantics, but points to an important fact
that the discussion of replaced seat time in blended, hybrid, and flipped models is still fluid and
unsettled. Such a division could be a meaningless unit of measure that may give way to the
effective integration of online and face-to-face techniques (Garrison and Kanuka 2004). Means
et al. utilise a simple definition, adopted in our study, that hybrid courses ‘blend in-class and
online activities’ (2010, xi). Thus, flipped learning resides within the hybrid model and like the
Utts et al. (2003) and Baird and Dupin-Bryant (2014) studies, our study incorporated replaced
seat time.
So that we might examine the impact of a hybrid-flipped model utilising active learning tech-
niques we inverted one section of an undergraduate fluid mechanics course, reduced seat time
while maintaining lecture contact time, and engaged in active learning sessions in the classroom.
We compared this section to the traditional section on four performance measures. We then
employed a propensity score (PS) method entailing a two-stage regression analysis that considered
eight covariates to address the potential bias of treatment selection. The paper begins with a litera-
ture review of research on the flipped model, followed by a description of the model applied to the
fluid mechanics course, and a discussion of the implications of study results. We conclude with some
suggestions for further research.

2. Flipped model research


The flipped model is centuries old (Talbert 2014), having been used in the teaching methods of
Socrates, law schools, and literature courses where learners consumed core content before
coming to class. More recently, rationale for the modern concept is traced to three separate roots:
peer-instruction, learning styles, and absentee high school learners.
Harvard University Professor Eric Mazur’s discovery of the effectiveness of peer-instruction
occurred when, through serendipity and a bit of frustration at his own inability to help his
learners understand a particular physics concept, he directed the learners to debate the
matter amongst themselves whereby they quickly grasped it (UMBCtube 2009). Citing Hake
(1998) as evidence that such an interactive-engagement technique led to increased learning,
Mazur became an early champion of peer-instruction. Using this technique within the classroom,
however, required time previously allocated for delivery of his lectures. To free up class time,
Mazur required his learners to review prepared lecture notes of the core content before class.
Once in class, Mazur engaged the learners in discussions and debates examining conceptual
considerations.
Around the same time, Lage, Platt, and Treglia sought a method that allowed economics learners
to consume content based on individual learning styles, rather than the instructor’s teaching style.
They leveraged multimedia and online technologies to invert an introductory economics course,
with core content consumed by the learners before class in a choice of lecture formats:
videotaped lectures, narrated PowerPoint, or static PowerPoint slides. The course allowed ‘events
that have traditionally taken place inside the classroom [to] now take place outside the classroom
and vice versa’ (2000, 32; emphasis original). The learners indicated their preference for the inverted
model and generally believed that it increased their learning more than the traditional model of
instruction (Lage, Platt, and Treglia 2000).
The flipped model did not garner widespread attention, however, until two high school chemistry
teachers in rural Colorado began video-taping their lectures in 2006 to help learners catch up when
missing classes due to illness or sporting events (Bergmann and Sams 2012). Bergmann and Sams
(2012) utilised the online service YouTube to host their videos, whereupon local television news
studios and teachers discovered and popularised the videos. Soon thereafter, they created a
network of interested teachers and researchers, eventually growing it into an international online
network (the Flipped Learning Network based in Arlington, VA) with an annual conference promoting
the model. Research initiatives in partnership with Pearson, Inc. and George Mason University
(Hamdan et al. 2013) extended the reach of the network. In recent years, the flipped model has gar-
nered many adherents from elementary, secondary, and tertiary levels, the attention of the mass
media, a growing base of online content resources such as Khan Academy (Mountain View, CA:
Khan Academy) and YouTube (Mountain View, CA: Google) offerings, and a small, but increasing,
body of empirical research.

2.1. Claimed benefits


The flipped model, like many innovations, promises a plethora of attractive benefits, including:

- More educational focus on active learning and higher order thinking skills (Baker 2000; Warter-Perez and Dong 2012; Redekopp and Ragusa 2013).
- Flexible teaching arrangements that allow for interactive and engaging learning within the classroom (McCray 2000; Foertsch et al. 2002; Gannod, Burge, and Helmick 2008; Dollár and Steif 2009; Enfield 2013; Redekopp and Ragusa 2013; Gaughan 2014).
- More student-to-student interaction (Baker 2000; Redekopp and Ragusa 2013; Blair, Maharaj, and Primus 2015; Touchton 2015).
- The ability to meet different learning needs (Lage, Platt, and Treglia 2000; Kim, Park, and Joo 2014; Touchton 2015).
- Increased learner self-efficacy and control of learning (Baker 2000; Dowling, Godfrey, and Gyles 2003; Enfield 2013; Love et al. 2014).
- Real-time support for higher-order learning within the classroom (Touchton 2015; Fraga and Harmon 2014; Lage, Platt, and Treglia 2000).
- More material coverage as compared to the traditional model (Mason, Shuman, and Cook 2013).

2.2. Noted limitations


Researchers cite several limitations of the flipped model, including:

- Initial effort with high start-up costs and investments of time, such as creating and editing videos (Largent 2013; Campbell et al. 2014; Giannakos, Krogstie, and Chrisochoides 2014). Campbell et al. estimated that ‘it took approximately 8 hours to prepare a 10 minutes video’, for a total of 600 hours of production time for one semester (2014, 311).
- Increased effort and funding to meet accessibility standards for those with physical and cognitive disabilities as required by law in various countries (Gaughan 2014; Blair, Maharaj, and Primus 2015).
- Initial adjustment time and extra instructor support for learners who are unfamiliar with the model (Lockwood and Esselstein 2013; Mason, Shuman, and Cook 2013; Fraga and Harmon 2014).
- Redesign of assessments to measure higher order skills and knowledge utilised in interactive classes. Such redesigned assessments will require more time for learners to take and instructors to grade (McLaughlin et al. 2013).
- Rejection of the model by learners, as educational responsibilities shift from instructors to learners, which the latter sometimes find ‘unfair or unreasonable’ (Wilson 2013, 198). This may increase dissatisfaction with course content or instructors, which can negatively impact tenure and promotion opportunities for faculty (Ferreri and O’Connor 2013).
- The potential for increased technical problems because the model is often dependent on an online course management system. This may prevent learners from accessing required materials in a timely manner (see Fraga and Harmon 2014).
- Extra editing work may be required to reduce long videos into shorter segments (Mason, Shuman, and Cook 2013). It is not uncommon for instructors to create videos lasting half the length of a typical lecture (Day and Foley 2006; Enfield 2013; Mason, Shuman, and Cook 2013), or the full length (Herold et al. 2012). Gaughan (2014) surmised from learner evaluation surveys that long video length can be a hindrance to viewing. Several studies report that learners prefer video length to be 20 minutes or less (Enfield 2013; Redekopp and Ragusa 2013; Harrison 2015). How length impacts viewing behaviours and learning outcomes, however, remains to be resolved.
- A ‘coverage problem’ wherein less course material may be covered due to the time demands of in-class active learning (Faust and Paulson 1998, 17).

The flipped model, therefore, deserves serious study, critique, and guidance; such items are only
now ramping up with the model’s popularity. While the nascent body of research is not yet settled
enough to create a cohesive summary (Fraga and Harmon 2014; Love et al. 2014), preliminary trends
can be ascertained. Attitudinal survey studies generally indicate positive acceptance of the model by
learners (e.g. Baker 2000; Lage, Platt, and Treglia 2000; McCray 2000; Foertsch et al. 2002; Cottrell and
Robison 2003; Toto and Hien 2009; Strayer 2012; Enfield 2013; Campbell et al. 2014; Kim, Park, and Joo
2014). Because the flipped model allows core content to be studied before face-to-face class time, the
model encourages in-class collaboration (Baker 2000; Ferreri and O’Connor 2013) and higher quality
in-class discussions (Herold et al. 2012; McLaughlin et al. 2013).
Empirical studies examine the flipped model in terms of comparing it to the traditional format of
in-class lectures, with evidence suggesting an overall equivalency of learning outcomes (Lage, Platt,
and Treglia 2000; Davies, Dean, and Ball 2013; McLaughlin et al. 2013; Mason, Shuman, and Cook
2013; Larson and Yamamoto 2013; Love et al. 2014; Chin 2014; Campbell et al. 2014; Blair, Maharaj,
and Primus 2015; Fraga and Harmon 2014). Some studies, however, show support for increased learn-
ing with the flipped model (Day and Foley 2006; Moravec et al. 2010; Pierce and Fox 2012; Stone 2012;
Ferreri and O’Connor 2013; McGivney-Burelle and Xue 2013; McLaughlin et al. 2013; Wilson 2013; Kim,
Park, and Joo 2014).

2.3. Media used for content consumption


Different media are used for content consumption outside of the classroom, such as text-based Web
pages (Baker 2000), textbook and readings (Ferreri and O’Connor 2013; Wilson 2013), lecture notes
(UMBCtube 2009), animations (Boyle et al. 2003; Stone 2012), and interactive elements with self-
assessments (Utts et al. 2003; Strayer 2012). The overwhelming method is the use of video media
(Giannakos, Krogstie, and Chrisochoides 2014), divided into two main video recording types: on-
screen presentations and live-classroom presentations. Video recordings of on-screen presentations
(e.g. narrated slides and screen captures) are common (e.g. Lage, Platt, and Treglia 2000; Carlisle
2010; Moravec et al. 2010; Enfield 2013; Chin 2014; Fraga and Harmon 2014; Love et al. 2014;
Moffett and Mill 2014; Blair, Maharaj, and Primus 2015). Similarly, the recordings of live-classroom
presentations, often from previous semesters, are frequently found (e.g. McCray 2000; Foertsch
et al. 2002; Toto and Hien 2009; Ronchetti 2010; Pierce and Fox 2012; Gehringer and Peddycord
2013; Mason, Shuman, and Cook 2013; McLaughlin et al. 2013). Some instructors choose to edit
live lectures so as to decrease overall length (Mason, Shuman, and Cook 2013; McLaughlin et al.
2013), record the live video outside of a classroom setting (Day and Foley 2006), or combine pres-
entation slides with live lectures (Herold et al. 2012). A small segment of instructors choose to use
existing videos created by others, such as Khan Academy videos (Wilson 2013), publisher videos
(Davies, Dean, and Ball 2013; Larson and Yamamoto 2013), and YouTube videos (Kim, Park, and
Joo 2014).

3. Application of the flipped model to an undergraduate fluid mechanics course


In Fall 2013, the flipped learning model was applied to one section of an introductory fluid mechanics
course at a large western United States university. The semester-long course covers fluid mechanics
material such as continuity, energy, and momentum concepts of fluid flow, and applications such as
fluid statics and simple pipe flow, with an emphasis on conditions related to water. The course typi-
cally has 80–100 learners and has been taught with two-and-a-half hours of lecture each week and
voluntary ‘recitation sections’ where learners can work on problems with the instructor and the teach-
ing assistant. These recitation sections are typically much smaller in terms of attendance, allowing for
more student-to-instructor interaction. Conversely, the lecture format does not allow for frequent
student-to-instructor interaction but rather focuses on the delivery of content to a large number
of learners.
The majority of learners in the course are engineering students (civil engineering, environmental
engineering, geological engineering, or mechanical engineering) with the remaining learners major-
ing in ecohydrology or hydrogeology. Attendance at the voluntary recitations is traditionally very low.
Few undergraduate students typically visit the instructor or teaching assistant during their office
hours. Thus, undergraduate students do not spend much time during the semester interacting
with the instructor or the teaching assistant outside of lecture.
After teaching the course for the first time in 2012, the instructor decided to explore teaching the
course in a flipped mode because course evaluations from Fall 2012 indicated some learners felt that
lectures were primarily covering material in the book and they preferred to be able to work on pro-
blems during the class period instead. Thus, in Fall 2013, the instructor broke her class into two sec-
tions: one utilising traditional lectures and one utilising the hybrid-flipped model.

4. Methods
4.1. Treatments
4.1.1. Traditional section
The traditionally taught section met twice a week for a total of two-and-a-half hours each week. The
instructor utilised the traditional lecture format, disseminating instruction through the use of mul-
tiple white boards and an overhead video projector. The lecture classroom was designed to seat
about 90 learners and allowed very little two-way interaction between the instructor and learners;
it was thus a predominantly passive learning environment. One weekly recitation session (described
below), conducted by a teaching assistant and lasting 50 minutes, was available for voluntary
attendance.

4.1.2. Hybrid-flipped section


All lecture sessions from the traditional section were filmed, uploaded to the university’s video repo-
sitory by a graduate teaching assistant, and made available for learners in the hybrid-flipped section
within hours of recording. The professor’s voice, whiteboard content, and images shown on the over-
head projector (when appropriate) were captured and thus very closely represented the passive in-
class environment learners in the traditional course experienced. This was done to ensure that the
amount of available lecture contact time, whether in-person or virtual through the online videos,
was the same for both groups. The videos were divided into shorter topic-based videos that were
generally less than 15 minutes each, resulting in a total of 170 videos, but contained the same
amount of lecture time as provided in the traditional format.
The hybrid-flipped section met for face-to-face instruction once a week in a recitation section for 1
hour and 15 minutes. The learners were encouraged to come to class prepared to work on problems
by choosing to view the videos (as recorded within the traditional section, mentioned above), review
the textbook, or do both.
During the recitation session, the participants were broken into small groups of three learners to
work together on problems, based on the problem-based learning model that has its origin in
medical education, tracing its roots to Case Western University in the 1960s (Wilkie 2004). The instruc-
tor changed groups each session to ensure consistent active participation by all, rather than let a
‘leader’ with ‘passive learners’ emerge for each group over the semester. She circulated amongst
the learners to provide assistance and answer questions, and when the majority of learners had com-
pleted a problem, she worked through the problem on the whiteboard. Typically, about four to six
problems were addressed in each recitation period.
These recitation sections allowed two activities to occur that were not available with the tra-
ditional format: (1) the learners engaged in active learning within the formal environment of the
class, where they were required to attend and engage, and (2) the instructor could engage in intimate
two-way dialogue with the learners both at the small-group level as she circulated amongst them and
at the whole-group level when she discussed the problems on the white board. In the traditional
format, the learners normally sat passively at their seats, taking notes, and rarely engaged in dialogue
with the instructor.

4.2. Participants
Learners enrolled in both sections included in this study totalled 59, all being juniors or seniors. In an
effort to reduce confounding variables, four categories of learners were removed from this study so
as to arrive at the study’s 59 participants: (1) graduate students; (2) learners who were not engineer-
ing students, because of differences in academic backgrounds; (3) learners who did not complete the
course, because these students completed varying numbers of assessments; and (4) learners
who attended both the traditional and experimental sections, as these learners accumulated contact
time above-and-beyond that of all other learners.
Participants from the traditional section totalled 35, of which 8 were females and 27 males. Within
the field of engineering majors, 21 were pursuing bachelor’s degrees in civil engineering, 10 in geo-
logical engineering/mining, 3 in environmental engineering, and 1 in mechanical engineering. Par-
ticipants from the hybrid-flipped section totalled 24 participants, of which 10 were females and 14
males. Within the field of engineering, 21 were pursuing bachelor’s degrees in civil, 1 in environ-
mental, and 2 in geological engineering/mining.

4.3. Assessments
We used several types of learning measurements to determine learning performance in multiple cat-
egories: overall course grade, homework assignments, worked items on exams, and non-worked
items on exams. Both sections received the same assessment instruments. The overall course
grade was calculated from scores on two mid-term examinations, three quizzes, nine homework
assignments, one final examination, and attendance. The nine homework assignments consisted
of worked problems (to measure far-transfer of learning). Worked problems are defined as problems
in which learners derived their answers by applying equations and assumptions and wrote out the
derivations on paper. Far-transfer of learning refers to the transfer of knowledge and skills from
instructional materials to new situations in which learners engage in new problems and scenarios
(Fiorella and Mayer 2015). For example, learners might be given information about a fluid body
and asked to calculate the reaction forces needed to keep a gate closed that was enclosing the
fluid. Because learners were allowed to work together on homework problems, but were required
to work independently on exams, performance on worked exam items (mid-terms and the final) was
assessed separately from performance on worked homework items. Worked problems on exams
were similar to homework problems and also measured far-transfer of learning, characterised by
requiring learners to derive their answers by applying equations and assumptions, but learners
only had access to an equation sheet (i.e. the exams were closed-book) and had to work alone.
Non-worked items on the same examinations measured near-transfer of learning. Near-transfer of
learning involves rote learning of facts and knowledge that is fragmented in nature (Fiorella and
Mayer 2015), such as recognition of concepts and assumptions devoid of application to novel situ-
ations. These items were multiple-choice problems and did not require calculation or derivation to
solve. For example, learners might be asked to state where a force would be highest for a given
fluid situation and would be given three options, but learners were not asked to calculate the
actual force. Twenty-seven multiple-choice items spanned the three quizzes and three examinations.
Video usage was tracked through the university’s video hosting platform, which measured partici-
pant access and length of view in increments of 25%.
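
As a purely illustrative sketch (the hosting platform’s actual export format is not documented here), per-video progress events reported in 25% increments could be aggregated into each learner’s share of the total available video time roughly as follows; the record layout, field names, and values are hypothetical:

```python
# Hypothetical aggregation of quartile-based viewing records into minutes viewed
# and percentage of total available video time per learner.
from collections import defaultdict

# (learner_id, video_id, furthest_quartile_reached); one quartile = 25% of that video
view_events = [
    ("s01", "v001", 4),   # watched all of video v001
    ("s01", "v002", 2),   # reached 50% of video v002
    ("s02", "v001", 0),   # opened v001 but registered no progress
]
video_length_min = {"v001": 12.5, "v002": 9.0}  # hypothetical video lengths in minutes

total_available = sum(video_length_min.values())
minutes_viewed = defaultdict(float)
for learner, video, quartile in view_events:
    minutes_viewed[learner] += (quartile / 4.0) * video_length_min[video]

for learner, minutes in minutes_viewed.items():
    print(f"{learner}: {minutes:.1f} min viewed, {100 * minutes / total_available:.1f}% of total")
```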

4.4. Statistical analysis


Our observational study examined learners enrolled in both the traditional and hybrid-flipped ver-
sions of a course on fluid dynamics. Due to the small sample size and the possible presence of selec-
tion bias (i.e. into either the traditional or hybrid-flipped sections), we employed a PS method
entailing a two-stage regression analysis. The first part established the PS based on eight baseline
characteristics as covariates (discussed below). Following this, we used the PS as a regression
weight on the performance of learners who did not select into the hybrid course, thus allowing us
to balance the selection bias and thereby estimate the average treatment effect (Austin 2011b).
This method, known as the inverse probability of treatment weighting (IPTW), balances the scores
between treatment groups so as to mitigate bias associated with observable student characteristics
(see Pattanayak, Rubin, and Zell 2011). Propensity scoring methods are popular in medical studies
where randomised control trials are not advisable or ethical, such as post-surgery treatment
(McCluskey et al. 2013), costs associated with hip replacements (Nikitovic et al. 2013), and smoking
cessation programmes (Austin 2011a). In such studies, the establishment of causality is desired, as
in randomised control studies (Cochran, Paul, and Chambers 1965), yet differing baseline character-
istics (covariates) between non-randomised groups does not allow for the unbiased estimation of the
average treatment effect (Austin 2011b). Therefore, such differences need to be controlled. In
essence, PS methods are designed to establish a balance in the pre-treatment, baseline character-
istics between the control and treatment groups. Covariates of interest must exist prior to adminis-
tration of the treatments, be predictive of treatment selection, and related to outcome
measurements. In other words, the covariates are: pre-treatment conditions, influencing factors on
learner selection behaviour, and related to academic performance in the course.
In this study, the eight covariates we included were academic variables and traditional demo-
graphic data: transfer status (whether the learner transferred in from another institution), gender,
age, class standing, major (civil engineering), total credits earned, overall university grade point
average (GPA), and GPA on course pre-requisites. Gender was selected as a traditional variable,
while the remaining six are strongly related to academic outcomes (see Astin 1993; Pascarella and
Terenzini 2005). While there were several majors present within the learners, by far the largest
group represented was civil engineering. Due to low sample numbers of the other majors, only
civil engineering as a major was included as a predictive factor in the PS model.
Once the PSs for each learner were estimated, weighting of each learner was as follows: each
learner in the hybrid-treatment group was weighted as 1 (because these learners did enrol in the
treatment section); each learner who did not enrol in the hybrid-flipped section was weighted
by the inverse of one minus their PS: 1/(1 − PS). For instance, if Learner X was highly likely to select
into the treatment group (e.g. PS of 0.9) but, instead, enrolled in the traditional course, that case
would be weighted as:
1/(1 − 0.9) = 1/0.1 = 10.
If Learner Y was less likely to select into the treatment group (e.g. PS of 0.1), that case would be
weighted as:
1/(1 − 0.1) = 1/0.9 = 1.11.
This method is known as the IPTW, wherein the PS is used to ‘create a synthetic sample’ where
treatment status and the distribution of baseline characteristics are independent (Austin 2011b,
408). In essence, this addresses the selection bias (based on the eight covariates), allowing residual
outcome measures to accurately reflect impact of the treatment conditions in terms of the
average treatment effect. For more detailed discussions of PS methods, see Reynolds and DesJardins
(2009), Austin (2011a, 2011b), Pattanayak, Rubin, and Zell (2011), and Herzog (2014).
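
As an illustration only, the following minimal sketch (Python, using pandas and statsmodels) mirrors the two-stage procedure described above on synthetic stand-in data. The column names and the synthetic roster are assumptions for illustration, not the study’s data, and the weighting follows the rule stated above: treated learners receive weight 1, all others 1/(1 − PS).

```python
# Two-stage propensity score / IPTW sketch on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 59  # sample size matching the study
df = pd.DataFrame({
    "treated":     rng.integers(0, 2, n),        # 1 = hybrid-flipped section
    "transfer":    rng.integers(0, 2, n),
    "gender":      rng.integers(0, 2, n),
    "age":         rng.integers(20, 30, n),
    "senior":      rng.integers(0, 2, n),
    "civil_major": rng.integers(0, 2, n),
    "credits":     rng.integers(60, 140, n),
    "univ_gpa":    rng.uniform(2.0, 4.0, n),
    "prereq_gpa":  rng.uniform(2.0, 4.0, n),
    "attendance":  rng.uniform(0.5, 1.0, n),
    "total_score": rng.uniform(50, 100, n),
})

# Stage 1: logistic regression of section choice on the eight baseline covariates,
# yielding each learner's propensity score (PS).
ps_model = smf.logit(
    "treated ~ transfer + gender + age + senior + civil_major"
    " + credits + univ_gpa + prereq_gpa",
    data=df,
).fit(disp=0)
df["ps"] = ps_model.predict(df)

# Stage 2: IPTW weights (treated = 1, untreated = 1/(1 - PS)), then a weighted
# regression of the outcome on treatment status and covariates.
df["iptw"] = np.where(df["treated"] == 1, 1.0, 1.0 / (1.0 - df["ps"]))
ate_model = smf.wls(
    "total_score ~ treated + transfer + gender + age + senior + civil_major"
    " + univ_gpa + prereq_gpa + attendance",
    data=df,
    weights=df["iptw"],
).fit()
print(ate_model.params["treated"])  # coefficient interpreted as the average treatment effect
```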

5. Results
5.1. Selection bias
The first-stage estimation model exhibited good statistical fit (Hosmer-Lemeshow = .726; Nagelkerke
R-square = .37) and yielded a learner’s selection probability that we used in inverse probability
weighting to ensure learners with greater selection propensity who enrolled in the regular course
were weighted higher in the analysis (see Herzog 2014; Reynolds and DesJardins 2009). Table 1
shows the PS Logit estimation that includes the variables used in this first-stage estimation.
Results suggest that enrolment in the hybrid-flipped section had a marginally significant negative
impact on the total course score (t = −1.987, p = .053), seen in Table 2, and a significant negative
impact on homework performance (t = −2.841, p = .007), seen in Table 3. Effect sizes were equivalent
to 0.37 and 0.71 standard deviations, respectively. The negative effect indicates that hybrid-flipped
section learners performed worse on homework and total course score than learners enrolled in
the traditional section.

Table 1. Logit model for PS estimation.


Variable B SE Wald df Sig. Exp (B)
Out of state transfer 0.088 0.934 0.009 1 0.925 1.092
Gender −0.654 0.772 0.719 1 0.396 0.520
Age −0.490 0.268 3.344 1 0.067 0.613
Senior class standing 0.597 0.787 0.576 1 0.448 1.817
Major – civil engineering 2.231 0.856 6.793 1 0.009 9.314
Total credits earned 0.034 0.021 2.436 1 0.119 1.034
Overall university GPA 0.767 1.017 0.568 1 0.451 2.153
Pre-requisite GPA 0.103 0.853 0.015 1 0.904 1.108
Constant 2.434 4.604 0.280 1 0.597 11.408

Table 2. Estimation for overall course score.
Note: B and SE are unstandardised coefficients; Beta is the standardised coefficient; Tolerance and VIF are collinearity statistics.

Variable B SE Beta t Sig. Tolerance VIF
(Constant) 48.694 24.321 2.002 0.051
Treatment group −5.999 3.019 −0.192 −1.987 0.053 0.717 1.395
Out of state transfer 5.344 4.165 0.119 1.283 0.205 0.787 1.270
Gender −2.374 3.004 −0.074 −0.790 0.433 0.773 1.293
Age 1.596 0.840 0.180 1.901 0.063 0.752 1.330
Senior class standing −3.910 3.056 −0.118 −1.28 0.207 0.795 1.258
Major – civil engineering 5.407 4.293 0.113 1.260 0.214 0.839 1.191
Overall university GPA 27.593 4.590 0.849 6.012 0.000 0.337 2.967
Pre-requisite GPA −3.120 3.757 −0.104 −0.831 0.401 0.431 2.318
Attendance 3.802 2.295 0.173 1.657 0.104 0.617 1.620

Of the other covariates in the model, only cumulative GPA prior to course
enrolment exhibited a significant impact on all measured outcomes, with effect sizes ranging from
0.6 to 0.85 standard deviation changes in the outcome for a 1 standard deviation change in cumu-
lative GPA. Since collinearity is within tolerable limits according to the variance inflation factor (VIF <
3.0) for all covariates, the findings suggest that cumulative academic performance prior to course
enrolment is paramount in explaining variation in the measured outcomes.
We examined the impact of the hybrid-flipped model on higher-order thinking skills, as assessed
by far-transfer learning items (discussed above). As seen in Table 4, the hybrid-flipped format had no
significant impact on such items (t = 0.642, p = .524). Although lower-order thinking skills as reflected
in near-transfer items (discussed above) were lower for the hybrid-flipped course, differences were
not significant (t = −1.463, p = .150), as seen in Table 5.

Table 3. Estimation for total homework score.
Note: B and SE are unstandardised coefficients; Beta is the standardised coefficient; Tolerance and VIF are collinearity statistics.

Variable B SE Beta t Sig. Tolerance VIF
(Constant) 6.211 20.324 0.306 0.761
Treatment group −7.166 2.523 −0.372 −2.841 0.007 0.717 1.395
Out of state transfer 0.627 3.480 0.022 0.180 0.858 0.787 1.270
Gender −0.335 2.510 −0.017 −0.133 0.894 0.773 1.293
Age −1.817 2.554 −0.088 −0.711 0.480 0.795 1.258
Senior class standing 1.618 3.587 0.055 0.451 0.654 0.839 1.191
Major – civil engineering 11.447 3.835 0.570 2.985 0.004 0.337 2.967
Overall university GPA 1.174 0.702 0.214 1.672 0.101 0.752 1.330
Pre-requisite GPA −3.207 3.139 −0.172 −1.022 0.312 0.431 2.318
Attendance 3.245 1.918 0.239 1.692 0.097 0.617 1.620

Table 4. Estimation for worked items (far-transfer learning) scores on quizzes and exams.
Note: B and SE are unstandardised coefficients; Beta is the standardised coefficient; Tolerance and VIF are collinearity statistics.

Variable B SE Beta t Sig. Tolerance VIF
(Constant) 113.73 63.43 1.793 0.079
Treatment group 5.055 7.873 0.083 0.642 0.524 0.717 1.395
Out of state transfer 7.556 10.862 0.085 0.696 0.490 0.787 1.27
Gender −11.000 7.833 −0.174 −1.404 0.167 0.773 1.293
Age −6.330 7.970 −0.097 −0.794 0.431 0.795 1.258
Senior class standing 7.128 11.195 0.076 0.637 0.527 0.839 1.191
Major – civil engineering 36.456 11.970 0.571 3.046 0.004 0.337 2.967
Overall university GPA 1.776 2.190 0.102 0.811 0.421 0.752 1.330
Pre-requisite GPA 3.942 9.798 0.067 0.402 0.689 0.431 2.318
Attendance −0.887 5.986 −0.021 −0.148 0.883 0.617 1.620

Table 5. Estimation for non-worked items (near-transfer learning) scores on quizzes and exams.
Note: B and SE are unstandardised coefficients; Beta is the standardised coefficient; Tolerance and VIF are collinearity statistics.

Variable B SE Beta t Sig. Tolerance VIF
(Constant) 14.173 14.755 0.961 0.342
Treatment group −2.679 1.831 −0.200 −1.463 0.150 0.717 1.395
Out of state transfer 3.487 2.527 0.180 1.380 0.174 0.787 1.270
Gender −0.255 1.822 −0.018 −0.140 0.889 0.773 1.293
Age 0.001 1.854 0.000 0.001 1.000 0.795 1.258
Senior class standing 4.100 2.604 0.199 1.574 0.122 0.839 1.191
Major – civil engineering 8.371 2.784 0.600 3.006 0.004 0.337 2.967
Overall university GPA 0.266 0.510 0.070 0.523 0.603 0.752 1.330
Pre-requisite GPA −0.154 2.279 −0.012 −0.068 0.946 0.431 2.318
Attendance −0.693 1.392 −0.073 −0.498 0.621 0.617 1.620

5.2. Video viewing


Of the total amount of video instruction (21 hours, 57 minutes, and 57 seconds), learners in the
hybrid-flipped section watched only 23.89%. Two learners did not watch any videos, and the most
watched by one learner was 81.91% of total video time.
Viewing habits as measured by minutes viewed do not show correlations with the four dependent
measures used in our study (skewness and kurtosis of minutes viewed were within acceptable par-
ameters, allowing Pearson Correlation to be conducted). Video viewing did not significantly correlate
with the overall course total, r(22) = .213, p = .318; homework, r(22) = −0.012, p = .956; far-transfer
items, r(22) = 0.317, p = .131; and near-transfer items, r(22) = 0.359, p = .085.
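
For readers who wish to run the same kind of check, a minimal sketch of this correlation analysis (Python with SciPy, shown here on synthetic placeholder values rather than the study data) might look as follows:

```python
# Pearson correlations between minutes of video viewed and each outcome measure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 24  # hybrid-flipped section size; Pearson degrees of freedom = n - 2 = 22
minutes_viewed = rng.uniform(0, 1318, n)  # ~21 h 58 min of video available in total
outcomes = {
    "course_total":  rng.uniform(50, 100, n),
    "homework":      rng.uniform(0, 100, n),
    "far_transfer":  rng.uniform(0, 100, n),
    "near_transfer": rng.uniform(0, 27, n),
}

for name, values in outcomes.items():
    r, p = stats.pearsonr(minutes_viewed, values)
    print(f"minutes viewed vs {name}: r({n - 2}) = {r:.3f}, p = {p:.3f}")
```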

6. Discussion
We attempted to ascertain the impact of a hybrid-flipped model on fluid mechanics instruction by
inverting one section of an undergraduate course on fluid mechanics, with reduced seat time that
was replaced with online viewing of lecture videos, and comparing it to a section taught with traditional lectures.
We compared learning performance on total course grades, homework scores, and far- and near-
transfer learning items. By using the PS as a regression weight to address potential bias due to unba-
lanced distribution of baseline characteristics across the groups, we attempt to render the treatment
and control group alike based on observable pre-treatment characteristics (Herzog 2014). Bai
describes the PS as ‘used to reduce the selection bias through balancing groups based on the
observed covariates’ by estimating the probability of a subject ‘being assigned to a particular con-
dition in a study given a set of observed covariates’ (2011, 274).
We found that the hybrid-flipped format trended towards negative differences in overall course
score and significant negative differences in homework scores, that is, learners in the hybrid-flipped
section did worse than those in the traditional section. While some studies examining the hybrid-
flipped model found significant positive impacts on overall scores (e.g. Day and Foley 2006;
Moravec et al. 2010; Pierce and Fox 2012; Wilson 2013), our study instead agreed with other
research showing that the model does not significantly impact overall scores (e.g. Blair, Maharaj,
and Primus 2015; Fraga and Harmon 2014; Davies, Dean, and Ball 2013; Lage, Platt, and Treglia
2000; Utts et al. 2003). We found, in fact, that the model has the potential to negatively impact
homework performance, which in turn affects overall course grades (likely due to low viewing
rates of the videos, discussed below). The strongest predictor for learner performance was under-
graduate GPA, which indicates that the method of content delivery may not be as important as
individual learner motivation to learn as evidenced by prior academic performance (see Astin
(1993) and Pascarella and Terenzini (2005) for discussions of the impact of prior performance on
future performance).
While some researchers report that the model has the potential to significantly improve higher
order learning (far-transfer of knowledge) because it allows for the immediacy of the instructor as
learners engage in difficult cognitive processing (e.g. Stone 2012; Redekopp and Ragusa 2013; Touch-
ton 2015), our research did not support such findings. We found, instead, that the hybrid-flipped
model did not lead to significant differences between higher order (far-transfer) and lower order
(near-transfer) items on the examinations.
A possible reason for the lower performance in overall course and homework scores, and the lack
of support evidenced by our results for increased higher order learning in the hybrid-flipped section,
could have been the video viewing habits of the participants. Participants were not required to watch
the videos before attending class, which could have led to low patterns of viewing (only 23.89% of
available viewing time on average). While one cannot directly compare the viewing statistics of the
hybrid-flipped section with face-to-face attendance statistics of the traditional section, it is worth
noting that the traditional section learners averaged 93.37% on their attendance measurement as
computed by randomly taking attendance 24 times throughout the semester. However, Pearson cor-
relation statistics indicate that video viewing habits did not significantly correlate with the four out-
comes measurements. In addition, videos were available to learners in the traditional section, and
some of the learners in that section did watch them.
While the comparison between the two sections did not reveal a clear learning advantage for either
the traditional or the hybrid-flipped format, course evaluation ratings indicated that learners in the
traditional section rated the course higher (average 4.02 out of 5; N = 35) than learners in the
hybrid-flipped section (average 3.65; N = 22). However, comments and ratings from learners in the
hybrid-flipped course were overwhelmingly positive about the hybrid-flipped format, as seen in Table 6.

Table 6. Average course evaluation responses from learners in the 2013 hybrid-flipped format section.
Course evaluation question Response
I felt I learned the material better with the hybrid format 4.09
I felt the hybrid format hindered my learning of material 1.91
It was helpful to work in groups on the recitation problems 4.45
Working in groups on the recitation problems confused my understanding of the material 1.82
The instructor was helpful in going over the recitation problems 4.32
The instructor confused my understanding when going over the recitation problems 1.91
The recitation problems were helpful for learning the material 4.68
The recitation problems confused my understanding of the material 1.71
I would recommend offering the course again in the hybrid format rather than the traditional lecture format 4.23
I would recommend offering the course only in the traditional lecture format with voluntary recitations 2.05
I would take another class (i.e. besides hydrologic fluid dynamics [mechanics]) in the hybrid format 4.23
I would take another class (i.e. besides hydrologic fluid dynamics [mechanics]) in the traditional format (required lectures and voluntary problem session) 3.33
Note: N = 22.
1 = strongly disagree; 2 = disagree; 3 = neutral; 4 = agree; 5 = strongly agree.

While the results did not necessarily support the extra time and effort involved in creating the
flipped format (e.g. creating and editing the videos), three advantages of the model became
readily apparent, encouraging the instructor to continue refining the model in future semesters.
First, the model allowed for active learning in the form of problem-based learning to occur in the
formal classroom. This format provided immediate tutelage by the course instructor, which was
not available in the traditional format because few learners attended voluntary recitations or office
hours, so most learners waited until the next class period to access the instructor. Second, the
model encouraged and facilitated interactive dialogue between the learners and the instructor, inter-
action that was not present in the lecture hall atmosphere. Third, the model released a time-slot for
the lecture hall or classroom space for other courses. We surmise that using a flipped-type format
certainly assists in bringing active learning into the course (with or without the hybrid aspect), but our
experiment also reveals pitfalls that must be addressed, namely poor viewing rates of the videos and
the costs of producing the video content.

6.1. Limitations
This study examined a relatively small convenience sample; therefore, our results may not be gener-
alisable to a wider population. The statistical model utilised for this study, weighted regression (using
IPTW), however, addresses the potential bias arising from student self-selection inherent in observa-
tional studies. As mentioned above, PS methods assume that all pertinent baseline characteristics
that could impact subjects’ selection into the traditional or hybrid-flipped sections have been
accounted for. Lack of such accounting can limit the effectiveness of PS methods (Bai 2011). We
endeavoured to include these covariates, but as Austin notes, determining exactly what baseline
characteristics should be considered in PS models is not a set science due to the current ‘lack of con-
sensus in the applied literature as to which variables to include’ (2011b, 414), while conversely ‘not all
variables related to treatment and outcome need to be included’ when a wide variety of baseline
characteristics are included due to correlation between the non-included covariates and the included
covariates (Herzog 2014, 24).
In addition, the nature of the content and reliance on worked problems within the recitation sec-
tions limits this study to those content areas that similarly rely on worked problems, such as STEM
education (science, technology, engineering, and mathematics). Furthermore, while the homework
assessments allowed for either individual or collaborative work, the mid-term and final examinations
did not allow for collaborative work. This may have meant that such assessments were related to the
traditional format of learning, thus potentially negatively impacting the learners in the hybrid-treat-
ment group. Such individual assessment, however, was necessary due to accreditation standards and
industry certification practices. We were unable to parse out such potential impact.
Due to the real-world setting of this study, the researchers were unable to ensure that learners in
the hybrid-flipped course watched all the videos, or that they did so under the same environmental
conditions. This fact may partially account for the non-significant results. As mentioned above,
watching the videos was not required, and it is doubtful that, even if such a stipulation had been
imposed, the learners would have complied at rates similar to those achieved by studies conducted
under controlled laboratory conditions. The conditions of the current study, we would posit, are
closer to those experienced by instructors than those imposed in a controlled laboratory. Thus, our
results may be of more interest to instructors who contend with the same learner behaviours than
results obtained purely under laboratory conditions.
The learners in the traditional section had optional recitation sessions available to them, led by a
teaching assistant. Although we recorded attendance at these sessions, we were not able to take such
attendance into account within our statistical analysis because only nine of the learners took advan-
tage of these sessions, with a total participation rate of 7.14% (averaging attendance across all
15 sessions, which ranged from a low of 1 attendee to a high of 6 attendees, out of 35 learners). Thus,
attendance at these sessions may have impacted the learning outcomes of the traditional learners
who did attend, but the sample size was too low to warrant analysis.

7. Conclusion and suggestions for further research


The hybrid-flipped format allowed the instructor to interact with the learners to a much higher
degree than the traditional lecture format, especially as the latter format typically involves learner
enrolments several times larger, which also require larger lecture halls.
learned from conducting the study and initial impressions of the data, during the 2014 academic
year (following this study), the instructor offered two sections that both utilised the hybrid-flipped
model; there was no traditional lecture section available. Length of in-class contact time was
increased to 110 minutes per week, and this increase appears to have been effective. Cursory exam-
inations of learner course evaluations, however, showed a general distaste for the hybrid-flipped
model, perhaps due to two factors: there was no option for a traditional section, such that learners
could not self-select the hybrid-flipped model; and the sections contained twice the number of lear-
ners as the hybrid-flipped section in the present study conducted in 2013. The latter factor may have
led to a decrease in the effectiveness of the instructor’s presence, upon which the hybrid-flipped
model relies, along with a decrease in learners’ perceptions of the instructor’s teaching. In the fol-
lowing year, 2015, the instructor again offered a traditional lecture section and a hybrid-flipped
section, in which 17 learners enrolled. Learner feedback regarding the 2015
hybrid-flipped section reversed the 2014 negative reviews, with generally positive feedback about
the format. While the instructor did try to incentivise preparing for class sessions by using three
pop quizzes in 2013, for 2015 she implemented regular and frequent quizzes throughout the seme-
ster (i.e. learners were told that there would be weekly quizzes that were given at the same time each
week). Overall class performance at the end of the semester in the hybrid section in 2015 was much
higher than for the regular section (i.e. 42% received a B+ or higher in the class, as compared to 22%
in the regular class). Thus, more frequent quizzing may be useful for incentivising coming to class
prepared for a hybrid-flipped format.
We believe that five main lessons can be learned from our study and reflections on subsequent
usage of the model in 2014 and 2015:

(1) Instructors cannot assume that learners will watch, read, etc., the core content before coming to
class. Incentives must be considered, for example, regular and frequent accountability quizzes,
and specific explanations of the benefits of watching, reading, etc., each video, chapter, etc.
The propensity of learners to skip such preparation, as we observed, may explain the negative
impacts of the model on academic performance. These possibilities deserve further inquiry.
(2) The ‘hybrid’ portion of the hybrid-flipped model should not be as severe as employed in our
study. Learners need more in-class time to work on problems than our study allowed. Sub-
sequent semesters were altered to allow for more in-class time, which appears to be the
proper direction.
(3) The provision of choice may be an important consideration: allow learners the option to choose a
flipped section or a traditional section.
(4) The size of the class may also be an important consideration. In our study, the hybrid-flipped
section comprised 24 learners, while the following semester allowed over 40. This increase in
enrolment was anecdotally associated with a decrease in the instructor’s effectiveness during
active learning sessions, which was reversed in 2015 when only 17 learners were enrolled.
(5) An instructor learning curve may also be a factor that impacts the effectiveness of the flipped
model. The hybrid-flipped model is now in its third year of employment and the instructor indi-
cated that she feels her effectiveness in the active learning sessions has increased since the initial
study was conducted.

Given the fourth lesson above, the scalability of the hybrid-flipped model is, therefore, a concern.
More research is needed to determine the limits of course enrolment upon the flipped model’s effec-
tiveness, logistics, and learner perceptions of quality. This concern is the flipped model’s Achilles’
heel: as the model relies on the immediacy of the instructor to the learners, does the model
require smaller class enrolments and smaller classrooms in order to be effective? In other words,
does the model require more instructors and more classroom space? If so, its implementation will
be severely hampered in tertiary settings where instructors and classroom space are already under
strain.
Finally, instructors and institutions considering a flipped model must undertake a cost-benefit
analysis. The cost in terms of time, materials, and the funding of a graduate learner necessary to
record, edit, and upload the videos was a significant factor in the present study. As noted by other
researchers (Largent 2013; Campbell et al. 2014; Giannakos, Krogstie, and Chrisochoides 2014), the
start-up costs of implementing the model can be significant. Changes in textbook editions, instructors,
nationally normed examinations, etc., may require partial or complete overhaul of the pre-recorded
lectures. If such changes occur fairly soon after the initial investment, the flipped model may turn out
to be more expensive compared to traditional lectures for similar learning performance.

Acknowledgements
The authors would like to thank John Volk for filming, editing, and uploading the lecture videos.

Disclosure statement
No potential conflict of interest was reported by the authors.

Notes on contributors
Mr David J. Harrison is an instructional designer at the University of Nevada, Reno and a doctoral candidate at Old
Dominion University.
Dr Laurel Saito is an Associate Professor in the Department of Natural Resources and Environmental Science at the Uni-
versity of Nevada, Reno and Director of the Graduate Programme of Hydrologic Sciences.
Dr Nancy Markee is associate professor and assistant department chair in the Department of Natural Resources and
Environmental Science.
Dr Serge Herzog is the Director of Institutional Analysis at the University of Nevada, Reno.

ORCiD
David J. Harrison http://orcid.org/0000-0003-1634-7618
Laurel Saito http://orcid.org/0000-0003-3617-3133

References
Astin, Alexander W. 1993. What Matters in College? Four Critical Years Revisited. San Francisco, CA: Jossey-Bass.
Austin, Peter C. 2011a. “A Tutorial and Case Study in Propensity Score Analysis: An Application to Estimating the Effect of
In-hospital Smoking Cessation Counseling on Mortality.” Multivariate Behavioral Research 46 (1): 119–151. doi:10.1080/
00273171.2011.540480.
Austin, Peter C. 2011b. “An Introduction to Propensity Score Methods for Reducing the Effects of Confounding in
Observational Studies.” Multivariate Behavioral Research 46 (3): 399–424. doi:10.1080/00273171.2011.568786.
Bai, Haiyan. 2011. “Using Propensity Score Analysis for Making Causal Claims in Research Articles.” Educational Psychology
Review 23 (2): 273–278. doi:10.1007/s10648-011-9164-9.
Baird, Deborah K., and Pamela A. Dupin-Bryant. 2014. “The Development of Procedures and Policies for Undergraduate
Hybrid Courses: A Comparison Study.” Issues in Information Systems 15 (2): 441–449.
Baker, J. W. 2000. “‘The Classroom Flip’: Using Web Course Management Tools to Become the Guide by the Side.” Selected
Papers from the 11th International Conference on College Teaching and Learning.
Barrows, Howard S., and Robyn M. Tamblyn. 1980. Problem-based Learning: An Approach to Medical Education. New York:
Springer.
Bergmann, Jonathan, and Aaron Sams. 2012. Flip Your Classroom: Reach Every Student in Every Class Every Day. Eugene, OR:
International Society for Technology in Education.
Blair, Erik, Chris Maharaj, and Simone Primus. 2015. “Performance and Perception in the Flipped Classroom.” Education
and Information Technologies, 1–18. doi:10.1007/s10639-015-9393-5.
Boyle, Tom, Claire Bradley, Peter Chalk, Ray Jones, and Poppy Pickard. 2003. “Using Blended Learning to Improve Student
Success Rates in Learning to Program.” Journal of Educational Media 28 (2–3): 165–178. doi:10.1080/
1358165032000153160.
Burr, Samuel Engle. 1932. “Active Versus Passive Learning.” Journal of Education 115 (22): 656–657.
Campbell, Jennifer, Diane Horton, Michelle Craig, and Paul Gries. 2014. “Evaluating an inverted CS1.” Proceedings of the
45th ACM Technical Symposium on Computer Science Education, Atlanta, GA.
Carlisle, Martin C. 2010. “Using YouTube to Enhance Student Class Preparation in an Introductory Java Course.”
Proceedings of the 41st ACM Technical Symposium on Computer Science Education, Milwaukee, WI.
Chi, Michelene T. H. 2009. “Active-constructive-interactive: A Conceptual Framework for Differentiating Learning
Activities.” Topics in Cognitive Science 1 (1): 73–105. doi:10.1111/j.1756-8765.2008.01005.x.
Chi, Michelene T. H., Nicholas De Leeuw, Mei-Hung Chiu, and Christian Lavancher. 1994. “Eliciting Self-explanations
Improves Understanding.” Cognitive Science 18 (3): 439–477. doi:10.1207/s15516709cog1803_3.
Chin, Craig Anthony. 2014. “Evaluation of a Flipped Classroom Implementation of Data Communications Course:
Challenges, Insights and Suggestions.” Accessed April 24, 2015. http://www.spsu.edu/cte/publications/
publications2014/sotl_2014_chin.pdf.
Cochran, W. G., and S. Paul Chambers. 1965. “The Planning of Observational Studies of Human Populations.” Journal of the
Royal Statistical Society 128 (2): 234–266. doi:10.2307/2344179.
Cottrell, David M., and Reid A. Robison. 2003. “Case 4: Blended Learning in an Accounting Course.” Quarterly Review of
Distance Education 4 (3): 261–69.
Davies, Randall S., Douglas L. Dean, and Nick Ball. 2013. “Flipping the Classroom and Instructional Technology Integration
in a College-level Information Systems Spreadsheet Course.” Educational Technology Research and Development 61 (4):
563–580. doi:10.1007/s11423-013-9305-6.
Day, Jason, and Jim Foley. 2006. “Evaluating Web Lectures: A Case Study from HCI.” CHI ‘06 Extended Abstracts on Human
Factors in Computing Systems, Montreal, Quebec.
Dollár, Anna, and Paul Steif. 2009. “Web-based Statics Course Used in An Inverted Classroom.” Proceedings of the
American Society for Engineering Education Annual Conference & Exposition, Austin, TX.
Dowling, Carlin, Jayne M. Godfrey, and Nikole Gyles. 2003. “Do Hybrid Flexible Delivery Teaching Methods Improve
Accounting Students’ Learning Outcomes?” Accounting Education 12 (4): 373–391. doi:10.1080/0963928032000
154512.
Enfield, Jacob. 2013. “Looking at the Impact of the Flipped Classroom Model of Instruction on Undergraduate Multimedia
Students at CSUN.” TechTrends 57 (6): 14–27. doi:10.1007/s11528-013-0698-1.
Faust, Jennifer L., and Donald R. Paulson. 1998. “Active Learning in the College Classroom.” Journal on Excellence in College
Teaching 9 (2): 3–24.
Ferreri, Stefanie P., and Shanna K. O’Connor. 2013. “Redesign of a Large Lecture Course into a Small-group Learning
Course.” American Journal of Pharmaceutical Education 77 (1): 13. doi:10.5688/ajpe77113.
Fiorella, Logan, and Richard E. Mayer. 2015. Learning as a Generative Activity: Eight Learning Strategies That Promote
Understanding. Cambridge: Cambridge University Press.
Foertsch, Julie, Gregory Moses, John Strikwerda, and Mike Litzkow. 2002. “Reversing the Lecture/Homework Paradigm
Using eTEACH® Web-based Streaming Video Software.” Journal of Engineering Education 91 (3): 267–274. doi:10.
1002/j.2168-9830.2002.tb00703.x.
Fraga, Lucretia M., and Janis Harmon. 2014. “The Flipped Classroom Model of Learning in Higher Education: An
Investigation of Preservice Teachers’ Perspectives and Achievement.” Journal of Digital Learning in Teacher
Education 31 (1): 18–27. doi:10.1080/21532974.2014.967420.
Freeman, Scott, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat
Wenderoth. 2014. “Active Learning Increases Student Performance in Science, Engineering, and Mathematics.”
Proceedings of the National Academy of Sciences 111 (23): 8410–8415. doi:10.1073/pnas.1319030111.
Gannod, Gerald C., Janet E. Burge, and Michael T. Helmick. 2008. “Using the Inverted Classroom to Teach Software
Engineering.” Proceedings of the 30th international conference on Software engineering, Leipzig, Germany.
Garrison, D. Randy, and Heather Kanuka. 2004. “Blended Learning: Uncovering its Transformative Potential in Higher
Education.” The Internet and Higher Education 7 (2): 95–105. doi:10.1016/j.iheduc.2004.02.001.
Gaughan, Judy E. 2014. “The Flipped Classroom in World History.” History Teacher 47 (2): 221–244.
Gehringer, Edward F., and Barry W. Peddycord. 2013. “The Inverted-lecture Model: A Case Study in Computer
Architecture.” Proceeding of the 44th ACM Technical Symposium on Computer Science Education, Denver, CO.
Giannakos, Michail N., John Krogstie, and Nikos Chrisochoides. 2014. “Reviewing the Flipped Classroom Research:
Reflections for Computer Science Education.” Proceedings of the Computer Science Education Research
Conference, Berlin.
Hake, Richard R. 1998. “Interactive-engagement Versus Traditional Methods: A Six-thousand-Student Survey of
Mechanics Test Data for Introductory Physics Courses.” American Journal of Physics 66 (1): 64–74. doi:10.1119/1.18809.
Hamdan, Noora, Patrick McKnight, Katherine McKnight, and Kari M. Arfstrom. 2013. “A White Paper Based on the
Literature Review Titled A Review of Flipped Learning.” Accessed November 15. http://www.flippedlearning.org/
research.
Harrison, David J. 2015. “Assessing Experiences with Online Educational Videos: Converting Multiple Constructed
Responses to Quantifiable Data.” The International Review of Research in Open and Distributed Learning 16 (1): 168–192.
Herold, Michael J., Thomas D. Lynch, Rajiv Ramnath, and Jayashree Ramanathan. 2012. “Student and Instructor
Experiences in the Inverted Classroom.” Frontiers in Education Conference (FIE). doi:10.1109/FIE.2012.6462428.
Herzog, Serge. 2014. “The Propensity Score Analytical Framework: An Overview and Institutional Research Example.” New
Directions for Institutional Research 2014 (161): 21–40. doi:10.1002/ir.20065.
Kim, Min Kyu, So Mi Kim, Otto Khera, and Joan Getman. 2014. “The Experience of Three Flipped Classrooms in an Urban
University: An Exploration of Design Principles.” The Internet and Higher Education 22: 37–50. doi:10.1016/j.iheduc.
2014.04.003.
Kim, Sang-Hong, Nam-Hun Park, and Kil-Hong Joo. 2014. “Effects of Flipped Classroom Based on Smart Learning on Self-
directed and Collaborative Learning.” International Journal of Control & Automation 7 (12): 69–80. doi:10.14257/ijca.
2014.7.12.07.
Lage, Maureen J., Glenn J. Platt, and Michael Treglia. 2000. “Inverting the Classroom: A Gateway to Creating an Inclusive
Learning Environment.” The Journal of Economic Education 31 (1): 30–43. doi:10.1080/00220480009596759.
Largent, David L. 2013. “Flipping a Large CS0 Course: An Experience Report about Exploring the Use of Video, Clickers and
Active Learning.” Journal of Computing Sciences in Colleges 29 (1): 84–91.
Larson, Stephen, and Junko Yamamoto. 2013. “Flipping the College Spreadsheet Skills Classroom: Initial Empirical
Results.” Journal of Emerging Trends in Computing and Information Sciences 4 (10): 751–758.
Linden, Michele, and Merlin C. Wittrock. 1981. “The Teaching of Reading Comprehension According to the Model of
Generative Learning.” Reading Research Quarterly 17 (1): 44–57. doi:10.2307/747248.
Lockwood, Kate, and Rachel Esselstein. 2013. “The Inverted Classroom and the CS Curriculum.” Proceeding of the 44th
ACM technical symposium on Computer science education, Denver, CO, March.
Love, Betty, Angie Hodge, Neal Grandgenett, and Andrew W. Swift. 2014. “Student Learning and Perceptions in a Flipped
Linear Algebra Course.” International Journal of Mathematical Education in Science and Technology 45 (3): 317–324.
doi:10.1080/0020739X.2013.822582.
Mason, G. S., T. R. Shuman, and K. E. Cook. 2013. “Comparing the Effectiveness of an Inverted Classroom to a Traditional
Classroom in an Upper-division Engineering Course.” IEEE Transactions on Education 56 (4): 430–435. doi:10.1109/TE.
2013.2249066.
McCluskey, Stuart A., Keyvan Karkouti, Duminda Wijeysundera, Leonid Minkovich, Gordon Tait, and Scott W. Beattie.
2013. “Hyperchloremia after Noncardiac Surgery Is Independently Associated with Increased Morbidity and
Mortality: A Propensity-matched Cohort Study.” Anesthesia & Analgesia 117 (2): 412–421. doi:10.1213/ANE.
0b013e318293d81e.
McCray, Gordon E. 2000. “The Hybrid Course: Merging On-line Instruction and the Traditional Classroom.” Information
Technology and Management 1 (4): 307–327. doi:10.1023/A:1019189412115.
McGivney-Burelle, Jean, and Fei Xue. 2013. “Flipping Calculus.” PRIMUS 23 (5): 477–486. doi:10.1080/10511970.2012.
757571.
McLaughlin, Jacqueline E., LaToya M. Griffin, Denise A. Esserman, Christopher A. Davidson, Dylan M. Glatt, Mary T. Roth,
Nastaran Gharkholonarehe, and Russell J. Mumper. 2013. “Pharmacy Student Engagement, Performance, and
Perception in a Flipped Satellite Classroom.” American Journal of Pharmaceutical Education 77 (9): 1–8. doi:10.5688/
ajpe779196.
Means, Barbara, Yukie Toyama, Robert Murphy, Marianne Bakia, and Karla Jones. 2010. Evaluation of Evidence-based
Practices in Online Learning: A Meta-analysis and Review of Online Learning Studies. Washington, DC: U.S.
Department of Education.
Moffett, Jenny, and Aileen C. Mill. 2014. “Evaluation of the Flipped Classroom Approach in a Veterinary Professional Skills
Course.” Advances in Medical Education and Practice 5: 415–425. doi:10.2147/AMEP.S70160.
Moravec, Marin, Adrienne Williams, Nancy Aguilar-Roca, and Diane K. O’Dowd. 2010. “Learn Before Lecture: A Strategy
That Improves Learning Outcomes in a Large Introductory Biology Class.” CBE-Life Sciences Education 9 (4): 473–481.
doi:10.1187/cbe.10-04-0063.
Moreno, Roxana, Richard E. Mayer, Hiller A. Spires, and James C. Lester. 2001. “The Case for Social Agency in Computer-
based Teaching: Do Students Learn More Deeply When They Interact with Animated Pedagogical Agents?” Cognition
and Instruction 19 (2): 177–213. doi:10.1207/S1532690XCI1902_02.
Nikitovic, M., W. P. Wodchis, M. D. Krahn, and S. M. Cadarette. 2013. “Direct Health-care Costs Attributed to Hip Fractures
among Seniors: A Matched Cohort Study.” Osteoporosis International 24 (2): 659–669. doi:10.1007/s00198-012-2034-6.
Pascarella, Ernest T., and Patrick T. Terenzini. 2005. How College Affects Students. Edited by Kenneth A. Feldman. Vol. 2.
San Francisco, CA: Jossey-Bass.
Pattanayak, Cassandra W., Donald B. Rubin, and Elizabeth R. Zell. 2011. “Propensity Score Methods for Creating Covariate
Balance in Observational Studies.” Revista Española de Cardiología (English Version) 64 (10): 897–903.
Pierce, Richard, and Jeremy Fox. 2012. “Vodcasts and Active-learning Exercises in a ‘Flipped Classroom’ Model of a Renal
Pharmacotherapy Module.” American Journal of Pharmaceutical Education 76 (10): 1–5. doi:10.5688/ajpe7610196.
Redekopp, Mark W., and Gisele Ragusa. 2013. “Evaluating Flipped Classroom Strategies and Tools for Computer
Engineering.” Proceedings of ASEE Annual Conference. Accessed April 24, 2015. http://www.asee.org/public/
conferences/20/papers/7063/download.
Reynolds, C. Lockwood, and Stephen L. DesJardins. 2009. “The Use of Matching Methods in Higher Education Research:
Answering Whether Attendance at a 2-Year Institution Results in Differences in Educational Attainment.” In Higher
Education: Handbook of Theory and Research, edited by John C Smart, 47–97. Dordrecht: Springer. doi:10.1007/978-
1-4020-9628-0_2
Rieber, Lloyd P. 1996. “Seriously Considering Play: Designing Interactive Learning Environments Based on the Blending of
Microworlds, Simulations, and Games.” Educational Technology Research and Development 44 (2): 43–58. doi:10.1007/
BF02300540.
Ronchetti, Marco. 2010. “Using Video Lectures to Make Teaching More Interactive.” International Journal of Emerging
Technologies in Learning (iJET) 5 (2): 45–48. doi:10.3991/ijet.v5i2.1156.
Scaife, Michael, Yvonne Rogers, Frances Aldrich, and Matt Davies. 1997. “Designing for or Designing with? Informant
Design for Interactive Learning Environments.” Proceedings of the ACM SIGCHI Conference on Human Factors in
Computing Systems, Atlanta, GA, March.
Schwan, Stephan, and Roland Riempp. 2004. “The Cognitive Benefits of Interactive Videos: Learning to tie Nautical Knots.”
Learning and Instruction 14 (3): 293–305. doi:10.1016/j.learninstruc.2004.06.005.
Stone, Bethany B. 2012. “Flip Your Classroom to Increase Active Learning and Student Engagement.” Proceedings from
28th Annual Conference on Distance Teaching & Learning, Madison, WI. Accessed January 10, 2015. http://www.uwex.
edu/disted/conference/Resource_library/proceedings/56511_2012.pdf.
Stone, Jeffrey A., and Tricia K. Clark. 2015. “Experiences with a Hybrid CS1 for Non-majors.” Journal of Computing Sciences
in Colleges 30 (3): 47–53.
Strayer, Jeremy F. 2012. “How Learning in an Inverted Classroom Influences Cooperation, Innovation and Task
Orientation.” Learning Environments Research 15 (2): 171–193. doi:10.1007/s10984-012-9108-4.
Talbert, Robert. 2014. “Inverting the Linear Algebra Classroom.” PRIMUS 24 (5): 361–374. doi:10.1080/10511970.2014.
883457.
Toto, Roxanne, and Hien Nguyen. 2009. “Flipping the Work Design in an Industrial Engineering Course.” Frontiers in
Education Conference, 2009. FIE ‘09. 39th IEEE, 18–21 October.
Touchton, Michael. 2015. “Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a
Quasi-experiment.” Journal of Political Science Education 11 (1): 28–44. doi:10.1080/15512169.2014.985105.
UMBCtube. 2009. “Confessions of a Converted Lecturer: Eric Mazur.” YouTube video, November.
Utts, Jessica, Barbara Sommer, Curt Acredolo, Michael W. Maher, and Harry R. Matthews. 2003. “A Study Comparing
Traditional and Hybrid Internet-based Instruction in Introductory Statistics Classes.” Journal of Statistics Education
11 (3): 171–173.
Warter-Perez, Nancy, and Jianyu Dong. 2012. “Flipping the Classroom: How to Embed Inquiry and Design Projects into a
Digital Engineering Lecture.” Proceedings of the 2012 ASEE PSW Section Conference, April. Accessed September
5. http://aseepsw2012.calpoly.edu/site_media/uploads/proceedings/papers/10B_35_ASEE_PSW_2012_Warter-Perez.
pdf.
Weinstein, Claire E., and Richard E. Mayer. 1986. “The Teaching of Learning Strategies.” In Handbook of Research on
Teaching, edited by Merlin C. Wittrock, 315–327. New York: Macmillan.
Wilkie, Kay. 2004. “What Is Problem-based Learning?” In Chronic Wound Care: A Problem-based Learning Approach, edited by
Moya Morrison, Liza Gretchen, and Kay Wilkie, 12–23. Edinburgh: Mosby.
Wilson, Stephanie Gray. 2013. “The Flipped Class: A Method to Address the Challenges of an Undergraduate Statistics
Course.” Teaching of Psychology 40 (3): 193–199. doi:10.1177/0098628313487461.
Wittrock, Merlin C. 1978. “The Cognitive Movement in Instruction.” Educational Psychologist 13 (1): 15–29. doi:10.1080/
00461527809529192.
Wittrock, Merlin C., and Kathryn Alesandrini. 1990. “Generation of Summaries and Analogies and Analytic and Holistic
Abilities.” American Educational Research Journal 27 (3): 489–502. doi:10.3102/00028312027003489.
