
Journal Pre-proof

The polarizing effect of the online flipped classroom

Christian Stöhr, Christophe Demazière, Tom Adawi

PII: S0360-1315(19)30339-2
DOI: https://doi.org/10.1016/j.compedu.2019.103789
Reference: CAE 103789

To appear in: Computers & Education

Received Date: 2 September 2019
Revised Date: 5 December 2019
Accepted Date: 21 December 2019

Please cite this article as: Stöhr C., Demazière C. & Adawi T., The polarizing effect of the online flipped
classroom, Computers & Education (2020), doi: https://doi.org/10.1016/j.compedu.2019.103789.

This is a PDF file of an article that has undergone enhancements after acceptance, such as the addition
of a cover page and metadata, and formatting for readability, but it is not yet the definitive version of
record. This version will undergo additional copyediting, typesetting and review before it is published
in its final form, but we are providing this version to give early visibility of the article. Please note that,
during the production process, errors may be discovered which could affect the content, and all legal
disclaimers that apply to the journal pertain.

© 2019 Published by Elsevier Ltd.


The polarizing effect of the online flipped classroom

Christian Stöhr (a), Christophe Demazière (b), Tom Adawi (c)

(a) Department of Communication and Learning in Science (CLS)
Division of Engineering Education Research
Chalmers University of Technology
SE-41296 Gothenburg
Sweden
christian.stohr@chalmers.se

(b) Department of Physics
Division of Subatomic and Plasma Physics
Chalmers University of Technology
SE-41296 Gothenburg
Sweden
demaz@chalmers.se

(c) Department of Communication and Learning in Science (CLS)
Division of Engineering Education Research
Chalmers University of Technology
SE-41296 Gothenburg
Sweden
tom.adawi@chalmers.se

Corresponding author:
Christian Stöhr
Department of Communication and Learning in Science (CLS)
Division of Engineering Education Research
Chalmers University of Technology
SE-41296 Gothenburg
Sweden
christian.stohr@chalmers.se

Declarations of interest: none


The polarizing effect of the online flipped classroom

Abstract

Against the background of the burgeoning use of online learning in higher education, and more recently the
online flipped classroom, this study sets out to evaluate the efficacy of the online flipped classroom. To this end,
a campus-based course in applied physics for master and doctoral students was transformed into an online
flipped course. We analyse how this transformation and student participation in specific asynchronous and
synchronous learning activities affect performance for the online flipped format. Results reveal that, while there
was no statistically significant difference in average performance between the campus-based and online flipped
format, the online flipped format led to a significantly larger spread—a polarization—in performance. Further,
drawing on transactional distance theory, we attribute this polarization to a higher oscillation in transactional
distance for the online flipped format. This link between transactional distance and performance constitutes the
main theoretical contribution of this paper. Our findings speak to the importance of analysing shifts in
transactional distance across learning activities to better understand their potential to bolster student learning.
Future research would do well to focus on how students experience such shifts in transactional distance, and
online instructors need to consider how to scaffold students during these shifts.¹

Keywords: Distance education and online learning, Teaching/learning strategies, Post-secondary education, Evaluation methodologies

¹ Transactional distance is abbreviated as TD; transactional distance theory as TDT; structure as S; dialogue as D; learner autonomy as LA; questions and answers as Q&A; standard deviation as SD; and information technology as IT.

1. Introduction

The extent and importance of online or web-based learning in higher education has increased tremendously in
the last decade, triggered by new educational technologies and pedagogical approaches. In the US, for example,
6.7 million students were enrolled in distance education courses in 2017, with almost half of them taking their
courses exclusively online (U.S. Department of Education, National Center for Education Statistics, 2019). The
appearance and continuous growth of Massive Open Online Courses (MOOCs) enable—in principle—anyone with an internet connection to access learning materials from top universities and stress the relevance of the digital space for lifelong learning and skill development (Gašević, Kovanovic, Joksimovic, & Siemens, 2014).

Online learning can be defined as “a form of distance education where technology mediates the learning
process, teaching is delivered completely using the Internet, and students and instructors are not required to be
available at the same time and place” (Siemens, Gašević, & Dawson, 2015, p. 100). Existing evidence suggests
that online learning has the potential to be at least as effective as traditional learning (Nguyen, 2015; Siemens et al., 2015). Advantages compared to traditional classroom learning include wider reach and ease of use, increased flexibility in terms of location and time, the possibility to interact with rich multimedia resources, and reduced costs in terms of travel and time away from home (Chen, Ko, Kinshuk, & Lin,
2005). However, new educational technologies and pedagogies do not lead to improved learning per se: both
have to be examined in context, as conditions and ways of implementation impact learning outcomes (Stöhr &
Adawi, 2018).

Online learning activities can be distinguished by their mode of delivery (Hrastinski, 2008). In synchronous
online learning, students and teachers meet in real time, for example via videoconferencing or chats, facilitated
via tools such as Adobe Connect, Google Hangouts or Skype. In contrast, asynchronous online learning does not
require students and teachers to be online at the same time, offering more flexibility in terms of when to access
and interact with an activity. Asynchronous online learning activities can include, for example, video lectures
and automatically graded online assignments, but also communicative activities such as discussion forums or
email.

Traditional online courses have almost exclusively focused on asynchronous activities since they enable increased flexibility. However, as Hrastinski (2008), for example, shows, too heavy a reliance on asynchronous
activities can lead to feelings of isolation and can hinder collaborative learning. They typically also exclude the
opportunity for learners to get feedback on immediate questions that arise while interacting with the learning
material. Synchronous learning activities, on the other hand, allow for all this and contribute to learner
motivation and the creation of a sense of community (e.g. Martin & Parker, 2014; Park & Bonk, 2007; Lietzau
& Mann, 2009). Thus, asynchronous and synchronous activities are both important as they serve different
purposes and can complement each other: “Synchronous e-learning increases arousal and motivation, while
asynchronous e-learning increases the ability to process information” (Hrastinski, 2008, p. 54).

However, previous studies have mainly focused on asynchronous online learning, rather than synchronous or
mixed modes of online learning (Siemens et al., 2015; Hrastinski, 2008). Thus, how to effectively combine synchronous and asynchronous online learning to bolster student learning remains an open question (Oztok, Zingaro, Brett, & Hewitt, 2013). Further, there is a paucity of research examining how specific
learning activities support or hinder student learning in online environments (Nguyen, 2015).

One promising pedagogical approach for combining asynchronous and synchronous online learning is the online
flipped classroom model. As in the traditional flipped classroom approach (e.g. Tucker, 2012; Bergmann & Sams, 2012), students are encouraged to watch video lectures (often augmented with quizzes) at home in preparation for joint meetings. Unlike the original flipped classroom model, however, students and teachers do not meet physically, but online. The time spent together is dedicated to active and collaborative learning (rather
than lecturing). Chen, Wang, Kinshuk, & Chen (2014) lay out what they call the “holistic flipped classroom”,
stressing the importance of including and monitoring different synchronous and asynchronous learning
activities.
Although the online flipped classroom seems to be gaining traction in higher education, very little research has
focused on evaluating this pedagogical approach (e.g. Chen et al., 2014). That said, in the realm of flipped
classroom research in STEM education, most studies have reported a positive impact on student learning (Barba, Kaw, & Le Doux, 2016). Two recent meta-analyses, one in mathematics education (Lo, Hew, & Chen, 2017) and
one in engineering education (Lo & Hew, 2019), found a small—but significant—positive effect on student
performance in the flipped format. One of these also notes that, while video-lectures appear to be the dominating
asynchronous instructional activity as preparation for class, major synchronous activities included quizzes at the
beginning of the session, reviews of the pre-class materials, individual and small-group activities such as peer
instruction, and/or lectures to deepen or introduce new material (Lo & Hew, 2019, p. 529). A final point worthy
of note in relation to flipped classroom research is that there is a lack of studies building on sound theoretical
frameworks (Karabulut‐Ilgu, Jaramillo Cherrez, & Jahren, 2018).

Against this backdrop, the present study sets out to evaluate the efficacy of the online flipped classroom through
the lens of transactional distance theory (Moore, 1993). To this end, a campus-based course in applied physics
was transformed into an online flipped course, and we seek to answer the following research questions:

1. What effect does the change from a campus-based format to an online flipped format have on student
performance?

2. How is participation in specific asynchronous and synchronous learning activities related to student
performance in the online flipped format?

3. How is the submission of administrative and content-related questions related to student performance in the online flipped format?

The rest of the paper is structured as follows. In the next section, we introduce key constructs from transactional
distance theory and we draw on these to highlight salient aspects of the course reform. We then describe the
study design, including methodology, and methods for data collection and analysis. This is followed by the
results and their discussion. We end the paper by drawing out some implications for researchers and teachers
with an interest in the online flipped classroom.

2. Theoretical framework and course reform

2.1 Transactional distance theory


This study draws on transactional distance theory (Moore, 1993), or TDT for short, as a theoretical lens to
explore the efficacy of the online flipped classroom. Although TDT has been criticized on various grounds
(Gorsky & Caspi, 2005), it remains “one of the most appealing and well-known theories of distance education”
(Garrison, 2000, p. 9). Briefly, TDT asserts that the salient distance in distance education is transactional—
rather than spatial—where transactional distance, or TD for short, refers to the “psychological and
communication space to be crossed, a space for potential misunderstanding between the inputs of instructor and
those of the learner” (Moore 1993, p. 22). Moore (1984) explained the importance of the notion of TD for
teaching and learning in the following lucid way:

Transactional distance is the extent to which the teacher manages to successfully engage the students in their learning. If
students are disengaged and not stimulated into being active learners, there can be a vast transactional distance, whether
the students are under the teacher’s nose or on the other side of the city. But if a teacher, whether online or on campus,
can establish meaningful educational opportunities, with the right degree of challenge and relevance, and can give
students a feeling of responsibility for their own learning and a commitment to this process, then the transactional gap
shrinks and no one feels remote from each other or from the source of learning (p. 6).

Consequently, teachers should strive to minimize TD to bolster student learning (Benton, Li, Gross, Pallett, &
Webster, 2013). To this end, Moore (1993) proposes that teachers need to consider three variables, which
together determine the level of TD for an educational program or specific learning activity:

• Structure (S), which refers to “the rigidity or flexibility of the program’s educational objectives,
teaching strategies, and evaluation methods. It describes the extent to which an educational program
can accommodate or be responsive to each learner’s individual needs” (Moore, 1993, p. 26). A highly
structured learning activity, such as a traditional lecture or an online video, leaves little room for
meeting the needs of individual students. Accordingly, when structure increases, so does TD.

• Dialogue (D), which refers to the quality of interaction between the teacher and the learner, where each
party contributes and listens respectfully and actively in order to improve the understanding of the
student (Moore, 1993). Accordingly, there is an inverse relationship between dialogue and TD. In fact,
Moore (1993) noted that “one of the major determinants of the extent to which transactional distance
will be overcome is whether dialogue between learners and instructors is possible, and the extent to
which it is achieved” (p. 26). More recent research has also suggested to include the quantity of
dialogue (e.g. Huang, Chandra, DePaolo, Cribbs, & Simmons, 2015) and learner-learner interaction in
this construct (e.g. Benson & Samarawickrema, 2009).

• Learner autonomy (LA), which refers to “the extent to which in the teaching/learning relationship, it is
the learner rather than the teacher who determines the goals, the learning experiences, and the
evaluation decisions of the learning program” (Moore, 1993, p. 31). Unlike structure and dialogue, LA is not teacher-controlled and is difficult to manipulate (Huang et al., 2015). LA and TD are directly
proportional: “the greater the structure and the lower the dialogue in a program the more autonomy the
learner has to exercise” (Moore, 1993, p. 27).

As Huang et al. (2015) point out, it is worth noting that Moore has suggested that structure and dialogue are
inversely proportional (Moore, 1991), where high levels of structure combined with very little dialogue result in
higher TD. But he also acknowledged the possibility of learning environments with high structure and high
dialogue (e.g. correspondence-based courses) or low structure and low dialogue (e.g. independent self-study),
where the first can reduce TD, and too much structure tends to be preferable to too little (e.g. Moore, 1993).
However, the acceptable amount of TD is highly dependent on the learners’ level of autonomy, for example,
when deciding whether to use the provided educational resources (such as textbooks or online videos), and also
when, how, and to what extent to use them.

In the next section, we draw on TDT to highlight salient aspects of the course reform, resulting in a theoretical
prediction of the TD in the course. It should be noted that the subjectively-experienced TD for individual
students, which is typically measured by student self-reports, might differ from our hypothetical TD and can be
affected by other factors than the learning design only, such as culture or socio-economic level (Giossos,
Koutsouba, Lionarakis, & Skavantzos, 2009). Like some other researchers (e.g. Murphy & Cifuentes, 2001;
Wikeley & Muschamp, 2004), we focus our analysis on structure and dialogue as key design factors affecting
TD. Learner autonomy is construed more as a characteristic of the learner, though some learning activities might
require higher levels of autonomy than others in order to contribute to student learning.
2.2 Course reform from a TDT perspective

Our analysis is based on the course Modelling of Nuclear Reactors, offered to master and PhD students at
Chalmers University of Technology. The course is geared towards helping students to comprehend and apply
the solution methodology of the computer codes used by the nuclear community for simulating the behaviour of
nuclear reactors. The main emphasis of the course is to derive the typical algorithms, approximations, and the
corresponding limitations, so that the students can apply the codes with confidence in situations that fall within
the validity of the algorithms. Over a period of six iterations, the course design has gradually developed from a
campus-based format to a purely web-based format, starting in 2009 (see Table 1).

Table 1. Course development and participant numbers in the course “Modelling of Nuclear Reactors”. Iterations 2009/10–2011/12 used the campus-based format, 2012/13 was a transition, and 2013/14–2015/16 used the online flipped format.

| Course iteration | 2009/10 | 2010/11 | 2011/12 | 2012/13 | 2013/14 | 2014/15 | 2015/16 |
|---|---|---|---|---|---|---|---|
| Home assignments | 3 | 3 | 3 | 3 | 4 | 4 | 4 |
| Campus lectures | x | x | x | | | | |
| Webcasts with prompts | | | x* | x | x | x | x |
| Campus tutorials | x | x | x | | | | |
| Online tutorials | | | | x | x | x | x |
| Online quizzes | | | | | x | x | x |
| Wrap-up sessions | | | | | x | x | x |
| Discussion forum | | | | | x | x | x |
| Students completing all home assignments | 13 | 7 | 15 | 6 | 4 | 7 | 6 |

*recorded in-class sessions made available to students

2.2.1 The campus-based format

During the first three iterations of the course, it was delivered using a fairly traditional campus-based format,
consisting of:

1. Face-to-face campus lectures, including possibilities for Q&A;
2. Face-to-face campus tutorials, focusing on providing help for solving the home assignments; and
3. Individually solved home assignments, with the option to send questions to the teacher via email.

Students had to participate in at least 75% of the campus lectures and tutorials and were also asked to submit short summaries of the lectures (in addition to the home assignments), which were peer reviewed. In the three home assignments, students applied some of the algorithms presented during the lectures to solve practical problems. To do so, they had to develop their own code and write corresponding reports. The assignments were assessed at the end of the course and formed the basis for passing the course. Students had the opportunity to engage in a dialogue with the teacher at the end of each lecture/tutorial, or via email if there were questions regarding the home assignments. In the third year, the in-class sessions were also recorded and made available to the students after class, marking a first change in the design. All course iterations in the campus-based format were taught by the same teacher, who was also responsible for the grading.

From a TDT perspective, both the synchronous campus lectures/tutorials and the asynchronous home
assignments are considered to be of high structure, as they follow a tightly controlled sequence with predetermined content and a rigid strategy (see Table 2). Some opportunities for dialogue were—in theory—
created through the Q&A possibilities, the fact that the teacher and students met face-to-face for the
lectures/tutorials, and the email option in the home assignments. Thus, highly structured, uni-directional
activities were complemented with optional dialogue instances. The class met face-to-face, which tends to
reduce TD. Both learning activities and the course design as a whole are therefore considered to have a medium
level of TD.
The required level of learner autonomy is considered to be relatively low in this learning design since learners met face-to-face with the instructor and had opportunities for dialogue. However, they still had to take some control over their learning, for example in how to solve the home assignments. The addition of video recordings of the lectures/tutorials in the third year can be interpreted as adding another highly structured, asynchronous component that nevertheless requires a higher level of autonomy with regard to how to access and revisit the content of the lectures/tutorials. However, as students still met in class, the TD of the third iteration of the course is considered to be medium.

While the teacher observed that the students rarely asked questions during the campus lectures and tutorials, he received comparatively many questions regarding the home assignments via email. Since completing the home assignments was required to pass the course, it is not surprising that students chose to engage with this form of interaction more often.

Table 2. Summary of learning activities and their TD from a design perspective

| Learning activity | Structure | Dialogue | Learner autonomy | Transactional distance |
|---|---|---|---|---|
| Home assignments + emails | high | medium | medium | medium |
| Campus lectures + Q&A | high | medium | medium | medium |
| Campus tutorials + Q&A | medium | medium | medium | medium |
| Webcasts with feedback prompts | high | very low | high | very high |
| Online tutorials (lecture + Q&A) | high | medium | medium | medium |
| Online quizzes | high | low | high | high |
| Wrap-up sessions | medium | high | low | low |
| Discussion forum | low | high | low | low |

Over subsequent iterations, the course was transformed from a campus-based format to a wholly online format. The decision to transform the course was mainly based on the strong interest in the video lectures/tutorials and the objective of enabling distance learners to participate.

2.2.2 The online flipped format

For the four following consecutive iterations, the course was delivered entirely online without any face-to-face meetings (see Table 1). Iteration four was a transition period, in which the content of the previous campus lectures was offered in an asynchronous, web-based format as webcasts. Similar to the campus format, students were also asked to submit short summaries, which were peer reviewed, after completing the webcasts. Learners were encouraged to give feedback on parts of the webcasts that they did not understand or thought were of poor quality. Further, the teacher added tutorials, led by a PhD student and live-broadcast in lecture format with the possibility for Q&A, as an opportunity for synchronous interaction between the students and the teacher. As on campus, the teacher noted that only a few students engaged in this opportunity for dialogue. Although this design already contained some elements of the flipped classroom, particularly on the asynchronous side, there was a lack of interactive group activities.

In terms of TDT, and compared to the campus-based format, the TD increased significantly in the first online iteration of the course (see Table 2). The prior synchronous, face-to-face meetings were replaced by asynchronous webcasts with high structure and little opportunity for dialogue besides the—also asynchronous—feedback prompts. This learning activity is considered to create the highest TD in the whole learning design. The required level of learner autonomy increased as well. Although students gained more flexibility in when and where to watch the webcasts, the teacher had less control over whether and how learners engaged with the webcasts. To limit that autonomy somewhat, the teacher required students to watch at least 75% of the webcasts, which—to some extent—could be checked through the access analytics of the learning management system. The synchronous tutorial sessions were also highly structured, with an element of synchronous interaction, resulting in a medium TD. Overall, the design of the first online iteration has to be considered the design with the highest TD.
In response, additional learning activities were included in the last three iterations of the course. They were intended to increase the opportunity for meaningful dialogue between the teacher and the students, but were not mandatory. These activities included:

• Online quizzes embedded in the webcasts
• Regular synchronous wrap-up sessions
• A discussion forum

As such, the final course design included several learning activities that have been used to flip traditional
courses in STEM education (Lo et al. 2017; Lo & Hew, 2019). The quizzes were designed to test students’
conceptual understanding, enabling students to test themselves, and the results were used as input for the wrap-
up sessions. The wrap-up sessions were added in an attempt to follow a Just-in-Time Teaching approach
(Watkins & Mazur, 2010) and designed to clarify difficult concepts and misinterpretations based on the
feedback that the teacher received through the different learning activities. After a short summary lecture, this
highly interactive session mainly consisted of discussions between the students and the teacher (acting as a
coach), followed by a final round-up explanation by the teacher. Finally, a discussion forum was created to
enable students to interact and discuss the home assignments with each other. Sometimes, the teacher would
post prompts to encourage interactions. The three home assignments were kept during all seven iterations of the
course. However, a fourth home assignment was added after the fourth iteration as the result of course
evaluations, in which students expressed a wish to have an additional home assignment on the last chapter of the
course.

From a TDT perspective, the quizzes were another highly structured asynchronous activity, generating some
(also asynchronous) dialogue through the input to the wrap-up sessions. The synchronous wrap-up sessions and
asynchronous discussion forum were learning activities with little structure and a higher degree of flexibility. As
designed, they were also supposed to create high levels of dialogue and thus result in low TD. Learner
autonomy was also considered to be low in both activities, due to the adaptability and amount of possible
interaction with the teacher and between students. However, as none of these activities were mandatory, students had to decide whether or not to engage with them at all.

Based on this analysis of the course design, drawing on TDT, we can conclude that the final design consisted of a mixture of activities with high TD (e.g. the webcasts) and low TD (e.g. the wrap-up sessions). The overall TD was lowered, but shifted constantly between the different learning activities. Chen et al. (2014) use the term
“oscillation” to refer to these changes or shifts in TD in a flipped classroom format. This oscillation in TD is in
sharp contrast to the initial course design, mainly consisting of learning activities with medium TD. A
comparison of the two pedagogical formats in the light of our research questions will enable us to get a better
understanding of the underlying mechanisms of the different course designs for different students. In particular,
it will be interesting to examine how students utilize learning activities and in what ways that affects the
performance of those students.

3. Study design

3.1 Methodology

We used a longitudinal, quasi-experimental study design, collecting quantitative data from six iterations of the
course as it was transformed from a campus-based format to an online flipped format between 2009 and 2016.
The course iteration in 2012/2013, constituting a transition phase, was excluded from the analysis (see Table 1).

3.2 Data collection

Table 3. Student demographics for the campus-based and online flipped format

| | Campus-based format | Online flipped format | Total |
|---|---|---|---|
| N | 35 | 17 | 52 |
| Female students (% of N) | 17.1 | 11.8 | 15.4 |
| International students (% of N) | 25.7 | 47.1 | 32.7 |
| Master students (% of N) | 94.3 | 58.8 | 82.7 |
| PhD students (% of N) | 5.7 | 41.2 | 17.3 |
We collected quantitative data from 52 students (see Table 3 for a description of the sample) based on home assignments, the students’ attendance in course activities (campus lectures and tutorials, webcasts, online tutorials, wrap-up sessions, and online quizzes), as well as the quantity and content of questions that the students posed during the course. It should be mentioned that, in the campus-based format, attendance at lectures and tutorials was analysed together. Student attendance refers to the extent to which students participated in the different course activities. For the campus-based format (the first three course iterations), our data was limited to attendance at the in-class sessions. For the four web-based course iterations, data was collected from the different online tools, namely Ping Pong (the learning management system the course used), Mediasite (a video hosting platform) and Adobe Connect (a platform for online meetings), which all offer the possibility to monitor students’ activity and performance, even if the level of detail of the available data differs significantly between the three platforms. Although the three platforms collect their data separately, the data was collected at the individual student level. As the same technology was used for all four course iterations, this offers the possibility to perform a cross-correlation analysis between the various data streams. For the last three course iterations, our data consists of:

• The students’ participation in synchronous learning activities (the tutorials and wrap-up sessions)
• The students’ participation in asynchronous learning activities (the webcasts and quizzes)
• The questions the students sent to the teachers.

The monitoring of the synchronous sessions was performed using the data available from Adobe Connect, whereas the monitoring of the webcasts was based on Mediasite. The monitoring of the quizzes was carried out with Ping Pong, whereas the monitoring of the questions sent to the teachers relied on various streams: questions on the Mediasite player, questions sent via Ping Pong, or questions sent directly by e-mail.

In the following we present the different factors we measured and how we operationalized and analysed them.

3.3 Data analysis

3.3.1 Student performance

To measure performance, we used the students’ results on the home assignments, which are available for all six course iterations. In order to perform a comparison that is as faithful as possible between the various course iterations, only the students who completed all home assignments were considered in the analysis. Based on the maximum score of 20 for each assignment, we calculated the average and standard deviation over all students for each course iteration. To test whether there is a statistically significant difference between performance in the campus-based format and the online flipped format, we used the t-test for independent samples to compare the average student performances and Levene’s test for equality of variances to compare the variation of student performance in the two formats. Since Levene’s test showed that variances were not equal between the two formats, we used the t-test version that does not assume equality of variances (Welch’s t-test).
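To make the procedure concrete, the sketch below shows the two tests in Python with SciPy. The score arrays are simulated placeholders (the individual student scores are not published here); only the sequence of calls (Levene's test first, then the unequal-variances t-test) mirrors the analysis described above.

```python
# A minimal sketch of the performance comparison, using simulated
# placeholder scores drawn to resemble the group statistics reported
# in Table 4 (not the actual per-student data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
campus = rng.normal(loc=15.77, scale=2.16, size=35)   # campus-based format
flipped = rng.normal(loc=15.52, scale=3.44, size=17)  # online flipped format

# Levene's test for equality of variances: a significant result indicates
# that the spread of scores differs between the two formats.
levene_F, levene_p = stats.levene(campus, flipped)

# Since the variances differ, compare the means with Welch's t-test,
# i.e. the independent-samples t-test without the equal-variance assumption.
t, p = stats.ttest_ind(campus, flipped, equal_var=False)

print(f"Levene: F = {levene_F:.2f}, p = {levene_p:.3f}")
print(f"Welch t-test: t = {t:.2f}, p = {p:.3f}")
```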

3.3.2 Student attendance in synchronous and asynchronous learning activities

For each synchronous and asynchronous learning activity, we calculated attendance as a percentage of the total instances of that learning activity. We then calculated the average and standard deviation for each course iteration in which this learning activity was applied. The various quizzes given throughout the course were aggregated into four quiz sessions, with each quiz session corresponding to a given chapter of the course, in order to allow a more direct cross-correlation analysis between performance on the home assignments and the quizzes. For selected learning activities, we also calculated the average participation for each of the four sessions (averaged over the three years in which the activity was applied). Similar to Section 3.3.1, we also compared the means and variances of student attendance at campus lectures/tutorials and webcasts using the t-test for independent samples and Levene’s test for equality of variances.

3.3.3 Student questions

This indicator measured the number and kind of questions students sent via different channels. The questions were manually extracted and ordered by topic. After the initial set-up resulted in assignment-related questions only, the objective of the second course reform in the academic year 2013/2014 was to encourage students to pose and discuss questions relating not only to the home assignments but also to the content presented in the webcasts. For our analysis, only those questions that did not deal with the home assignments were extracted. The questions were then categorized either as administrative questions—questions related to administrative or technical course logistics—or non-administrative questions related to the actual course content. It should be noted that, although a forum had been available in the course since 2012/2013, the forum was used exclusively for questions related to the home assignments. Since home assignment-related questions (whether sent to the teachers or collected via the forum) are not considered in the analysis, the forum was thus also excluded from further analysis.

3.3.4 Relationship between different learning activities and student performance

Since the data were tracked individually for each student, it was possible to cross-correlate the number of points obtained on all assignments with attendance at all webcasts, attendance at all tutorials, attendance at all wrap-up sessions, the number of attempts at the quizzes, and the number of content-related and administrative questions sent to the teacher, respectively. The relationship was evaluated using the Pearson product-moment correlation. We defined a correlation as significant if the p-value was equal to or below .05. Further, a linear fit of the data was also carried out, and the goodness of fit was measured by the R² statistic (defined as the ratio of the regression sum of squares to the total sum of squares). The resulting correlation coefficient can be interpreted as an effect size representing the strength of the relationship between two variables. To interpret the size, Cohen (1988) suggested that a correlation coefficient of .10 represents a small correlation, a coefficient of .30 a moderate correlation, and a coefficient of .50 or above a strong correlation.

4. Results

4.1 Student performance for the campus versus online flipped format

As a first step, we compared the results of the home assignments in the campus-based and online flipped format (see Table 4). Students had a mean score of 15.8 (SD_C = 2.2) in the campus-based format, compared to 15.5 (SD_F = 3.4) in the online flipped format. An independent samples t-test showed that the difference in performance between the two groups was not statistically significant, t(22.3) = .27, p = .791.

On the other hand, Levene’s test for equality of variances showed that the difference in variances/SD is statistically significant, F = 12.1, p = .001. Thus, for the campus-based format, the standard deviation is significantly lower than for the online flipped format. In other words, while the results of the home assignments were fairly homogeneous among the students for the campus-based format, we see a much larger variation in student performance for the online flipped format.

Table 4. Test of student performance variance and means for the campus-based and online flipped format

| | N_C | M_C | SD_C | N_F | M_F | SD_F | Levene F | Levene p | t* | df | p |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Performance | 35 | 15.77 | 2.16 | 17 | 15.52 | 3.44 | 12.12 | .001 | .27 | 22.32 | .791 |

*equal variances not assumed

4.2 Attendance in synchronous and asynchronous learning activities

Secondly, we calculated the average student attendance at the different learning activities (see Table 5). The campus-based format only contains the synchronous campus lectures and tutorials, of which students attended on average 85% (SD_C = 11%). Regarding the synchronous parts of the online flipped format, the relative attendance was 79% (SD_F = 23%) at the tutorial sessions and 56% (SD_F = 37%) at the wrap-up sessions. Note the high standard deviation for both activities, indicating a large spread in attendance rates between students. The asynchronous webcasts showed the highest attendance rate: students accessed, on average, 95% (SD_F = 7%) of the webcasts. Finally, students made, on average, 2.7 (SD_F = 1.6) attempts at the quizzes, whether answering correctly or incorrectly. As for the tutorials and wrap-up sessions, there was a large spread between students.

Table 5. Average student attendance in different synchronous and asynchronous learning activities

| Learning activity | N_C | M_C | SD_C | N_F | M_F | SD_F |
|---|---|---|---|---|---|---|
| Campus lecture and tutorial attendance | 35 | 84.97% | 10.61% | | | |
| Webcast attendance | | | | 17 | 94.69% | 6.79% |
| Number of attempts on quizzes | | | | 17 | 2.66 | 1.57 |
| Tutorial attendance | | | | 17 | 79.06% | 23.19% |
| Wrap-up attendance | | | | 17 | 56.35% | 36.76% |

As a next step, we compared attendance at the campus-based lectures/tutorials with attendance at the lectures delivered as webcasts (see Table 6). An independent samples t-test showed that the difference in average attendance (M_C = 85%, M_F = 95%) between the two formats was statistically significant, t(46) = -3.99, p < .001. Levene’s test for equality of variances further showed that the difference in variances/SD was statistically significant, F = 6.45, p = .014. This means that, although both activities had high attendance rates, the average rate of attendance at the campus lectures/tutorials was significantly lower and showed a larger variation in student attendance compared to the webcasts.

Table 6. Test of student attendance variance and means for the campus-based and online flipped format

| | N_C | M_C (%) | SD_C (%) | N_F | M_F (%) | SD_F (%) | Levene F | Levene p | t* | df | p |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Attendance | 35 | 84.97 | 10.61 | 17 | 94.69 | 6.79 | 6.45 | .014 | -3.99 | 46 | < .001 |

*equal variances not assumed

4.3 Cross-correlation analysis of different learning activities with student performance

The results of the cross-correlation analysis between student performance and the different learning activities are presented in Table 7. As can be seen in this table, no linear dependence exists between student performance and webcast attendance (r = -.05, p = .84) or the number of quiz attempts (r = -.05, p = .84). A weak correlation was found between student performance and tutorial attendance (r = .20); however, due to the low number of students, the correlation is not significant (p = .44) and a linear fit does not reproduce the data faithfully (R² = .04). On the other hand, student performance and wrap-up attendance were significantly and strongly correlated (r = .64, p = .006). A linear fit of the data also reproduces, to some extent, the dependence between performance and wrap-up attendance (R² = .41), indicating that student participation in the wrap-up sessions was essential for performing well on the home assignments.

Table 7. Cross-correlation analysis between student performance and different learning activities (N = 17)

| Learning activity | r | p | R² of linear fit |
|---|---|---|---|
| Webcast attendance | -.05 | .84 | .0027 |
| Tutorial attendance | .20 | .44 | .04 |
| Wrap-up attendance | .64 | .006 | .41 |
| Number of attempts on quizzes | -.05 | .84 | .0027 |

4.4 Analysis of student questions sent to the teacher

Table 8 shows the number of content-related and administrative questions per academic year. In the course
iteration 2013/2014, the students posted 14 questions on aspects of the course that were not related to the home
assignments, of which less than a third (29%) were of administrative nature and the rest were content-related. In
the academic year 2014/2015, even more questions (35) were received. However, 94% of those were of
administrative nature, and only two of the questions addressed content aspects of the course. A similar number
of questions (31) was received during the last course iteration. Unlike the previous year, almost half of them
were content-related. In sum, one can state that most of the submitted questions were of administrative nature,
though we observed large variations between years and among the students.

Table 8. Questions sent to the teachers by the students (excluding questions related to home assignments, N = 17)

| Academic year | Total number of questions | Content-related questions | Administrative questions |
|---|---|---|---|
| 2013/2014 | 14 | 10 (71%) | 4 (29%) |
| 2014/2015 | 35 | 2 (6%) | 33 (94%) |
| 2015/2016 | 31 | 15 (48%) | 16 (52%) |

The cross-correlation analysis of the two question types and performance (see Table 9) shows that student performance was moderately to strongly correlated with the number of content-related questions (r = .48, p = .05). An even stronger, but negative, correlation was found for administrative questions (r = -.73, p = .001). Both correlations were significant at least at the .05 level. Thus, the type of questions students sent is strongly related to student performance.

Table 9. Cross-correlation analysis between student performance and questions sent in by students (N = 17)

| Question type | r | p | R² of linear fit |
|---|---|---|---|
| Number of administrative questions | -.73 | .001 | .54 |
| Number of content-related questions | .48 | .05 | .23 |

5. Discussion

The present study set out to evaluate the efficacy of the online flipped classroom, drawing on TDT as a
theoretical lens. The first research question concerned the effect of the transformation from a campus-based
format into an online flipped format on student performance. We found no statistically significant difference in
average performance (as measured by results on home assignments). We are not aware of any previous study
attempting to evaluate the efficacy of the online flipped classroom in a more rigorous way. While our results do
not confirm the documented positive effect of the flipped classroom in STEM education (Karabulut‐Ilgu et al.,
2018; Barba et al., 2016; Lo et al., 2017; Lo & Hew, 2019), they are consistent with a considerable corpus of
research suggesting that, in terms of academic performance, “online education can be at least as effective as
traditional classroom instruction” (Kim & Bonk, 2006, p. 23; Nguyen, 2015).

5.1 The oscillation in TD leads to a polarization in performance

An ancillary analysis revealed a significantly larger spread in performance for the online flipped classroom in
comparison to the campus-based format, suggesting that the online flipped classroom leads to a larger
polarization in performance. This indicates that some students tended to perform better in the online format, while others struggled even more.

How can this polarization in performance be explained? This leads us to the second research question,
concerning the effect of participating in specific synchronous and asynchronous learning activities on student
performance. From a design perspective, and from the perspective of TDT, the campus-based format consisted
of a limited number of learning activities with high structure and some options for dialogue, resulting in a
medium TD (see Table 2). The transition to the online flipped format resulted in a more complex design, with
more learning activities, some of them—in particular the asynchronous ones—showing high levels of TD
(webcasts, quizzes). However, there is a strong shift in TD when students attend the wrap-up sessions, as these
sessions are based on student participation and discussions. Thus, the design shifted from homogeneous TD to
an oscillation in TD.

Empirically, a comparison between the synchronous campus sessions and their asynchronous counterpart, the webcasts, revealed that, although both were mandatory, students watched significantly more of the lectures provided as online videos. This can be explained by the greater ease and flexibility of access that are often seen as the main advantages of asynchronous learning activities, in particular online videos (e.g. Stöhr, Stathakarou, Mueller, Nifakos, & McGrath, 2019). However, we have no information about how intensively students actually engaged with the online lectures. On the other hand, the same is true for the campus lectures/tutorials. One might speculate that disengaged students are somewhat more likely to take away something from the campus lectures/tutorials than from online videos, which would contribute to explaining the polarization effect.
Participation in the non-mandatory activities—the online tutorials, wrap-up sessions and quizzes—was mixed
and particularly low for the wrap-up sessions. A possible explanation for this observation is that the tutorials
were explicitly geared towards helping the students solve the home assignments. So, although an erosion in student attendance is typical of non-mandatory online learning activities (as is most strikingly apparent in Massive Open Online Courses, e.g. Eriksson, Adawi, & Stöhr, 2017), one might expect some students to prioritize attending the tutorials in order to pass the course, thereby neglecting the wrap-up sessions as low-structured learning activities.

The cross-correlation analysis of different learning activities revealed a weak correlation between the
synchronous online tutorials and student performance, which was nevertheless not significant. The
asynchronous webcasts and online quizzes showed no relation at all. On the other hand, students attending the
synchronous wrap-up sessions tended to perform significantly better. Thus, the polarization in performance can
be linked to attendance at the wrap-up sessions. We attribute this to the fact that the wrap-up sessions included a
higher degree of active learning as students engaged in discussions both with each other and with the teacher
about their difficulties, thereby clarifying salient concepts and principles. Indeed, active learning as a key
principle of the flipped classroom has consistently been shown to improve student performance (Freeman et al.,
2014). Interestingly, the correlations between the different learning activities and student performance correspond closely to the amount of TD that we assigned to those activities in the analysis of the learning design.

Our findings suggest that the oscillation in TD across different learning activities leads to a polarization in performance. This surmised link between TD and academic performance is the main theoretical
contribution of this paper to the literature pertaining to online learning in general and the online flipped
classroom in particular.

These theoretical deliberations and empirical findings mesh with those in a study by Chen et al. (2014), who
evaluated the efficacy of using a flipped classroom approach in a course—not a fully online course, though—
geared towards graduate students. They note that “transactional distance changed when the instructor and
students switched between the asynchronous classroom to the synchronous classroom, and that students had
difficulty adapting to such transactional distance change” (p. 26). Moreover, they found “an increase in grade
average for highly motivated students because they could tolerate such change and performed better than non-
motivated students” (p. 26). While our results are similar, their answer to why students cannot tolerate such change is somewhat unidimensional. Their explanation for the polarization is exclusively motivational, without, however, operationalizing and measuring motivation in any form other than a single quote from a student interview. While we did not systematically study student motivation in this paper, we argue that the polarization can also be explained by the higher demands that the online flipped format places on metacognitive and self-regulated learning skills compared to the campus course, including handling the technology, more diverse learning activities, and higher autonomy. Weak students struggled with the complexity of the course design (many course elements; time management; parallel, overlapping, and interdependent learning activities). Thus, motivation is one of several factors in our analysis.

Our analysis of student feedback in the form of the submitted questions (the third research question) supports this suggestion. Most of the questions were administrative in character, and those students who sent more administrative questions to the teacher tended to perform significantly worse on the home assignments. The nature of the administrative questions indicates that the students sending such questions were the ones struggling with the online flipped format. Despite the attempt by the teacher to provide clear instructions for structuring the course work, the flexibility of the online flipped classroom resulted in some students being “lost” and not using the resources in an efficient way. Importantly, these students also appear to be the ones not attending the wrap-up sessions. If the students who struggle with the online flipped format also shy away from these sessions largely because of the discussions, they miss the learning effect that the combination of asynchronous and synchronous activities with oscillating TD is supposed to provide, explaining their lower performance on the home assignments.

In sum, we see two plausible explanations for this somewhat surprising finding. First, students who struggle
with the online flipped format and struggle with keeping up with the course work will probably prioritize the
sessions that are directly linked to the assessment—in this case, the tutorial sessions. Second, due to a lack of
preparation (as a consequence of not being able to use the resources in an efficient way)—and knowing that
active participation is required during the wrap-up sessions—they may decide not to attend these sessions
because (a) they do not want to reveal their lack of preparation to the teacher and the other students, and (b) they
see no point in attending sessions they have not prepared for. These explanations are consistent with prior
research in STEM education, indicating that students still tend to prefer the lecture format over more active
forms of learning (Yadav, Subedi, Lundeberg, & Bunting, 2011) and tend to misjudge the effectiveness of active
learning (Deslauriers, McCarty, Miller, Callaghan, & Kestin, 2019). Wikeley and Muschamp (2004) make a similar observation for the distance education of PhD students, arguing that instead of loosening structure to increase dialogue, as Moore suggests, it might be advisable to tighten the structure in order to build a stronger student expectation of engaging in dialogue.

5.2 Implications for research

The present study and the study by Chen et al. (2014) point to the importance of considering changes in TD in
order to better understand for “whom” the flipped classroom approach “works”, whether in a fully online
environment or not. As such, these studies are consistent with a recent call for using a realist approach to
evaluate the flipped classroom approach (Stöhr & Adawi, 2018), pivoting around the questions listed in Table
10 (adapted from Westhorp, 2014). As Stöhr and Adawi (2018, p. 3) explain:

Innovative approaches to teaching often consist of a mix of learning activities (the flipped classroom is no
exception) and a main focus of realist evaluation is to investigate how different learning activities contribute to
different learning outcomes through specific learning mechanisms. These learning mechanisms thus explain how
or why interventions work (or do not work) – they open the “black box” between learning activities and learning
outcomes. Importantly, learning mechanisms may vary across students and they are sensitive to contextual factors.

Along these lines, we have identified the wrap-up sessions as both a particularly promising and a particularly problematic learning activity in the online flipped classroom, and we have discussed possible learning mechanisms underlying the observed polarization in performance. TDT—despite its limited predictive power—provided a
helpful lens in analysing and interpreting the observed effects. Future research pertaining to the online flipped
classroom would do well to pick up on the analytical agenda proposed in Table 10 and, importantly, employ
qualitative research methods to better understand how changes in TD are experienced by students.

Table 10. An analytical agenda for future research on the (online) flipped classroom (from Stöhr & Adawi,
2018)

• For whom will the intervention work and not work, and why?
• In what contexts will the intervention work and not work, and why?
• What are the main mechanisms by which we expect the intervention to work?
• If the intervention works, what outcomes will we see?

5.3 Implications for teaching

To be sure, online teaching and learning—including the online flipped classroom—has many pedagogical
advantages. Our results speak to the importance of including synchronous, active learning activities in online
courses—and encouraging all students to participate in such activities. Still, our findings also bolster the
worrying observation that “online courses are harming the students who need the most help”, as a recent piece in
the New York Times put it (Dynarski, 2018). Bettinger and Loeb (2017) found that “these students’ learning and
persistence outcomes are worse when they take online courses than they would have been had these same
students taken in-person courses”. It is widely acknowledged that online courses tend to favour students with a
well-developed combination of time-management skills and ICT skills, and more generally, students with a
strong capacity for self-regulated learning (Eriksson et al., 2017; Kranzow, 2013). These findings are significant
since any instructional approach should ideally bolster all students’ learning (Chen et al., 2014), and online
teachers would therefore do well to provide (even more) scaffolding in these three areas (Kranzow, 2013). At a time when educational technology tools to enhance active learning are ever more easily available, it is key to consider not only the supposed learning effect of those activities, in combination and in isolation, but also the limiting effect of complex learning designs on students who struggle to navigate this complexity and therefore choose sub-optimal and fragmentary learning paths.

5.4 Limitations

Although our results are based on a rigorous statistical analysis of a sufficient total number of students, the granularity of our analysis is accompanied by a somewhat limited number of students per course iteration (see Table 1). The teacher considered the level of the students at the start of the course to be very similar each year. This makes the year-to-year comparison of student performance more likely to be robust, in the sense that differences in performance can be attributed to the modifications made to the pedagogical approach over the successive course iterations. Further, it should be noted that a PhD student marked the home assignments from 2012/2013. This may have affected how they were scored, but the structure of the assignments was identical and quite rigid. It should also be noted that the students had the possibility to watch the wrap-up and tutorial sessions asynchronously as well, since these sessions were recorded and made available online. However, the IT systems did not allow monitoring such asynchronous attendance, which is thus not considered in this analysis. Further, our attendance data do not contain information about the students’ actual level of engagement with the different learning activities, meaning we do not know, for example, whether students watched the online videos attentively or “on the side”. This has to be considered when interpreting our results (see Section 5). Theoretically, our application of TDT was focused purely on an analysis of the learning design, without considering the TD actually perceived by the different learners participating in the course. Thus, we demonstrated the usefulness of TDT for reflecting on learning designs, but did not contribute to testing the theory and its predictive power as such. While we also consider research in that direction important, we think our approach enabled us to circumvent some of the theory’s conceptual difficulties (e.g. Huang et al., 2015).

6. Conclusions

Against the background of the burgeoning use of online learning in higher education, and more recently the
online flipped classroom, this study set out to evaluate the efficacy of the online flipped classroom through the
lens of TDT. This paper makes a twofold contribution to the literature on online learning in general and the
online flipped classroom in particular. First, our findings speak to the importance of analysing shifts in TD
across learning activities to better understand their potential to support student learning and for whom they will
be effective. Second, our findings suggest that the strong oscillation in TD across asynchronous and
synchronous learning activities might lead to a polarization in performance. Future research would do well to
focus on how students experience such shifts in TD—and, importantly, how online instructors can scaffold the
learning process during these shifts. This could be done by combining our approach with measurements of the
actual perceived TD, based on rigorous student self-report instruments that build on Moore’s TDT and its
further developments (e.g. Zhang, 2003; Paul, Swart, Zhang, & MacLeod, 2015). On this point, it would be of
particular interest to analyse online flipped courses relying on an even higher degree of synchronous and
collaborative learning.

References

Barba, L. A., Kaw, A., & Le Doux, J. M. (2016). Guest editorial: Flipped classrooms in STEM. Advances in
Engineering Education, 5(3), 1–6.
Benson, R., & Samarawickrema, G. (2009). Addressing the context of e‐learning: using transactional distance
theory to inform design. Distance Education, 30, 5–21. https://doi.org/10.1080/01587910902845972.
Benton, S. L., Li, D., Gross, A., Pallett, W. H., & Webster, R. J. (2013). Transactional distance and student
ratings in online college courses. American Journal of Distance Education, 27(4), 207–217.
Bergmann, J., & Sams, A. (2012). Flip your classroom: Reach every student in every class every day.
Washington, DC: International Society for Technology in Education.
Bettinger, E., & Loeb, S. (2017). Promises and pitfalls of online education. Washington, DC: Brookings
Institution.
Chen, N.-S., Ko, H.-C., Kinshuk, & Lin, T. (2005). A model for synchronous learning using the Internet.
Innovations in Education and Teaching International, 42, 181–194.
https://doi.org/10.1080/14703290500062599.
Chen, Y., Wang, Y., Kinshuk, & Chen, N.-S. (2014). Is FLIP enough? Or should we use the FLIPPED model
instead? Computers & Education, 79, 16–27. https://doi.org/10.1016/j.compedu.2014.07.004.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning
versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the
National Academy of Sciences, 116(39), 19251–19257. https://doi.org/10.1073/pnas.1821936116.
Dynarski, S. (2018). Online courses are harming the students who need the most help. The New York
Times. https://www.nytimes.com/2018/01/19/business/online-courses-are-harming-the-students-who-need-
the-most-help.html Accessed 3 June 2019.
Eriksson, T., Adawi, T., & Stöhr, C. (2017). “Time is the bottleneck”: a qualitative study exploring why learners
drop out of MOOCs. Journal of Computing in Higher Education, 29(1), 133–146.
https://doi.org/10.1007/s12528-016-9127-8.
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014).
Active learning increases student performance in science, engineering, and mathematics. Proceedings of
the National Academy of Sciences, 111(23), 8410–8415.
Garrison, R. (2000). Theoretical challenges for distance education in the 21st century: A shift from structural to
transactional issues. The International Review of Research in Open and Distributed Learning, 1(1).
https://doi.org/10.19173/irrodl.v1i1.2.
Gašević, D., Kovanovic, V., Joksimovic, S., & Siemens, G. (2014). Where is research on massive open online
courses headed? A data analysis of the MOOC Research Initiative. The International Review of Research
in Open and Distributed Learning, 15(5). https://doi.org/10.19173/irrodl.v15i5.1954.
Gorsky, P., & Caspi, A. (2005). A critical analysis of transactional distance theory. Quarterly Review of
Distance Education, 6(1), 1–11.
Hrastinski, S. (2008). Asynchronous and synchronous e-learning. Educause Quarterly, 31(4), 51–55.
Huang, X., Chandra, A., DePaolo, C., Cribbs, J., & Simmons, L. (2015). Measuring transactional distance in
web-based learning environments: an initial instrument development. Open Learning: The Journal of
Open, Distance and e-Learning, 30(2), 106–126. https://doi.org/10.1080/02680513.2015.1065720.
Karabulut‐Ilgu, A., Cherrez, N.J., & Jahren, C.T. (2018). A systematic review of research on the flipped
learning method in engineering education. British Journal of Educational Technology, 49(3), 398–411.
https://doi.org/10.1111/bjet.12548.
Kim, K. J., & Bonk, C. J. (2006). The future of online teaching and learning in higher education. Educause
Quarterly, 29(4), 22–30.
Kranzow, J. (2013). Faculty leadership in online education: Structuring courses to impact student satisfaction
and persistence. MERLOT Journal of Online Learning and Teaching, 9(1), 131–139.
Lietzau, J.A., & Mann, B.J. (2009). Breaking out of the asynchronous box: Using web conferencing in
distance learning. Journal of Library & Information Services in Distance Learning, 3(3–4), 108–119.
https://doi.org/10.1080/15332900903375291.
Lo, C. K., Hew, K. F., & Chen, G. (2017). Toward a set of design principles for mathematics flipped
classrooms: A synthesis of research in mathematics education. Educational Research Review, 22, 50–73.
https://doi.org/10.1016/j.edurev.2017.08.002.
Lo, C. K., & Hew, K. F. (2019). The impact of flipped classrooms on student achievement in engineering
education: A meta-analysis of 10 years of research. Journal of Engineering Education, 108(4), 523–546.
https://doi.org/10.1002/jee.20293.
Martin, F., & Parker, M.A. (2014). Use of synchronous virtual classrooms: Why, who, and how? MERLOT
Journal of Online Learning and Teaching, 10(2), 192–210.
Moore, M. G. (1984). On a theory of independent study. In D. Stewart, D. Keegan, & B. Holmberg (Eds.),
Distance education: International perspectives (pp. 68-94). London: Routledge.
Moore, M.G. (1991). Editorial: Distance education theory. American Journal of Distance Education, 5(3), 1–6.
https://doi.org/10.1080/08923649109526758.
Moore, M. G. (1993). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance
education (pp. 22-38). London: Routledge.
Murphy, K.L., & Cifuentes, L. (2001). Using Web tools, collaborating, and learning online. Distance Education,
22(2), 285–305. https://doi.org/10.1080/0158791010220207.
Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons.
MERLOT Journal of Online Learning and Teaching, 11(2), 309–319.
Oztok, M., Zingaro, D., Brett, C., & Hewitt, J. (2013). Exploring asynchronous and synchronous tool use in
online courses. Computers & Education, 60(1), 87–94. https://doi.org/10.1016/j.compedu.2012.08.007.
Park, Y.J., & Bonk, C.J. (2007). Is online life a breeze? A case study for promoting synchronous learning in a
blended graduate course. MERLOT Journal of Online Learning and Teaching, 3(3), 307–323.
Paul, R. C., Swart, W., Zhang, A. M., & MacLeod, K. R. (2015). Revisiting Zhang’s scale of transactional
distance: Refinement and validation using structural equation modeling. Distance Education, 36(3), 364–
382. https://doi.org/10.1080/01587919.2015.1081741.
Siemens, G., Gašević, D., & Dawson, S. (2015). Preparing for the digital university: a review of the history and
current state of distance, blended, and online learning. Athabasca: Athabasca University.
http://linkresearchlab.org/PreparingDigitalUniversity.pdf Accessed 3 June 2019.
Stöhr, C., & Adawi, T. (2018). Flipped classroom research: From “black box” to “white box” evaluation.
Education Sciences, 8(1), 1–4. https://doi.org/10.3390/educsci8010022.
Stöhr, C., Stathakarou, N., Mueller, F., Nifakos, S., & McGrath, C. (2019). Videos as learning objects in
MOOCs: A study of specialist and non-specialist participants’ video activity in MOOCs. British Journal of
Educational Technology, 50(1), 166–176. https://doi.org/10.1111/bjet.12623.
Tucker, B. (2012). The flipped classroom. Education Next, 12(1), 82–83.
U.S. Department of Education, National Center for Education Statistics. (forthcoming). Digest of Education
Statistics 2018. Retrieved from https://nces.ed.gov/fastfacts/display.asp?id=80
Watkins J., & Mazur E. (2010). Just-in-time teaching and peer instruction. In S. Simkins, & M. H. Maier (Eds.),
Just-in-time teaching: Across the disciplines, and across the academy (pp. 39–62). Sterling, VA: Stylus
Publishing.
Westhorp, G. (2014). Realist impact evaluation: An introduction. London: Overseas Development Institute.
Wikeley, F., & Muschamp, Y. (2004). Pedagogical implications of working with doctoral students at a distance.
Distance Education, 25(1), 125–142. https://doi.org/10.1080/0158791042000212495.
Yadav, A., Subedi, D., Lundeberg, M. A., & Bunting, C. F. (2011). Problem-based learning: Influence on
students’ learning in an electrical engineering course. Journal of Engineering Education, 100(2), 253–280.
https://doi.org/10.1002/j.2168-9830.2011.tb00013.x.
Highlights

• We evaluate the efficacy of the online flipped classroom
• We compare six iterations of a campus-based and online flipped course
• The online flipped classroom led to an oscillation in transactional distance
• Average student performance did not differ between the two formats
• The oscillation in transactional distance led to a polarization in performance
• Implications for research and teaching are discussed
Author Contribution Statement

Christian Stöhr: Conceptualization, Methodology, Formal analysis, Validation, Writing-Original
draft, Writing-Review & Editing

Christophe Demazière: Conceptualization, Methodology, Formal analysis, Validation, Data
curation, Investigation, Writing-Original draft, Writing-Review & Editing

Tom Adawi: Conceptualization, Methodology, Writing-Original draft, Writing-Review &
Editing, Supervision
