To cite this article: Kathleen A. Weiss, Mark A. McDermott & Brian Hand (2021): Characterising
immersive argument-based inquiry learning environments in school-based education: a systematic
literature review, Studies in Science Education, DOI: 10.1080/03057267.2021.1897931
Introduction
During the last two decades there has been a major shift in orientation to the learning of not
only science, but also of other discipline areas such as mathematics and language. This shift
places the epistemic practice of argumentation as a central component of learning for all
students (Winn et al., 2016). For example, the Programme for International Student
Assessment (PISA) cites ‘argument from evidence’ as an important cognitive practice for
global understanding, referencing the role argument plays in the development of epistemic
knowledge in science (Organisation for Economic Co-operation and Development (OECD), 2018). The TIMSS Science Framework includes ‘making an argument from evidence’
as one of five foundational scientific inquiry practices with which students need to engage
in order to better understand the world around them as well as how science operates as
a process (Mullis & Martin, 2017, p. 55). This perspective appears in other discipline areas, for
example, the Common Core State Standards in Mathematics (CCSSM) cite ‘argument and
critique’ as one of the eight standards for mathematical practice (Common Core State
Standards Initiative (CCSSI), National Governors Association Center for Best Practices &
Council of Chief State School Officers, 2010a); the Common Core English Language Arts
Standards (CCSS-ELA) contain language about writing arguments in support of claims
throughout multiple grade levels (CCSSI, 2010b); and A Framework for K-12 Science
Education and the Next Generation Science Standards (NGSS) take on argumentation as
one of the eight science and engineering practices, which are integrated with core content
ideas and crosscutting concepts to promote three-dimensional learning (NGSS Lead States,
2013; National Research Council (NRC), 2012). Argumentation is a central practice of critical
thinking and, more specifically, the scientific endeavour of understanding the world. Thus,
students need to understand the process of argumentation as well as engage in it as an
epistemic tool as they seek to understand scientific concepts (Mercier et al., 2017).
Even though new curricula emphasise argumentation within science classrooms, past studies have found that argument rarely occurs there (Newton et al., 1999). The response to this call to implement
argumentation has seen the generation of a range of different approaches utilised in
classrooms. For example, Osborne and colleagues focus on argument based on a Toulmin
structure of claims, evidence, backing, and warrants (Erduran et al., 2004; Osborne et al.,
2004); Krajcik, McNeill, and colleagues focus on the Claim, Evidence, Reasoning approach
(Lizotte et al., 2004; McNeill & Krajcik, 2008); Sampson and colleagues use an Argument-
Driven Inquiry approach (Sampson, Grooms & Walker, 2011); and Hand and colleagues,
the Science Writing Heuristic approach (Keys, Hand, Prain & Collins, 1999; Martin & Hand,
2009). Cavagnetto (2010) in reviewing these approaches has classified them into three
distinct categories: structured, socio-scientific and immersive. While socio-scientific approaches do incorporate some forms of argument, they tend to foreground cultural and political perspectives as part of the inquiry process. Cavagnetto differentiates between structured and immersive approaches by the way they engage with argumentation within the classroom. Structured approaches are centred around the concept of
students having to know the structure of the argument before they can use it within an
inquiry setting. Immersive approaches are based on the concept that students need to be
using argumentation processes as a means of simultaneously learning about argument
and the science concepts. Table 1 summarises key differences among the three classroom
argumentation perspectives (Cavagnetto, 2010).
This distinction between structured and immersive approaches is important because it
aligns with the work of Norris and Phillips (2003) who highlight two different forms of
science literacy – the derived and fundamental. They argue that the focus of much of science
education is on the derived sense which is centred on the outcomes or products of inquiry.
This focus addresses students’ ability to replicate the structure of argument
as a means of demonstrating understanding of argument. However, Norris and Phillips
(2003) have argued for the need to engage with the fundamental processes of science,
that is, on the ways in which knowledge is generated. This reflects the immersive argument
put forward by Cavagnetto (2010) and aligns with the idea that students should be able to
use the epistemic practices of science as a means to understand how science knowledge is
constructed. As argued by Cavagnetto and Norris and Phillips, there is a need for more
research to better understand how these immersive environments support the fundamental
processes involved in helping students generate understanding of science. Immersive
approaches are rich with opportunities for students to generate knowledge. This knowledge
generation will be defined and described in depth within the theoretical framework.
We believe the emerging evidence of student gains in content knowledge within the
discipline of science, within other disciplines, and with regard to critical thinking supports the need for immersive approaches (Adey & Strayer, 2015; Hand, Chen, & Suh, 2020).
Therefore, we think immersive argument-based inquiry (ABI) approaches have the most
potential to help students accomplish the generative learning goals of current science
education reform initiatives. While previous work has been done to define the learning
environments which support argument-based science learning as a whole (Duschl &
Osborne, 2002) and the learning environments associated with online (Andriessen et al.,
2003; Clark et al., 2007) and structured (Berland, 2011; McNeill, 2011) argument-based
approaches, we have been unable to find studies which specifically characterise immersive argument-based learning environments. The unique way in which immersive approaches support students in the practice of argumentation embedded within investigation requires a narrower, focused review to uncover what characterises these unique
learning environments. While commonalities across the various types of argument-based
approaches likely exist, we argue it is imperative to clearly understand their distinctions.
Studies related to one particular immersive ABI approach have clearly shown that
students benefit from these learning environments in relation to achievement (Choi et al., 2013; Hand, Chen & Suh, 2020), critical thinking skills (Hand et al., 2018; Hand, Chen & Suh, 2020), science process skills (Hand, Chen & Suh, 2020), and student access and power (Schoerning, Hand, Shelley & Therrien, 2015). However, little
research exists that synthesises the findings related to the diverse set of immersive ABI
approaches that are utilised in classrooms. This systematic literature review of immersive
ABI approaches seeks to identify common characteristics. We believe such a review will
provide a foundation for future efforts designing these learning environments and assessing student engagement within them.
Theoretical framework
The practice of argumentation is first considered broadly, then at the classroom level, and
finally at the individual student level.
Argumentation
Consideration of argument-based inquiry (ABI) begins with a clear definition of what is
meant by argumentation. For many in science education, the definition of argumentation
originally developed by Toulmin (Toulmin, 1958, as cited in Erduran et al., 2004) emphasises a structural perspective which appeals for its analytical utility. The Toulmin model
breaks argument down into the components of data, claim, warrant, backing, and
rebuttal. Many researchers use these components to analyse student and teacher dialogue or written work in their studies (e.g. Erduran et al., 2004; McNeill & Knight, 2013). Some
find it difficult to teach these components to students, due to confusion in the structure
and definition of components (McNeill & Knight, 2013). While the utility of defined
components of argumentation is recognised, this study is instead aligned with the work
of Cavagnetto and Hand (2011), who note that Toulmin’s argument pattern may be too
structured for practical use in learning situations. This study is framed by the ideas of Ford
(2008) in emphasising the importance of student understanding of scientific practice
instead of explicit instruction on argument structure.
Walton (1990), in contrast to Toulmin, approaches argumentation from a reasoning
standpoint, defining reasoning as a process of ‘making premises and moving towards
conclusions’ (p. 403). He places reasoning within the framework of argument, noting the
goal-directed and dialogic nature of argumentation. Asterhan and Schwarz (2016) further
define types of argumentation based on goals, including co-consensual, disputative, and
deliberative argumentation. For co-consensual argumentation, the goals are agreement
and explanation or expansion of ideas. Disputative argumentation involves competition
and the weakening of one another’s ideas. Deliberative argumentation, however, allows for
disagreement in a non-confrontational way, in which people seek understanding of one
another’s ideas, collaboration, and persuasion. This study draws from both Walton’s (1990)
model of argumentation as a reasoning process and Asterhan and Schwarz’s (2016) deliberative goal of argumentation, along with Ford’s (2008) ideas emphasising the practices of science. This study integrates important components of these perspectives of argumentation to support a general view of argument as a reasoning tool which can be naturally
combined with other science practices with students working in a collaborative manner to
seek explanations for answers to questions about nature. Now, we consider how these
perspectives of argumentation play out in the science classroom by considering three
classifications of argument-based approaches in more depth (Cavagnetto, 2010).
active role recognises the learner as being in control of their own learning (Driver &
Oldham, 1986). This control can also be thought of as ownership of learning, an idea which
research has shown to be related to student motivation (Kentish, 1995; O’Neill, 2010;
O’Neill & Barton, 2005), autonomy (Fleming & Panizzon, 2010; Mortimer & Scott, 2003;
O’Neill & Barton, 2005), interest (Fleming & Panizzon, 2010), attitude (Prain & Hand, 1999),
and engagement in learning (Bandura, 1997, as cited in Kentish, 1995; O’Neill, 2010;
O’Neill & Barton, 2005).
The operational definition for student ownership of learning used in this study is: student use
of self-developed questions, ideas, or processes which drive student learning. This definition
emphasises student control and choice (Enghag & Niedderer, 2008), responsibility (Milner-
Bolotin, 2001), and agency (O’Neill & Barton, 2005), as well as power balance, dialogue, and safe
or non-threatening learning environment (Ardasheva et al., 2015; Rainer & Matthews, 2002).
Rainer and Matthews (2002) locate student ownership of learning as ‘a central and cohesive
element of knowledge construction’ (p. 22). In practice, many immersive ABI approaches
engage students in active learning strategies, or those which require learners to do something
with the ideas under consideration instead of simply memorising them. This active construction of understanding can be defined as knowledge generation. In other words, learners generate knowledge, in this case, of science concepts using previous knowledge and experience. This definition draws from multiple models, including Wittrock’s (1974) generative model
of learning, Chi and Menekse’s ICAP (Interactive, Constructive, Active, Passive) Framework for
engagement activities (Chi & Menekse, 2015), and Fiorella and Mayer’s work on generative
learning (Fiorella & Mayer, 2016), which references Mayer’s Select-Organise-Integrate (SOI)
model of generative learning (Mayer, 2014, as cited in Fiorella & Mayer, 2016).
This study explores approaches aligned with the constructive cognitive processes in
which knowledge generation takes place, where the ‘student generates some new knowledge and inferences beyond what was presented in the materials’ (Chi & Menekse, 2015,
p. 264) and consistent with what Fiorella and Mayer (2016) call ‘generative learning tasks.’
According to these researchers, ‘generative learning involves actively constructing meaning from to-be-learned information, reorganizing it and integrating it with one’s existing knowledge’ (Fiorella & Mayer, 2016, p. 717). Knowledge generation can be thought of on a continuum of ‘less generative’ to ‘more generative’. ‘Less generative’ learning opportunities may involve reproduction, repetition, or copying of information or processes.
These may include rote memorisation of facts, copying notes down verbatim from
a lecture, or following a pre-defined set of experimental procedure steps.
For this paper, the ‘more generative’ side of the knowledge generation continuum will
be represented by generative opportunities. We operationalise generative opportunities as
learning opportunities which enable students to engage in knowledge generation and
those in which the teacher recognises and plans for student authorship or ownership of
ideas. A concrete example in science classrooms cited in the literature includes student
generation of questions and investigation procedures to answer their questions.
Students take roles ‘as participants become investigators collaborating in search of
understanding’ (Enghag & Niedderer, 2008, pp. 650–651), and they develop important
practices similar to those employed by scientists. Designing an investigational procedure
requires students to actively construct ideas beyond those given to them. Through this
process, they build cognitive connections with previous (internal) knowledge and new
(external) knowledge (Wittrock, 1974).
Methods
Systematic literature review search
In order to achieve the goal of this study, which is to identify common characteristics of
immersive argument-based inquiry learning environments, an education database search
was conducted, providing a sufficiently large set of articles using the search term combinations ‘argument’ and ‘science’ (4997 articles) and ‘argumentation’ and ‘science’ (1115
articles), with many overlapping articles between the two sets. The Education Source –
EBSCOHost database was chosen because it is well-known and indicated as one of the two
‘Best Bet’ education databases within the institution’s library. Other databases were
considered initially, but sufficient overlap was observed between preliminary database
searches, and Education Source provided the most focused coverage of full-text, education-related, peer-reviewed journal articles.
Five inclusion criteria were developed to narrow the list of articles by requiring articles
to address: (1) School-based education, (2) ‘argument-based inquiry’ or argumentation, (3)
scientific investigation, (4) in-person learning environments, and (5) a detailed description
of the learning or teaching approach.
Articles were narrowed based on the inclusion criteria through title, abstract, and
article content scans. After title and article scans aimed at narrowing articles based on criterion one, 331 research journal articles remained which potentially described immersive ABI approaches, while 45 positional or theory papers remained. Theory papers and
articles from practitioner journals (29) were excluded at this stage, as they rarely included
a sufficient description of an instructional approach (criterion 5). The remaining articles
went through content scans and were excluded if they did not fit inclusion criteria two
and three. Assessment and professional development articles were excluded as they did
not relate as closely to the learning processes occurring within the school-based science
classroom (criterion 1) nor were they likely to include descriptions of learning or teaching
approaches (criterion 5). School-based science classrooms were identified as those which
involve primary students, secondary students, or both. This designation was not intended
to exclude school systems which use various classification systems for school levels, but
merely to exclude higher education and non-school-based learning environments.
Criteria 2 and 3 were considered together to designate an approach as immersive ABI.
The definition of ‘immersion-oriented’ or immersive was based upon a previous literature
review conducted by Cavagnetto (Cavagnetto, 2010, p. 351), which classified ABI interventions or approaches into three categories as described in the theoretical foundation
for this study. These approaches best matched the theoretical perspective described in
the previous section. After applying the inclusion criteria, eight of the immersive articles
from Cavagnetto’s review (Cavagnetto, 2010) were included in this review, while others
were excluded because they addressed virtual learning environments (criterion 4).
Outside of the articles included from Cavagnetto’s review, all remaining articles were
narrowed further to only those published in or after 2013, aligned with the publication year
of one major international reform document, the Next Generation Science Standards (NGSS
Lead States, 2013). The NGSS is an important policy document that continues to inform the
study and development of immersive ABI approaches and learning environments. While this
narrowing based solely on the NGSS has potential to limit the scope of this review, it was
determined that the NGSS was adequately representative of similar international reform
initiatives occurring at the time, including those in the United Kingdom and Australia. Each
of these reform initiatives sought common goals including embedding scientific practices
within content learning and integration of ‘strands’ or ‘dimensions’ (Australian Curriculum,
Assessment and Reporting Authority, 2010; Department for Education, United Kingdom, 2014).
In addition, the NGSS focuses explicitly on the scientific practice of argumentation, which
aligns with the aim of this study.
We recognise that the literature review search method and inclusion criteria we imposed limit the breadth of literature we considered for this systematic review. Our review is also limited by our preference for articles written in English, as well as our emphasis on the
NGSS versus other international science initiatives. As described above, we argue that the
core aspects of the recent curriculum initiatives have many similarities, with the NGSS
using language associated with argumentation more than the others. In the midst of
these limitations, we argue the reasoning we have provided for these decisions and the
variety of articles resulting from our search enable a rigorous and fruitful analysis of the
immersive ABI literature.
Thirty-nine peer-reviewed research articles representing a total of 16 unique immersive
ABI approaches were ultimately selected for this systematic literature review. Articles were
grouped by approach to allow for approach-level analysis. Some approaches had a name
provided in the articles (e.g. ADI, Concept Cartoons, Promoting Argumentation, SWH).
Approaches without an assigned name were given a title for use in this review (e.g. ‘Open
inquiry instruction,’ ‘Small group work’). Articles were arranged alphabetically by
approach title, then in order by year of publication within each approach, as year of
publication informed the order in which articles were analysed. The article(s) which
provided the most in-depth description of each approach were identified as ‘primary’
for that approach. A list of primary articles and their associated approaches is shown in
Table 2. A more complete list of articles included in this review is available in the reference
section (marked with asterisks (*)).
text, typically to the beginnings of statements. For example, these included statements
implying student actions or opportunities (e.g. ‘Students were given the opportunities
to . . . ’). Statements were collected from the first round of coding and placed in an Excel
spreadsheet. Then, the second round of coding was conducted to seek more clarity and
detail about the processes or actions taking place in the learning environments. In this
way, the 1st-level codes were broader, while the 2nd-level codes helped narrow the focus.
This broad to narrow approach is depicted by two ‘funnel’ lines in Figure 1. The 2nd-level
codes were typically applied to the ends of the statements, where actions were described.
For example, the statement which began ‘Students were given opportunities to . . . ’ was followed with actions such as ‘engage in argumentation, conduct investigations, and discuss with classmates.’ Therefore, the 2nd-level codes of ‘engage in argumentation,’ ‘conduct investigation,’ and ‘discussion’ were applied to this statement. After the two
rounds of coding, the compiled list of resulting 1st- and 2nd-level codes were inspected
together for how they characterised the learning environments (examples provided in
Table 3). At this time, ‘priority codes’ were selected from the compiled code list because
they most clearly and comprehensively addressed the nature of the learning environments. These ‘priority codes’ were used in phase 2, approach-level analysis, and are
explained further in the following paragraph.
Through the open coding process, five priority codes emerged which most directly
characterised the learning environment. These priority codes included: (1) learning environment (LE), (2) teacher role (TR), (3) scientific community (SC), (4) group dynamics (GD),
and (5) immersive ABI (iABI). The LE code was applied to statements which most directly
described the learning environment created with the approach. The TR code was applied
when the author discussed teacher roles, either by naming them directly (e.g. facilitator
role), or by describing their nature theoretically or through study findings. The SC code
was developed for use with any statement which addressed how the purpose of the
approach or actions of students were meant to parallel the actions of the scientific
community or be in line with scientific disciplinary norms. While authors typically used
these statements to qualify the nature of their learning environments, it was helpful to
isolate these statements from those coded with LE to allow for more detailed analysis. The
GD code was used when authors’ statements directly addressed the nature of student
interactions within group work. In this way, students’ actions were more directly influencing the learning environment. Finally, a few articles directly defined and addressed
immersive ABI approaches and learning environments. Therefore, the iABI code was
established as a priority code to identify these explicit descriptions.
After identifying priority codes from the open coding process, further analysis was
needed in order to make sense of how these codes characterised immersive argument-based learning environments. Up to this point, analysis was conducted with the unit of
analysis of the article. However, the number of articles varied between approaches. In
order to avoid representation bias among approaches, approach-level analysis was conducted next.
making (Ryan & Bernard, 2000). Axial coding allowed for inductive grouping of open
coding results, as meaning emerged from the data.
Axial grouping of the data built upon the goal of the initial open coding, with a focus
on actions or opportunities for action, in line with ‘process coding’ (Saldana, 2016). As
findings from the open-coding phase were compared, the role of the actor was considered as a way to make meaning of the action. Two actors emerged: students and teachers.
In addition, both actors were observed playing roles within generative opportunities in
the learning environments, a foundational aspect of our theoretical framework. These
observations informed the formation of the axial groups, which provided an organisational frame to better understand the nature of immersive argument-based learning
environments. The axial groups and examples will be provided within the results section.
Results
Approach-level findings
The approaches studied within this review spanned primary and secondary school grades,
with articles from 6 approaches targeting primary or elementary grade levels, and articles
from 12 approaches targeting secondary grade levels. Within these, articles from 2
approaches spanned both elementary and secondary grade levels (epiSTEMe, SWH).
Some approaches represented pre-designed curricula or textbooks (epiSTEMe, ICS,
Stanford Project, Inquiry-type chemistry experiment), while others represented frameworks informing teacher planning or creation of the learning environment (FCL as ex of
PDE, SWH). Two approaches emphasised model development (MMD-based,
Representational approach), while one approach addressed scientific processes over
content (Paper Chain Inquiry). Some of the articles focused on ideas unique to the teacher
or classroom in which the approach was implemented (Open inquiry instruction, Small
group work, Wild Backyard, Paper Chain Inquiry, Inquiry-based argumentation), while
others attempted to address more widely applicable materials (FCL as ex of PDE,
ICS, SWH).
Priority codes identified within each approach are displayed in Table 4. Priority codes
are indicated if they are mentioned at least once within the article(s) associated with each
approach. The priority code iABI, which most directly addressed immersive ABI learning
environments, was left out of the approach-level findings as it was only found in one
approach. Examples of findings relevant to the iABI code will be provided within the cross-
approach findings.
At the approach level, the main findings stemmed from the priority codes referenced
within the articles representing each approach. However, these findings on their own
proved insufficient in characterising immersive ABI learning environments because of the
variability inherent across the approaches. The findings of the cross-approach analysis
served to deepen and clarify the approach-level findings and are reported in the following
section.
Below, each student action is defined as addressed in this study. Then, examples of
indicator or identifier words from the articles are provided. If the action is directly related
to the inclusion criteria for the review, that is addressed next. Examples are given from the
approaches in order to better characterise the student action and its underlying purpose.
When applicable, quotations are provided from the literature which explicitly define
‘immersive ABI’ (SWH).
science through practice (ADI, Wild Backyard), language learning through science
literacy practices (ICS, SWH), and argumentative reasoning or discourse as a central
focus (ICS, SWH, Wild Backyard). More specifically, the NGSS included a science and
engineering practice dedicated to ‘Engaging in Argument from Evidence’ (NGSS Lead
States, 2013).
This study’s inclusion criterion of argument-based inquiry or argumentation guided
the definition of this student action. As a result, all 16 approaches included in the review
referenced the student action of ‘engage in argumentation.’ Within these approaches,
students engaged in verbal argumentation through discussion (epiSTEMe), written
argument (ADI), small group or whole class work (Promoting Argumentation), construction of claims (Paper chain inquiry), evaluation of arguments based on evidence (ICS),
and successful engagement in argumentation without formal training (Concept
Cartoons). The purpose behind engaging in argumentation included improving argumentative writing skills (ADI), considering different views (Concept Cartoons, epiSTEMe), explaining phenomena (ICS, Inquiry-type chemistry experiment), and developing general skills in argumentation as a scientific process (Promoting Argumentation, Stanford
Project, SWH).
‘Immersive ABI,’ when directly referenced in the SWH articles, most frequently meant
student engagement in argumentation in conjunction with another student action of
investigation. For example, Ardasheva et al. (2015) defined immersive ABI approaches as
those ‘focusing on helping students grasp scientific practices while simultaneously generating understandings of disciplinary big ideas through reasoning and argumentation’
and those which ‘reflect the nature of science as inquiry and argument’ (p. 206). Students
need to ‘live argument as they learn about argument’ (Ardasheva et al., 2015, p. 232).
Other SWH authors referenced Cavagnetto’s (2010) definition of argument ‘as an
embedded component to scientific practice’ (p. 350).
ideas at the beginning of the unit or lesson (Small group work), while others restated
student statements within dialogue to direct the discussion (Wild Backyard). The purpose
behind highlighting important ideas was to avoid student misconceptions
(Representational approach, Small group work), summarise thoughts at the end of
a discussion (Inquiry-type chemistry experiment), or lead students in constructing evidence or arguments (Wild Backyard).
maps and question brainstorming served as the foundation for the unit. Being able to use
their own questions gave students authorship and ownership of exploration and learning
in the unit.
The classification of approaches along the continuum was determined first based on
the articles reviewed by this study. When article descriptions were insufficient to code the
initial activity and investigations involved in the approach as ‘less generative,’ or ‘more
generative,’ additional teaching materials were sought. Only those additional teaching
materials which were freely available online were considered. It is possible that, if the full
intent and sets of materials from the various approaches were to be considered, the group
placement would have shifted slightly. In addition, it is acknowledged that individual
teacher implementation of each approach could vary greatly, with some teachers implementing the approach in a way which provided more generative opportunities and others
limiting the generative opportunities available to students, even when based on the same
curriculum materials. In fact, one approach considered in this study explicitly gave
teachers the choice of whether to have students design their own investigation or follow
a set of predefined steps (Stanford Climate Change Project; Holthuis et al., 2014). While
the explicit mention of varying implementation was rare within these studies, when it did
arise, the approaches were classified as ‘intermediate,’ with evidence for both ‘more
generative’ and ‘less generative’ designations.
Example: ‘More generative’
The Concept Cartoons approach (Keogh & Naylor, 1999) was provided as an example of
a ‘more generative’ approach, specifically with two pieces of data which support opportunities for student knowledge generation. In an initial activity, Concept Cartoons were
used to present various ideas in pictorial form with dialogue bubbles. Each dialogue
bubble contained an alternate idea about the science phenomenon and how it worked. In
addition, a blank bubble was left to allow for student ideas to be added. In effect, there
was not a focus on one ‘correct’ answer. The overall initial activity involved students using
the existing dialogue bubbles and their own ideas to shape their understanding of the
science phenomenon. This provided support for the approach to be classified as ‘more
generative,’ as students played a significant role in authoring the ideas and problems to
be considered for the initial activity and, thus, the remainder of the unit. Concept
Cartoons were used as tools to encourage students to figure out their own understanding
of the scientific phenomenon. When considering the investigation design, the author
stated that students were eager to test their ideas through designing an investigation.
Naylor, Keogh, & Downing (2007) noted that the ‘nature of investigation was left up to
24 K. A. WEISS ET AL.
[the students]’ (p. 21). This provided support for the approach to be classified as ‘more
generative,’ as students generated their own procedure or ways of testing their ideas
through investigation. Taken together, these data indicated that Concept Cartoons would
be characterised as ‘more generative’ with support of two pieces of data (‘more gen
erative (2)’).
Discussion
Following Cavagnetto's (2010) review of argument-based approaches, we recognised a need for a more in-depth analysis of immersive ABI approaches and the characteristics underlying the learning environments which support them. We chose immersive argument-based approaches because they enable students to use the epistemic practices of science not only to learn science concepts but also to understand how science knowledge is constructed. We think immersive ABI approaches have the most potential to help students accomplish the generative learning goals of the NGSS and similar international reform documents.
The findings of this study not only provide insights into immersive ABI learning environments but also provide a basis for comparison with other argument-based learning environments. First, the findings of this systematic literature review clearly show that not all immersive ABI approaches are equal. For example, while all approaches engage students in small group work, some include a focus on science concepts, while others emphasise science process (see Table 5). While all approaches engage students in the integrated epistemic practices of investigation and argumentation, some provide more generative opportunities than others. This is evident in the continuum of knowledge generation presented in Figure 3. Ultimately, all sixteen approaches analysed in this systematic review have the potential to support students' immersive argument-based learning. However, some may better enhance the learning benefits associated with knowledge generation than others.
Our list of common elements of immersive ABI learning environments presents key actions and opportunities characteristic of sixteen existing immersive approaches. While rooted in theory around argumentation, student ownership of learning, and knowledge generation, the list of common elements is presented as a practical, concise list which can bridge the gap between theory and practice within the science classroom. When compared to previously developed frameworks addressing argument-based learning environments, some similarities arise. For instance, the common element discuss aligns with 'context which fosters dialogic discourse' in Duschl and Osborne (2002, p. 61). The common element share authority may relate to 'provide students access to plural accounts of phenomena' (Duschl & Osborne, 2002, p. 61), as students engage in comparison of those accounts and play a role in authoring knowledge instead of merely accepting the teacher as the central knowledge authority. Clark et al. (2007) compiled a list of conditions for supporting student argumentation, which builds upon Duschl and Osborne's work and adds conditions similar to the common elements work as small group and focus on conceptual understanding, among others. When comparing these frameworks of general argument-based learning environments to our list of characteristics specific to immersive ABI, there are clearly similar elements or conditions among the various types of argument-based interventions. However, the relative emphasis on more generative common elements may set immersive ABI learning environments apart, which requires further study. In addition, the ways in which the common elements work together to shape student argumentation may provide further insight into the differences between immersive ABI and other varieties of argument-based learning environments.
Rather than seeking the complex connections underlying immersive ABI learning environments, we were interested in the broad categories that can help simplify an understanding of these environments and can be utilised by educators. We argue that the simplicity of the list of common elements sets it apart from other frameworks or lists of conditions to support student argumentation. Not only does it narrow the focus to immersive ABI learning environments, but it also provides a concrete list of actions and opportunities to support, enact, or look for within a classroom. The list of common elements of immersive ABI learning environments (Figure 2), along with the continuum of knowledge generation (Figure 3), transforms the ideas put forth by theoretical frameworks into practical tools which can be used by various educational stakeholders. We believe these actions may help educators better adopt and adapt the required learning environments that will help achieve the aims of the new national curricula that have been put forward in a range of countries.
Conclusion
The central aim of this study was to develop a better understanding of immersive ABI
learning environments based on a systematic literature review of current approaches.
Three main ideas have emerged. First, while all immersive ABI approaches provide
students the opportunity to engage in the practices of argumentation and investigation
in an integrated way, some provide more opportunities for student ownership of learning
and knowledge generation than others. Second, with its roots in a rigorous literature
review analysis, the broad categories of the resulting common elements list give it
potential to be a bridge between theory and practice, simplifying the understanding of
these learning environments. Third, the common elements list and continuum of knowledge generation can be used by various stakeholders within primary and secondary
school education, including researchers, professional development leaders, principals,
and teachers. Further research is needed to better understand what makes immersive
ABI learning environments unique from other argument-based interventions. However,
the findings of this study can help move the field forward towards a better understanding
of these argument-based learning environments and their impact on student learning of
science.
Implications
The common elements of immersive ABI learning environments can be used by three
main stakeholders in education: researchers, professional development leaders, and
school-based principals and teachers.
Researchers
Researchers can use the common elements to identify unique aspects of different immersive ABI approaches, making comparisons between approaches to measure their differential impacts on student learning. For example, approaches which emphasise writing (ADI, SWH) can be compared to those which emphasise representational models (Representational approach). In the same way, these approaches can be compared for the generative opportunities they provide students, identifying additional generative opportunities and exploring their impact on student learning. Immersive ABI approaches which have been previously shown to promote student learning or conceptual understanding can be examined in light of the generative opportunities they provide and how these may help explain the benefits afforded to students.
Further study can be conducted on the potential interactions between common elements. It can be predicted that the student action of 'engage in argumentation' is related to the teacher action of 'encourage argumentation'; however, further research is needed to uncover the most effective ways for teachers to introduce and support student argumentation. Another potential interaction exists between the generative opportunities and the teacher action of 'share authority.' Future study can explore ways in which teachers share their intellectual authority and recognise student ownership of their learning. Such a study could then examine the generative opportunities provided to students within these learning environments and explore the interaction between the teacher's role and student engagement in knowledge generation. More comprehensively, interactions between various common elements can help point to ideal combinations of elements which promote specific student benefits, targeted to certain grade levels and school contexts.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes on contributors
Kathleen A. Weiss is an Instructional Design and Educational Support Specialist in the Center for
Educational Enhancement at Des Moines University. Her current work is focused on development of
faculty and student supports in medical and health science education. Her research focuses on
generative learning environments.
Mark A. McDermott is a clinical professor of Science Education at the University of Iowa. His
research focuses on STEM education and methods for improving pre-service teacher learning.
Brian Hand is a professor of Science Education at the University of Iowa. His research interests are
centred on improving learning through understanding the complexity of generative learning
environments.
ORCID
Kathleen A. Weiss http://orcid.org/0000-0001-5890-8853
References
Adey, P., & Shayer, M. (2015). The effects of cognitive acceleration. In Socializing intelligence through academic talk and dialogue (pp. 127–142). American Educational Research Association.
Andriessen, J., Baker, M., & Suthers, D. (2003). Argumentation, computer support, and the educational context of confronting cognitions. In Arguing to learn (pp. 1–25). Springer.
Ardasheva, Y., Norton-Meier, L., & Hand, B. (2015). Negotiation, embeddedness, and non-threatening learning environments as themes of science and language convergence for
*Engle, R. A., & Conant, F. R. (2002). Guiding principles for fostering productive disciplinary engagement: Explaining an emergent argument in a community of learners classroom. Cognition and Instruction, 20(4), 399–483. https://doi.org/10.1207/S1532690XCI2004_1
Erduran, S., Simon, S., & Osborne, J. (2004). TAPping into argumentation: Developments in the application of Toulmin's argument pattern for studying science discourse. Science Education, 88(6), 915–933. https://doi.org/10.1002/sce.20012
*Eymur, G. (2018). Developing high school students' self-efficacy and perceptions about inquiry and laboratory skills through argument-driven inquiry. Journal of Chemical Education, 95(5), 709–715. https://doi.org/10.1021/acs.jchemed.7b00934
Fiorella, L., & Mayer, R. E. (2016). Eight ways to promote generative learning. Educational Psychology Review, 28(4), 717–741. https://doi.org/10.1007/s10648-015-9348-9
Fleming, K., & Panizzon, D. (2010). Facilitating students' ownership of learning in science by developing lifelong learning skills. Teaching Science: The Journal of the Australian Science Teachers Association, 56(3), 27–32.
Ford, M. (2008). Disciplinary authority and accountability in scientific practice and learning. Science Education, 92(3), 404–423. https://doi.org/10.1002/sce.20263
*Grooms, J., Sampson, V., & Enderle, P. (2018). How concept familiarity and experience with scientific argumentation are related to the way groups participate in an episode of argumentation. Journal of Research in Science Teaching, 55(9), 1264–1286. https://doi.org/10.1002/tea.21451
Hand, B., Chen, Y. C., & Suh, J. (2020). Does a knowledge generating approach benefit students: A systematic review of the science writing heuristic approach. Educational Psychology Review, 1–43. https://doi.org/10.1007/s10648-020-09550-0
Hand, B., Park, S., & Suh, J. K. (2018). Examining teachers' shifting epistemic orientations in improving students' scientific literacy through adoption of the science writing heuristic approach. In K.-S. Tang & K. Danielsson (Eds.), Global developments in literacy research for science education (pp. 339–355). Springer.
*Hofstein, A., Navon, O., Kipnis, M., & Mamlok-Naaman, R. (2005). Developing students' ability to ask more and better questions resulting from inquiry-type chemistry laboratories. Journal of Research in Science Teaching, 42(7), 791–806. https://doi.org/10.1002/tea.20072
*Herrenkohl, L. R., & Cornelius, L. (2013). Investigating elementary students' scientific and historical argumentation. Journal of the Learning Sciences, 22(3), 413–461. https://doi.org/10.1080/10508406.2013.799475
*Hand, B., Norton-Meier, L., Gunel, M. M., & Akkus, R. (2016). Aligning teaching to learning: A 3-year study examining the embedding of language and argumentation into elementary science classrooms. International Journal of Science & Mathematics Education, 14(5), 847–863. https://doi.org/10.1007/s10763-015-9622-9
*Hand, B., Shelley, M. C., Laugerman, M., Fostvedt, L., & Therrien, W. (2018). Improving critical thinking growth for disadvantaged groups within elementary school science: A randomized controlled trial using the science writing heuristic approach. Science Education, 102(4), 693–710. https://doi.org/10.1002/sce.21341
Holthuis, N., Lotan, R., Saltzman, J., Mastrandrea, M., & Wild, A. (2014). Supporting and understanding students' epistemological discourse about climate change. Journal of Geoscience Education, 62(3), 374–387. https://doi.org/10.5408/13-036.1
*Katchevich, D., Hofstein, A., & Mamlok-Naaman, R. (2013). Argumentation in the chemistry laboratory: Inquiry and confirmatory experiments. Research in Science Education, 43(1), 317–345. https://doi.org/10.1007/s11165-011-9267-9
*Howe, C., Ilie, S., Guardia, P., Hofmann, R., Mercer, N., & Riga, F. (2015). Principled improvement in science: Forces and proportional relations in early secondary-school teaching. International Journal of Science Education, 37(1), 162–184. https://doi.org/10.1080/09500693.2014.975168
*Jang, J. Y., & Hand, B. (2017). Examining the value of a scaffolded critique framework to promote argumentative and explanatory writings within an argument-based inquiry approach. Research in Science Education, 47(6), 1213–1231. https://doi.org/10.1007/s11165-016-9542-x
*Kang, E. J., Swanson, L. H., & Bauler, C. V. (2017). "Explicame": Examining emergent bilinguals' ability to construct arguments and explanations during a unit on plate tectonics. Electronic Journal of Science Education, 21(6), 12–45. http://ejse.southwestern.edu
Kentish, B. (1995). Hypotheticals: Deepening the understanding of environmental issues through ownership of learning. Australian Science Teachers Journal, 41(1), 21–25.
*Keogh, B., & Naylor, S. (1999). Concept cartoons, teaching and learning in science: An evaluation. International Journal of Science Education, 21(4), 431–446. https://doi.org/10.1080/095006999290642
*Keys, C. W., Hand, B., Prain, V., & Collins, S. (1999). Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. Journal of Research in Science Teaching, 36(10), 1065–1084. https://doi.org/10.1002/(SICI)1098-2736(199912)36:10<1065::AID-TEA2>3.0.CO;2-I
*Kim, H., & Song, J. (2006). The features of peer argumentation in middle school students' scientific inquiry. Research in Science Education, 36(3), 211–233. https://doi.org/10.1007/s11165-005-9005-2
*Larraín, A., Moreno, C., Grau, V., Freire, P., Salvat, I., López, P., & Silva, M. (2017). Curriculum materials support teachers in the promotion of argumentation in science teaching: A case study. Teaching and Teacher Education, 67, 522–537. https://doi.org/10.1016/j.tate.2017.07.018
NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.
LeCompte, M. D. (2000). Analyzing qualitative data. Theory into Practice, 39(3), 146–154. https://doi.org/10.1207/s15430421tip3903_5
Lizotte, D. J., McNeill, K. L., & Krajcik, J. (2004). Teacher practices that support students' construction of scientific explanations in middle school classrooms. In Y. B. Kafai, W. A. Sandoval, N. Enyedy, A. Scott Nixon, & F. Herrera (Eds.), Embracing diversity in the learning sciences: Proceedings of the sixth international conference of the learning sciences (p. 310). Lawrence Erlbaum Associates.
*Martin, A. M., & Hand, B. (2009). Factors affecting the implementation of argument in the elementary science classroom: A longitudinal case study. Research in Science Education, 39(1), 17–38. https://doi.org/10.1007/s11165-007-9072-7
*Manz, E. (2016). Examining evidence construction as the transformation of the material world into community knowledge. Journal of Research in Science Teaching, 53(7), 1113–1140. https://doi.org/10.1002/tea.21264
*Manz, E., & Renga, I. (2017). Understanding how teachers guide evidence construction conversations. Science Education, 101(4), 584–615. https://doi.org/10.1002/sce.21282
Mayer, R. E. (2014). Cognitive theory of multimedia learning. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (2nd ed., pp. 43–71). Cambridge University Press.
McNeill, K. L. (2009). Teachers' use of curriculum to support students in writing scientific arguments to explain phenomena. Science Education, 93(2), 233–268. https://doi.org/10.1002/sce.20294
McNeill, K. L. (2011). Elementary students' views of explanation, argumentation, and evidence, and their abilities to construct arguments over the school year. Journal of Research in Science Teaching, 48(7), 793–823. https://doi.org/10.1002/tea.20430
McNeill, K. L., & Knight, A. M. (2013). Teachers' pedagogical content knowledge of scientific argumentation: The impact of professional development on K–12 teachers. Science Education, 97(6), 936–972. https://doi.org/10.1002/sce.21081
McNeill, K. L., & Krajcik, J. (2008). Inquiry and scientific explanations: Helping students use evidence and reasoning. In Science as inquiry in the secondary setting (pp. 121–134). NSTA Press.
Mendonça, P. C. C., & Justi, R. (2011). Contributions of the Model of Modelling Diagram to the learning of ionic bonding: Analysis of a case study. Research in Science Education, 41(4), 479–503. https://doi.org/10.1007/s11165-010-9176-3
*Mendonça, P. C. C., & Justi, R. (2013). The relationships between modelling and argumentation from the perspective of the model of modelling diagram. International Journal of Science Education, 35(14), 2407–2434. https://doi.org/10.1080/09500693.2013.811615
*Mendonça, P. C. C., & Justi, R. (2014). An instrument for analyzing arguments produced in modeling-based chemistry lessons. Journal of Research in Science Teaching, 51(2), 192–218. https://doi.org/10.1002/tea.21133
Mercier, H., Boudry, M., Paglieri, F., & Trouche, E. (2017). Natural-born arguers: Teaching how to make the best of our reasoning abilities. Educational Psychologist, 52(1), 1–16. https://doi.org/10.1080/00461520.2016.1207537
Milner-Bolotin, M. (2001). The effects of topic choice in project-based instruction on undergraduate physical science students' interest, ownership, and motivation. University of Texas at Austin.
Mortimer, E., & Scott, P. (2003). Meaning making in secondary science classrooms. McGraw-Hill Education.
Mullis, I. V., & Martin, M. O. (2017). TIMSS 2019 assessment frameworks. International Association for the Evaluation of Educational Achievement.
National Governors Association Center for Best Practices, & Council of Chief State School Officers. (2010a). Common Core State Standards for Mathematics: Mathematical practices. Common Core State Standards Initiative. http://www.corestandards.org/Math/Practice/
National Governors Association Center for Best Practices, & Council of Chief State School Officers. (2010b). Common Core State Standards for English Language Arts: Writing, grades 9–10. Common Core State Standards Initiative. http://www.corestandards.org/ELA-Literacy/WHST/9-10/
National Research Council (NRC). (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. National Academies Press.
*Naylor, S., Keogh, B., & Downing, B. (2007). Argumentation and primary science. Research in Science Education, 37(1), 17–39. https://doi.org/10.1007/s11165-005-9002-5
*Naylor, S., & Keogh, B. (2013). Concept cartoons: What have we learnt? Journal of Turkish Science Education (TUSED), 10(1), 3–9.
Newton, P., Driver, R., & Osborne, J. F. (1999). The place of argumentation in the pedagogy of school science. International Journal of Science Education, 21(5), 553–576. https://doi.org/10.1080/095006999290570
Norris, S. P., & Phillips, L. M. (2003). How literacy in its fundamental sense is central to scientific literacy. Science Education, 87(2), 224–240. https://doi.org/10.1002/sce.10066
*Oliveira, D. K. B. S., Justi, R., & Mendonça, P. C. C. (2015). The use of representations and argumentative and explanatory situations. International Journal of Science Education, 37(9), 1402–1435. https://doi.org/10.1080/09500693.2015.1039095
O'Neill, T. B. (2010). Fostering spaces of student ownership in middle school science. Equity & Excellence in Education, 43(1), 6–20. https://doi.org/10.1080/10665680903484909
O'Neill, T. B., & Barton, A. C. (2005). Student ownership in an urban middle school science video project. School Science and Mathematics, 105(6), 292–302.
Organization for Economic Cooperation and Development (OECD). (2018). Preparing our youth for an inclusive and sustainable world: The OECD PISA global competence framework. https://www.oecd.org/education/Global-competency-for-an-inclusive-world.pdf
Osborne, J., Erduran, S., & Simon, S. (2004). Enhancing the quality of argumentation in school science. Journal of Research in Science Teaching, 41(10), 994–1020. https://doi.org/10.1002/tea.20035
Prain, V., & Hand, B. (1999). Students' perceptions of writing for learning in secondary school science. Science Education, 83(2), 151–162. https://doi.org/10.1002/(SICI)1098-237X(199903)83:2<151::AID-SCE4>3.0.CO;2-S
Rainer, J. D., & Matthews, M. W. (2002). Ownership of learning in teacher education. Action in Teacher Education, 24(1), 22–30. https://doi.org/10.1080/01626620.2002.10463264
Ryan, G. W., & Bernard, H. R. (2000). Techniques to identify themes in qualitative data. In Handbook of qualitative research (2nd ed., pp. 1–15). Sage Publications.
Sadler, T., Chambers, F. W., & Zeidler, D. (2004). Student conceptualizations of the nature of science in response to a socioscientific issue. International Journal of Science Education, 26(4), 387–409. https://doi.org/10.1080/0950069032000119456
Saldana, J. (2016). The coding manual for qualitative researchers (3rd ed.). SAGE.
*Sampson, V., Grooms, J., & Walker, J. P. (2011). Argument-driven inquiry as a way to help students learn how to participate in scientific argumentation and craft written arguments: An exploratory study. Science Education, 95(2), 217–257. https://doi.org/10.1002/sce.20421
*Sampson, V., Enderle, P., Grooms, J., & Witte, S. (2013). Writing to learn by learning to write during the school science laboratory: Helping middle and high school students develop argumentative writing skills as they learn core ideas. Science Education, 97(5), 643–670. https://doi.org/10.1002/sce.21069
*Swanson, L. H., Bianchini, J. A., & Lee, J. S. (2014). Engaging in argument and communicating information: A case study of English language learners and their science teacher in an urban high school. Journal of Research in Science Teaching, 51(1), 31–64. https://doi.org/10.1002/tea.21124
*Schoerning, E., Hand, B., Shelley, M., & Therrien, W. (2015). Language, access, and power in the elementary science classroom. Science Education, 99(2), 238–259. https://doi.org/10.1002/sce.21154
*Suh, J. K., & Park, S. (2017). Exploring the relationship between pedagogical content knowledge (PCK) and sustainability of an innovative science teaching approach. Teaching & Teacher Education, 64, 246–259. https://doi.org/10.1016/j.tate.2017.01.021
*Çetin, S. P., Eymur, G., Southerland, S. A., Walker, J., & Whittington, K. (2018). Exploring the effectiveness of engagement in a broad range of disciplinary practices on learning of Turkish high-school chemistry students. International Journal of Science Education, 40(5), 473–497. https://doi.org/10.1080/09500693.2018.1432914
Toulmin, S. (1958). The layout of arguments. In The uses of argument (pp. 94–145). Cambridge University Press.
*Waldrip, B. B., Prain, V. V., & Sellings, P. P. (2013). Explaining Newton's laws of motion: Using student reasoning through representations to develop conceptual understanding. Instructional Science, 41(1), 165–189. https://doi.org/10.1007/s11251-012-9223-8
Walker, K. A., & Zeidler, D. L. (2007). Promoting discourse about socioscientific issues through scaffolded inquiry. International Journal of Science Education, 29(11), 1387–1410. https://doi.org/10.1080/09500690601068095
Walton, D. N. (1990). What is reasoning? What is an argument? The Journal of Philosophy, 87(8), 399–419. https://doi.org/10.2307/2026735
*Watson, J. R., Swain, J. R. L., & McRobbie, C. (2004). Students' discussions in practical scientific inquiries. International Journal of Science Education, 26(1), 25–45. https://doi.org/10.1080/0950069032000072764
Winn, K. M., Choi, K. M., & Hand, B. (2016). Cognitive language and content standards: Language inventory of the common core state standards in mathematics and the next generation science standards. International Journal of Education in Mathematics, Science and Technology, 4(4), 319–339. https://doi.org/10.18404/ijemst.26330
Wittrock, M. C. (1974). Learning as a generative process. Educational Psychologist, 11(2), 87–95. https://doi.org/10.1080/00461527409529129
*Yerrick, R. K. (2000). Lower track science students' argumentation and open inquiry instruction. Journal of Research in Science Teaching, 37(8), 807–838. https://doi.org/10.1002/1098-2736(200010)37:8<807::AID-TEA4>3.0.CO;2-7
*Yun, S., & Kim, H.-B. (2015). Changes in students' participation and small group norms in scientific argumentation. Research in Science Education, 45(3), 465–484. https://doi.org/10.1007/s11165-014-9432-z