Bailey, A. L., Maher, C. A., & Wilkinson, L. C. (Eds.), Language, Literacy, and Learning in the STEM Disciplines
With a focus on what mathematics and science educators need to know about
academic language used in the STEM disciplines, this book critically synthesizes the
current knowledge base on language challenges inherent to learning mathematics
and science, with particular attention to the unique issues for English learners.
These key questions are addressed: When and how do students develop mastery
of the language registers unique to mathematics and to the sciences? How do
teachers use assessment as evidence of student learning for both accountability and
instructional purposes? Orienting each chapter with a research review and drawing
out important Focus Points, chapter authors examine the obstacles to and latest
ideas for improving STEM literacy, and discuss implications for future research and
practice.
Typeset in Bembo
by Apex CoVantage, LLC
“To all students learning STEM in a new language and the
teachers who support them.”
CONTENTS
PART I
Language in the STEM Disciplines 11
PART II
Literacy in the STEM Disciplines 53
PART III
Summative and Formative Assessment
in the STEM Disciplines 141
Afterword 261
Alison L. Bailey, Carolyn A. Maher, and Louise C. Wilkinson
This volume contains five chapters (Chapters 2, 4, 6, 8, and 10) that focus on
teaching mathematics to English learners (EL students). The research-based advice
provided for teachers and, by extension, for teacher educators includes practical
recommendations to support EL students such as asking students to revoice others’
arguments and apply their own reasoning to that offered by others and structuring
instruction to build formal language on students’ everyday language. Notably, the
five chapters spotlight the importance of preserving the integrity of the mathemat-
ics while tailoring instruction to the needs of EL students.
The authors issue a clarion call that expectations remain high for EL students.
Moschkovich underscores the inherent danger of a narrow view of language that
can limit EL students’ access to high-quality curriculum. For example, a focus on
the meaning of single words can limit students’ access to complex mathematical
ideas; insistence on using formal language to convey mathematical ideas can limit
the resources upon which EL students can draw. Avalos, Medina, and Secada rec-
ommend that teachers of EL students should focus on connections among registers
(e.g., everyday language, mathematical representation, school mathematical lan-
guage, symbolic language). Echoing this recommendation, Moschkovich advocates
for developing students’ facility with multiple representations through extended
classroom discourse that engages students in finding and articulating mathematical
patterns, making generalizations, and using representations to support their math-
ematical claims. She posits that mathematical activity centered on evidence-based
argumentation contributes to conceptual understanding.
In addition to addressing representation and communication, authors portray
language as a tool for thinking and sense making. For example, Barwell argues that
writing in mathematics helps students organize their thinking about mathematics
as they structure arguments that interrelate natural language, mathematical symbols,
and visual representations. He views learning to write in mathematics as inextrica-
bly linked to learning mathematics and as more than a routine exercise.
FOREWORD
Language is both a gift and a trap. It is through language that we connect with others,
that we form friendships and resolve disagreements, that we learn new things, that
we organize our thoughts so we can remember what we have learned, and that we
construct world views and interrogate our own thinking. That is the gift. But we
also rely on others’ use of language in judging them, and we too often conclude
that people who don’t speak our language well are deficient in some way. That can
be a trap for all of us, and in particular for teachers working valiantly to convey
complex content to their students. When the stakes are high, as they almost always
are in classroom settings, misunderstanding or lack of understanding can generate
frustration and negative affect. Teachers are constantly confronted with the need
to distinguish the complexity of the content from the complexity of the language
used to convey the content. This is a very difficult task, precisely because the lan-
guage complexity is a mechanism for conveying the content efficiently.
The default approach to educating second language learners of English has been
to focus on language, often in separate immersion or ESL classrooms where speak-
ing, understanding, reading, and writing English become in effect the entire cur-
riculum. Achieving proficiency in English is seen as a prerequisite to accessing
curricular content in math, science, or social studies—because, of course, those
content areas are complex and because students typically learn them by listening to
the teacher speak in English or by reading texts written in English.
The chapters in this volume give a collective overview of how complex science
and math concepts generate the language complexity that teachers and students
must grapple with. At the same time they offer practices and strategies designed to
ensure that all students, in particular English learners (ELs), can navigate through
the language to the content. Students who speak English as a second language
are often provided with simpler language by virtue of simplifying the content,
thus limiting their access to grade-level material. That is a recipe for ensuring they
never catch up with monolingual peers. Engaging topics, excellent instruction,
well-designed cumulative curricula, and access to support through the home lan-
guage, through cooperative learning, and through hands-on lessons can ensure that
content learning becomes a mechanism for language learning rather than an activ-
ity postponed until after language learning has been accomplished.
The editors and authors who have contributed to this volume deserve a vote
of thanks for having taken on a challenging set of issues, and having responded
with research-based and usable information. They have considered many dimen-
sions relevant to their work—not just analyzing the challenge, but also exploring
implications for instruction, for teacher education, and for assessment. The new
college- and career-ready standards embraced by American educators hold the
promise of improving educational outcomes for all students in the U.S., but they
also bring with them the danger of exacerbating the gaps between native speakers
and second language learners of English. Information such as that compiled in this
volume will be of great help in ensuring positive outcomes for EL students and for
their monolingual classmates.
Catherine E. Snow
Harvard Graduate School of Education
Cambridge, MA
PREFACE¹
moves the field toward combining both summative and formative assessment while
upholding high-quality standards of reliability and validity.
The book concludes with an afterword by Bailey, Maher, and Wilkinson that
synthesizes key ideas to emerge from these chapters. We examine whether the work
of the authors suggests that implementation of “best practices” for instruction and
assessment of STEM disciplines differs for EL students and non-EL students or
whether indeed the distinction is one of emphasis in teaching and assessment prac-
tices with EL students. Finally, we offer suggestions for further research.
Note
1 The editors of this volume are listed in alphabetical order; all contributed equally to this
volume.
ACKNOWLEDGMENTS
First and foremost, we owe great debts of gratitude to our husbands, Frank, Jim,
and Alex. Once again they suffered the absence of their spouses cheerfully and sup-
portively, but we hope they also forged their own bonds of new or closer friendship
as a result of being thrown together several times in the making of this book. We
thank Alejandro and William for their willing participation in Chapter 1—their
efforts have helped to illustrate firsthand the intersection of language and math-
ematics. Our thanks go also to the chapter contributors for each being so willing
and enthusiastic about the volume and their work for the educational communities
the volume is designed to inform. We also gratefully acknowledge M. Kathleen
Heid, Catherine Snow, and Rodolfo Dirzo for their contributions of putting the
work within their respective contexts of mathematics, language development, and
science learning. Finally, we thank former education publisher Naomi Silverman
at Routledge/Taylor & Francis Group, who got this volume under way, as well as
Karen Adler, her successor, and Emmalee Ortega during the production stages.
1
INTRODUCTION
Language, Literacy, and Learning
in the STEM Disciplines
Alison L. Bailey, Carolyn A. Maher, and Louise C. Wilkinson
This volume synthesizes and critically interprets the extant research on the lan-
guage and literacy inherent to learning the STEM disciplines of science, tech-
nology, engineering, and mathematics. In addition, the volume addresses how the
language of mathematics and of the sciences may present specific challenges to the
learning and assessment of English learners (EL students). The chapters of this vol-
ume focus on the following questions:
to use multiple domains of language and literacy, including media and technology,
all to support their thinking critically. The so-called anchor standards emphasize the
integration of communication processes into the disciplines and refer to literacy
(reading, writing) and oral language (speaking and listening).
While the articulation of the role of language in mastery of content is an
improvement over prior academic standards, it is regrettable that language is
defined so narrowly in these standards. For the most part, language is treated as the
development of general academic and domain-specific vocabulary knowledge and
effective use of language conventions across multiple modes of expression. Nev-
ertheless, these integrated processes of language and STEM content function as
the foundation for the grade-specific academic standards, which reference require-
ments for mastery by the end of each academic year in content areas.
It is notable that this standards-based reform is occurring at the same time as
the evolving composition of the student population in the U.S., with EL students
constituting the fastest-growing cohort (National Clearinghouse for English Lan-
guage Acquisition, 2017). Throughout their schooling in the U.S., the standardized
achievement scores of these children and youth are significantly below those of their peers
(see Tables 1.1 and 1.2).
TABLE 1.1 2015 PISA Mathematics and Science Performances by Match Between Language
Spoken in the Home and Test Language (15-Year-Olds)

                    Home language differs    Home language matches
                    from test language       test language
OECD average:
  Mathematics       452                      496
  Science           448                      500
U.S.:
  Mathematics       438                      477
  Science           459                      506

Source: OECD (2015)
TABLE 1.2 2015 NAEP Mathematics and Science Performances by Grade and ELL Status,
Public and Nonpublic School Students Combined* (Average Scale Score, With Percentage
Scoring Below Basic in Parentheses)

               Grade 4             Grade 8             Grade 12
               ELL      Non-ELL    ELL      Non-ELL    ELL      Non-ELL
Mathematics    218 (43) 243 (15)   246 (69) 284 (26)   115 (79) 153 (37)
Science        121 (59) 158 (20)   110 (81) 157 (29)   105 (86) 152 (38)

* English language learner (ELL) is terminology used by NAEP.
Source: USDOE (2015)
Perhaps the most intractable problem in U.S. education is the achievement gap
that exists between groups of children who differ by home language, socioeco-
nomic status, race, and/or ethnicity. The origins of this gap may be due to lack of
opportunity to learn STEM content or due to the linguistic and cultural differences
among students with varying language backgrounds. Students in these categories
achieve far below their peers on standardized achievement tests of mathematics, sci-
ence, and literacy learning. For example, long-standing gaps in achievement for EL
students, who are learning English at the same time that they are learning academic
content such as science, appear early and become amplified as they progress from
first grade through high school.
From a global perspective, reform of the way science is taught and learned in U.S.
schools could not have come soon enough. As a matter of policy and practicality,
students in the U.S. are not doing well in learning how to reason or communicate
like a scientist. Their average PISA science score (496) again was not significantly
different from the OECD average (493) (Kastberg, Chan, & Murray, 2016). This
finding held despite an encouraging decrease in the variance explained by socioeconomic
status (SES), a measure of equity. The variance in science performance attributable
to SES for U.S. students decreased from 17% in 2006 to 11% in 2015 (OECD,
2016). In this sense, relative to its recent large-scale educational reform, the U.S. still
lacks policy and pedagogical “smarts” in the teaching of science (Ripley, 2016), a
predicament now being addressed through federal funding of new models for science
teaching, learning, and fairer assessments (USDOE, 2016).
The primary premise of the PISA science assessment is that the immensity of
today’s information flow and rapid changes in technology mean that laboratory-
bound experiments no longer define the whole of scientific practices (OECD,
2016). Rather, science is now viewed as the basis for the everyday tools available to
enhance individual quality of life while simultaneously expanding global econo-
mies, from clean drinking water to more productive farming, from climate change
to space exploration. Hence, in this era of “fake” news, there is great urgency for
students to “think like a scientist” (OECD, 2016, p. 2) in considering evidence,
reaching principled conclusions, and understanding that scientific truths can change
over time as new discoveries emerge.
The PISA is a global assessment of scientific literacy, which is defined as knowledge of
the purposes, procedures, and products of science and science-based technology
(OECD, 2015); it requires students to apply their science knowledge to
solve problems set in everyday, real-world contexts. The PISA therefore assumes
students’ mastery of specialized academic English, in which the multiple levels of
language must be coordinated in precise ways. For example, at the level of mean-
ing and syntax, the linguistic complexity of individual test items, directions, and
questions can include: (a) technical vocabulary; (b) obscure semantic relation-
ships among word meanings; and (c) complex syntactic forms, such as dense noun
phrases, nominalizations, multiple embedded clauses, and passive voice construc-
tions (Silliman, Wilkinson, & Brea-Spahn, this volume). Of note, the PISA compe-
tencies required for scientific literacy (OECD, 2015) do not include any linguistic
or discourse dimensions of the specialized academic language that serves as the
mechanism for interpreting text and translating these understandings into written
expressions. The three competencies are: explain phenomena scientifically, evaluate
and design scientific inquiry, and interpret data and evidence scientifically. It should
be noted that PISA scores are not disaggregated by EL student status (or equivalent
across the different nations) (OECD, 2015). There is, however, an accompanying
survey item that asks whether the assessment is taken in a language that matches the
student’s home language. While this is not identical to knowing whether a test taker
is proficient in the language in which the test is conducted (students after all can
be proficient speakers of the language used in school while exposed to a different
home language), it may be a close proxy. As a result of disaggregating the mathemat-
ics and science assessments by the match between the test and home language, 2015
test scores show large differences for both the OECD average for participating
nations as a whole and for the U.S. specifically (see Table 1.1). In every instance, the
subgroup of students who experienced a mismatch between the language of the
test administration and their home language scored lower on average for both the
PISA mathematics and the PISA science assessments (OECD, 2015).
While these assessment performances may be discouraging for students learning
the language of school as an additional language at the same point they are learn-
ing new academic content, it is still premature to evaluate progress, considering
the implementation of the new academic standards in the U.S. Presumably, it will
take some time before improvement in instruction based on the new academic
standards shows up in student outcomes on large-scale assessments. Meanwhile, we
need to move forward to determine whether the integration of language and the
STEM disciplines has enhanced the learning of EL students. We can take immedi-
ate action by asking: What knowledge of STEM and EL students are teachers receiving?
What instructional and assessment practices are promising? These are questions that we
now address.
Instruction
In instruction, task design and implementation procedures are central concerns
in establishing optimal conditions for students’ learning. During instruction, there
are opportunities for students to interact with each other and the teacher and
thereby question and scaffold their own learning utilizing talk and text as tools.
One element of task design is the composition of the participant structure utilized:
individual learner, dyads, small groups, and whole groups. In the case of mathemat-
ics, specific kinds of tasks tend to elicit certain forms of reasoning, in which students
are required to provide oral and/or written justification for their solutions (Mueller,
Yankelewitz, & Maher, 2010). Most effective for students’ learning are tasks requir-
ing students both to convince themselves and others about their solutions and also
to articulate, using language and other forms of representation, why these solutions
are correct and complete (Maher & Yankelewitz, 2017).
The new college- and career-readiness standards clearly set high expectations
for teacher and student uses of language during STEM instruction even while they
do not elaborate on how language and content can best be integrated. For exam-
ple, the CCSS for Mathematics includes Mathematical Practice 3, which states, “Construct
viable arguments and critique the reasoning of others.” CCSS for English Language
Arts & Literacy in History/Social Studies, Science, and Technical Subjects includes
the following standard for ninth-tenth grades: “Determine the central ideas or con-
clusions of a text; trace the text’s explanation or depiction of a complex process,
phenomenon, or concept; provide an accurate summary of the text.” CCSSO’s
English Language Proficiency Development Framework (2012) expressly provides
descriptions of the language practices found within both the CCSS and NGSS,
including the following: “Describe a model using oral and/or written language as
well as illustration.” This framework is intended to define key language needs of
EL students.
One instructional gap for teachers concerns how to create classrooms that
give students the opportunity for rich content discussion; a second gap is students’
uneven exposure to teachers who can push their thinking and language to new
heights. By the same token, while much is made of instructional gaps for students,
there is also an opportunity-to-learn gap, since not all students have available to
them the kinds of classrooms that support the sustained and collaborative
interactions called for in the academic standards quoted earlier.
With these considerations in mind we turn to how mathematics and science can
be integrated with language to form best practices for assessment of STEM and
language learning.
Assessment
Summative assessment practices capture the success of programming and aggregate stu-
dent progress for periodic reporting (e.g., annual, large-scale mathematics and
science assessments). Formative assessment practices reveal how teachers respond
contingently to student learning in the moment by adjusting their teaching and
providing feedback for student learning or planning for next steps in decision mak-
ing. In contrast with instruction, during assessment contexts students are predomi-
nantly responsible for producing independent work in their display of knowledge.
This often requires their mastery of decontextualized language in both oral and
written forms, especially in standardized summative assessments. Such test protocols
are mainly unassisted, without mediation or scaffolding from others and with no
opportunities for clarification or immediate feedback; thus, this represents a very
different skill set for students to master.
National level assessments of STEM and language show in greater detail the
gaps surrounding academic achievement, language proficiency, and the opportu-
nity to learn in U.S. classrooms. The National Assessment of Educational Progress
(NAEP) provides a comparison of both mathematics and science for EL and non-
EL student performances (USDOE, 2015). While all students performed less well
in these STEM disciplines at the higher grades, EL students’ performances by 12th grade are
particularly troubling, with the vast majority of students scoring below basic in both
mathematics and science (see Table 1.2).
At the state level, most states have adopted the Smarter Balanced Assessment
Consortium (SBAC) or Partnership for Assessment of Readiness for College and
Careers (PARCC) consortium assessments, which were developed under the federal
Race to the Top initiative to monitor state progress toward meeting the college-
and career-ready standards. Under the reauthorization of federal education law in
ESSA (2015), mathematics continues to be assessed annually in third through
eighth grades and once in high school. Science must be assessed once at each of the
third–fifth, sixth–eighth, and ninth–twelfth grade clusters.
The most glaring shortcoming of the implementation of state standards-based
summative assessments with EL students is the fact that EL students have tradi-
tionally left the pool of EL test takers when they succeed in English language
programming (i.e., are redesignated as fluent English proficient). This means their
successful performances were never captured and credited to the programs they exit
(Saunders & Marcelletti, 2013). Under new federal legislation, former EL students
are now followed for up to two years to better understand their progress and the
success of the programs that serve them.
Student responses to a state standards-based mathematics assessment item illus-
trate the communicative demands inherent in the new standards-aligned assess-
ments. This SBAC released item requires students to first read a word problem and
then construct an explanation of their mathematical reasoning. The responses of
Alejandro and William reveal the types of language and literacy skills students must
command to display their mathematics abilities. The item asks them to explain why
five-eyed space creatures cannot join a contest to make up a group of 24 total eyes.
The five eyed space creatures cannot join the contest because 5 × 5 = 25 and
5 × 4 = 20 so it cannot be 24.
(Alejandro, a 9-year-old, recently redesignated
EL student from a Spanish-speaking home)
The five eyed creatures could not join the contest because they are in five like 5–10–
15–20–25 and you see that the five eyed creature cant join the contest.
[Original punctuation] (William, an 8-year-old, Spanish second
language learner from an English-speaking home)
Neither boy explicitly states that “24 cannot be divided by multiples of five,” but
their responses do show mathematical understanding of the word problem. Both
boys focus on the fact that multiples of five do not allow for the sum of 24 total
eyes. Alejandro gives examples of the adjacent multiplication operations by five
that skip over the value of 24 and asserts that it “cannot be 24.” William states “they
are in five” and then elaborates with an example (“like”) using either a repeated
addition model by adding on fives or listing multiples of five—either way, perhaps
implying, by 24’s omission from this list, that it is not a possibility.
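The mathematical point both boys gesture toward can be restated as a short divisibility argument (an editorial paraphrase in symbols; neither student wrote it this way):

```latex
% 24 lies strictly between two consecutive multiples of 5,
% so no whole number of five-eyed creatures can total exactly 24 eyes.
\[
5 \times 4 \;=\; 20 \;<\; 24 \;<\; 25 \;=\; 5 \times 5
\quad\Longrightarrow\quad 5 \nmid 24 .
\]
```

Alejandro’s response names both bounding products, while William’s skip-counting list (5, 10, 15, 20, 25) traces the same interval.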
Linguistically, both boys use complete sentences, beginning with the full noun
with its adjective modifiers (five-eyed space creatures) that had been given in the
word problem. Alejandro’s response is shorter and chains together two causal clauses
(“because 5 × 5 . . .” and “so it cannot . . .”), and his choice of tense for the
auxiliary verb “cannot” remains in the present tense of the word problem prompt.
William’s verb usage contrasts with Alejandro’s where he uses the conditional tense
for the auxiliary verb “could not,” marking the contingent nature of the space crea-
tures’ ability to join the contest. He also uses an embedded causal clause “because
they are. . .,” but his writing still has an oral language quality to it when he writes
“they are in five like 5–10–15 . . .” and when he directly addresses the reader with
“and you see that. . . .” In some sense, the responses of both boys, and William’s
choice of the word ‘see’ especially, suggest that they have chosen to give (different)
examples to show, rather than explicate in words, that 24 is not a viable option.
A scoring rubric that anticipates fully explanatory responses (i.e., explicitly
stating that 24 is indivisible by five, rather than giving examples of the impossibility)
could miss the understanding that these two boys have.
However, this discussion also highlights the kinds of language opportunities that
the students may need in the future and the work of teachers to prepare students
linguistically for such tasks.
Summative assessments like the NAEP mathematics and science assessments
and the SBAC state standards-based assessment item described earlier contrast sharply
with formative assessment approaches to understanding student progress, with their
focus on assessment for learning, not only of learning (e.g., Black & Wiliam, 2010;
Black, Wilson, & Yao, 2011). Formative assessment occurs during instruction and
comprises the information that teachers can glean from their conversations with
students about their work, from overhearing student-to-student discussions, and
from observing students as they complete tasks so that they can modify instruc-
tion accordingly. This approach to assessment is particularly pertinent to instruction
with EL students. Formative assessment can serve as an important complement
to summative assessment with EL students because it can provide teachers with
knowledge not only of what a student says or writes in terms of mathematics or
science content but also of how a student is using language to express learning
(Bailey, 2017).
In the following excerpt of a kindergarten classroom, the teacher (Ms. Escobar)
has shown her Spanish-dominant EL students the plant root system, and later, dur-
ing small group time, she moves around the classroom to observe the students and
ask questions about their work (Bailey, Huang, & Escobar, 2011). One small group
has been given the task of using wooden blocks to represent the root system.
Escobar: Is this one yours, Julia? Let’s see, sit down with it and show me. Show
me what you’ve created. Tell me about your construction. Show me here.
Where is the seed?
[Julia points to the blocks and correctly identifies the part of her con-
struction that represents the seed.]
Escobar: OK. And where is the primary root?
[Julia points to the root hairs in her block representation.]
Escobar: Are they primary? Las primera que salio? (The first one to come out?)
[Julia then points to the primary root in her representation.]
Escobar: Yes. And where are the secondary roots?
[Julia points to the secondary roots in her representation.]
Escobar: Yes. And where are the root hairs?
[Julia points to the root hairs in her representation.]
Escobar: Excellent.
What is most striking about this exchange is that Julia, as a very beginning
EL student, is able to participate actively in her learning and in her teacher’s forma-
tive assessment of that learning. Escobar’s questions enable Julia to indicate, with
the help of her model, her understanding of English and science content through
nonverbal participation. Escobar is able to monitor Julia’s receptive English skills and
uses Spanish as a first-language support where necessary so that she can still effec-
tively assess Julia’s science content knowledge.
We have illustrated with our analyses of these brief examples how language and
literacy may either obfuscate or clarify children’s efforts to develop understanding
and to display that understanding via language and nonlanguage tools. One exam-
ple focused on the display of mathematical understanding in summative assessment,
while the second was a display of scientific understanding using visual representa-
tions appropriate to a beginning English level during formative assessment. In the
preface to this volume, we provided an overview of the goals, organization, and
basic details of our approach to the language challenges inherent to learning the
STEM disciplines. The following chapters focus on the authors’ findings for how
the language of mathematics and of the sciences presents challenges for all students
and in particular EL students.
References
Bailey, A. L. (2017). Progressions of a new language: Characterizing explanation develop-
ment for assessment with young language learners. Annual Review of Applied Linguistics,
37, 241–263.
Bailey, A. L., Huang, Y., & Escobar, M. (2011). I can explain: Academic language for science
among young English language learners. In P. Noyce & D. Hickey (Eds.), New frontiers in
formative assessment. Cambridge, MA: Harvard Education Press.
Black, P., & Wiliam, D. (2010). Inside the black box: Raising standards through classroom
assessment. Phi Delta Kappan, 92(1), 81–90.
Black, P., Wilson, M., & Yao, S.Y. (2011). Road maps for learning: A guide to the navigation
of learning progressions. Measurement: Interdisciplinary Research and Perspectives, 9, 71–123.
Council of Chief State School Officers. (2012). Framework for English Language Proficiency
Development Standards corresponding to the Common Core State Standards and the Next Gen-
eration Science Standards. Washington, DC: CCSSO.
Every Student Succeeds Act. (2015, December 10). Public Law No. 114–95, 114th Con-
gress, 1st session.
Kastberg, D., Chan, J. Y., & Murray, G. (2016). Performance of U.S. 15-year-old students in science,
reading, and mathematics literacy in an international context: First look at PISA 2015. NCES
2017–2048. Washington, DC: National Center for Education Statistics.
Maher, C. A., & Yankelewitz, D. (Eds.) (2017). Children’s reasoning while building fraction ideas.
Heidelberg/Dordrecht/Rotterdam: Sense Publishers.
Mueller, M., Yankelewitz, D., & Maher, C. (2010). Promoting student reasoning through
careful task design: A comparison of three studies. International Journal for Studies in Math-
ematics Education, 3(1), 135–156.
National Clearinghouse for English Language Acquisition. (2017). Profiles of English learners.
Washington, DC: Office of English Language Acquisition.
National Conference of State Legislatures. (2016). College and career readiness standards legisla-
tion. Retrieved from www.ccrslegislation.info/CCR-State-Policy-Resources/common-
core-status-map
National Governors Association Center for Best Practices, Council of Chief State School
Officers. (2010). Common Core state standards for English language arts & literacy in history/
social studies, science, and technical subjects and Common Core state standards for Mathematics.
Washington, DC: Author.
NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. Washington,
DC: The National Academies Press.
No Child Left Behind Act. (2001, December 13). Public Law No. 107–110, 107th Congress,
1st Session.
Organization for Economic Cooperation and Development. (2015). Program for International
Student Assessment (PISA): Mathematics and Science literacy. Paris: Author.
Organization for Economic Cooperation and Development. (2016). Program for International
Student Assessment (PISA): Mathematics and Science literacy. Paris: Author.
Ripley, A. (2016, December 8). What the U.S. can learn from other nations’ schools. The New York Times, p. A3.
Saunders, W. M., & Marcelletti, D. J. (2013). The gap that can’t go away: The catch-22 of
reclassification in monitoring the progress of English learners. Educational Evaluation and
Policy Analysis, 35(2), 139–156.
U.S. Department of Education. (2015). National assessment of educational progress: 2015 Math-
ematics and Science performances. Washington, DC: National Center for Education Statistics.
U.S. Department of Education. (2016). National assessment of educational progress science assess-
ment. Retrieved from https://nces.ed.gov/nationsreportcard/science/
PART I
Language in the STEM Disciplines

Talking to Learn Mathematics
Judit Moschkovich
Focus Points
Learning math with understanding:
• Resources are available (many free and online) for designing mathemat-
ics lessons that support EL students in talking to learn mathematics with
understanding.
• Teachers can support student talk during whole-class discussions by using a
variety of teacher “talk moves.”
• Considering how to include a variety of participation structures beyond
whole-class discussions, such as pairs and small groups, is essential for support-
ing EL students in talking to learn mathematics with understanding.
• Teachers can work in teams, collaborating with mathematics, language, and ESL
specialists to design, observe, and polish lessons that support EL students in
talking to learn mathematics with understanding.
• Teachers can use interpreters, translated materials, and cognates to support
students who have had instruction in mathematics in their first language.
Chapter Purpose
This chapter summarizes what we know about the role of oral language in learning
mathematics with understanding. The chapter focuses on why student talk is impor-
tant for conceptual understanding, what kinds of talk support learning mathematics
with understanding, and how teachers can support student participation in math-
ematical discussions focused on understanding. Central issues include hearing the
mathematical content in students’ everyday ways of talking, building on that everyday
language, and supporting more formal ways of talking. The chapter includes a list of
resources for teachers to learn to orchestrate mathematical discussions and address the
needs of students who are bilingual, multilingual, and/or learning English.
A first step in supporting EL students in talking to learn mathematics with
understanding is to shift from a simple view of mathematical language as single
words to a broader definition of academic literacy—not just learning words but
learning to communicate mathematically. This shift from a simplified view of aca-
demic language as single words to an expanded view of academic literacy in math-
ematics (Moschkovich, 2015a, 2015b) that integrates mathematical proficiency and
practices is crucial for students who are learning English (Moschkovich, 2013a,
2013b). Research and policy have repeatedly, clearly, and strongly called for mathe-
matics instruction for this student population to maintain high standards (American
Educational Research Association, 2004) and use high-cognitive-demand tasks
(AERA, 2006). In order to accomplish these goals, mathematics instruction for
students who are learning English needs to shift from defining academic literacy
in mathematics as low-level language skills (i.e., vocabulary or single words) or
mathematical skills (i.e., arithmetic computation) to using an expanded definition of
academic literacy in mathematics to design lessons that support talking to learn math-
ematics with understanding.
Talking to Learn Mathematics 15
One is that teachers and students attend explicitly to concepts, and the other is that
students wrestle with important mathematics (Hiebert & Grouws, 2007).
Mathematics lessons for EL students need to include the full spectrum of mathematical proficiency, balance computational fluency with high-cognitive-demand
tasks that require conceptual understanding and reasoning, and provide students
opportunities to participate in mathematical practices (Moschkovich, 2013a,
2013b). Instruction should allow students to use multiple resources (such as modes
of communication, symbol systems, registers, or languages) for mathematical rea-
soning (Moschkovich, 2014a, 2014b) and support students in negotiating meanings
for mathematical language that are grounded in student mathematical work, instead
of giving students definitions separate from mathematical activity (Moschkovich,
2015a, 2015b).
Overall, research provides a few guidelines for instructional practices for teaching
EL students mathematics. Mathematics instruction for EL students should:
Research shows that EL students, even as they are learning English, can partici-
pate in discussions where they grapple with important mathematical content (for
examples of lessons where EL students participate in mathematical discussions, see
Khisty, 1995; Khisty & Chval, 2002; Moschkovich, 1999, 2007a, 2011). Instruction
for this population should not emphasize low-level language skills over oppor-
tunities to actively communicate about mathematical ideas. One of the goals of
mathematics instruction for students who are learning English should be to support
all students, regardless of their proficiency in English, in participating in oral dis-
cussions that focus on important mathematical concepts and student mathematical
reasoning rather than on pronunciation, vocabulary, or low-level linguistic skills. By
learning to recognize how EL students express their mathematical ideas as they are
learning English, teachers can maintain a focus on mathematical reasoning as well
as on language development.
Research has documented a variety of language resources that EL students use
to communicate mathematical ideas: their first language, everyday language, ges-
tures, and objects. When communicating mathematically, students use multiple
resources from experiences both in and out of school (Forman, McCormick, &
Donato, 1997; O’Connor, 1999; Moschkovich, 2010). Everyday language, ways
of talking, and experiences are, in fact, resources that we can expect students to
use as they participate in mathematical discussions (Moschkovich, 1996, 2007c,
2010). For example, students have been documented using their first language to
repeat an explanation or mixing Spanish and English to explain a mathematical idea
(Moschkovich, 2000).
the linguistic resources teachers can use to teach mathematics and students can use
to learn mathematics with understanding. Separating language from mathematical
proficiency and practices and focusing instruction on words, vocabulary, or defini-
tions limits EL students’ access to the five strands of mathematical proficiency and
curtails these students’ opportunities to participate in mathematical practices (for
examples of instruction for EL students that focused on word activities and lacked
mathematical content, see de Araujo, 2012a and 2012b). Not allowing EL students
to use informal language, typically acquired before more formal ways of talking, also
limits the resources they can use to communicate mathematically. Lastly, focusing on
correct vocabulary also curtails the opportunities for EL students to express them-
selves mathematically as they learn English in what are likely to be imperfect ways,
especially initially. In contrast, the view of academic literacy in mathematics described
in this chapter provides a complex and expanded view of mathematical language
that is connected to the five strands of mathematical proficiency, includes the CCSS
mathematical practices, and includes informal ways of talking as resources.
but also to use language (and other symbol systems) to communicate—talk, read,
and write—about mathematics. The eight mathematical practices are not language
intensive in and of themselves. The language intensity of any of the mathematical
practices depends on the ways each mathematical practice is enacted in a lesson and
whether the activity structure includes opportunities to use language.
The definition used in this chapter expands academic literacy in mathematics
beyond narrow views of mathematical language as words. Narrow views of aca-
demic language have characteristics that limit EL students’ access to a high-quality
curriculum: (a) a focus on single words or vocabulary limits access to both complex
texts and high-level mathematical ideas, as well as opportunities for students to
understand and make sense of those texts; (b) the assumption that meanings are static
and given by definitions limits students’ opportunities to make sense of mathematics
texts for themselves; and (c) the assumption that mathematical ideas should always
and only be communicated using formal language limits the resources (including
informal language) that students can use to communicate mathematically. In con-
trast, the view of mathematical language used here assumes that meanings for aca-
demic language are situated and grounded in the mathematical activity that students
are actively engaged in. For example, the meanings for the words in a word problem
do not come from a definition in a word list provided by the teacher before a lesson
using a word problem; instead, students develop these meanings as they work on
the problem, communicate about the word problem with their peers, and develop
their solutions. A complex view of mathematical language also means that lessons
need to include multiple modes (not only reading and talking, but also other modes
such as listening and writing), multiple representations (gestures, objects, drawings,
tables, graphs, symbols, etc.), and multiple ways of using language (formal school
mathematical language, home languages, and everyday language).
In addition, the definition used in this chapter expands academic literacy in
mathematics beyond simplified views of mathematics as computation. First, this
definition includes mathematical practices. Second, this definition includes the full
spectrum of mathematical proficiency, balancing procedural fluency, conceptual
understanding, and reasoning.
I will address only the first two components, procedural fluency and conceptual
understanding, since they are the basis for teaching mathematics for understand-
ing.2 Fluency in performing mathematical procedures or calculations is what most
people imagine when we say “learning mathematics.” Conceptual understanding
is more difficult to define and less well understood by parents, administrators, and
beginning teachers. Conceptual understanding involves the connections, reason-
ing, and meaning that learners (not teachers) construct. Conceptual understanding
is more than performing a procedure accurately and quickly (or memorizing a
definition or theorem). It involves understanding why a particular result is the correct answer and what that result means, i.e., what the number, solution, or result
represents—for example, explaining (or showing using a picture) why the result of
multiplying 1/2 by 2/3 is smaller than 1/2.
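Written out, the computation itself is simple; the conceptual work lies in explaining what the result means:

```latex
\frac{1}{2} \times \frac{2}{3}
  = \frac{1 \times 2}{2 \times 3}
  = \frac{2}{6}
  = \frac{1}{3} \;<\; \frac{1}{2}
```

Because 2/3 is less than 1, multiplying by it scales 1/2 down; a student who can show this with a drawing of two thirds of half a region is demonstrating conceptual understanding rather than only procedural fluency.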
Another aspect of conceptual understanding involves connecting representations (such as words, drawings, symbols, diagrams, tables, graphs, equations, etc.),
procedures, and concepts (Hiebert & Carpenter, 1992). For example, if students
understand addition and multiplication, we would say they have learned to make
connections between these two procedures and expect that they would be able to
explain how multiplication and addition are related (for example, that multiplica-
tion can sometimes be described or modeled as repeated addition). If they under-
stand the procedures for operations with negative numbers, we would say they have
learned to make connections among these procedures and expect that they would
be able to explain, for example, how the procedures for multiplication and addition
are similar or different and explain why.
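The repeated-addition connection, including its extension to negative numbers, can be made concrete with a pair of examples:

```latex
3 \times 5 = 5 + 5 + 5 = 15
\qquad
3 \times (-2) = (-2) + (-2) + (-2) = -6
```

A student who can explain why the second equation follows the same pattern as the first is making exactly the kind of connection among procedures described above.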
learning mathematics (for example, see Cuevas, 1983; Mestre, 1988; Spanos et al.,
1988; Spanos & Crandall, 1990). Multiple meanings of the same word were
hypothesized to create obstacles in mathematical conversations, because students
often use the colloquial meanings of terms, while teachers (or other students)
may use the mathematical meaning of terms. An example is the word “prime,”
which can have different meanings depending on whether it is used to refer to
“prime number,” “prime time,” or “prime rib.” Other examples are the words
“function” and “set.”
The notion of register as proposed by Halliday (1978) is much more than a set
of lexical items and also includes phonology, morphology, syntax, and semantics as
well as non-linguistic behavior. Most importantly, the notion of register includes
the situational context of utterances. Although words and phrases do have multiple
meanings, these words and phrases appear in talk as utterances that occur within
social contexts, and speakers use situational resources to derive the meaning of
an utterance. For example, the phrase “give me a quarter” uttered at a vending
machine clearly has a different meaning than saying “give me a quarter” while
looking at a pizza. When imagining that students face difficulties with multiple
meanings in mathematical conversations, it is important to consider how resources
from the situation, such as objects and gestures, point to one or another sense,
whether “quarter” means “a coin” or “a fourth” (Moschkovich, 2002).
A third challenge in using the notion of register is that, although it is easy to
set up a dichotomy between the everyday and the mathematics registers, these two
registers should not be treated as dichotomous. During mathematical discussions
students use multiple resources from their experiences across multiple settings, both
in and out of school. Forman (1996) offers evidence of this in her description of
how students interweave the everyday and academic registers in classroom discussions. Thus, everyday meanings should not be seen only as obstacles to participation
in academic mathematical discussions. The origin of some mathematical mean-
ings may be everyday experiences, and some aspects of everyday experiences may
actually provide resources in the mathematics classroom. For example, climbing
hills is an experience that can be a resource for describing the steepness of lines
(Moschkovich, 1996). Other everyday experiences with natural phenomena also
may provide resources for communicating mathematically.
While differences between the everyday and mathematics registers may some-
times be obstacles for communicating in mathematically precise ways and everyday
meanings can sometimes be ambiguous, everyday meanings and metaphors can also
be resources for understanding mathematical concepts. Rather than emphasizing
the limitations of the everyday register in comparison to the mathematics register,
it is important to understand how the two registers serve different purposes and
how everyday meanings can provide resources for mathematical communication
and learning mathematics with understanding.
language as a sociocultural activity and a resource for doing and learning mathe-
matics. Instead of viewing language as separate from mathematical activity, research
now considers language as part and parcel of mathematical thinking and learning.
Rather than viewing language only as an obstacle for learning mathematics,
research now considers language as one of the multiple resources that learners use
to understand mathematics and construct mathematical meaning. In order to focus
on the mathematical meanings learners construct, rather than the mistakes they
make, researchers and practitioners need frameworks for recognizing the math-
ematical knowledge, ideas, and learning that learners are constructing in, through,
and with language. Several such frameworks are available, for example functional
systemic linguistics (O’Halloran, 1999, 2000; Schleppegrell, 2007), a communica-
tion framework for mathematics instruction (Brenner, 1994), a situated and socio-
cultural perspective on bilingual mathematics learners (Moschkovich, 2002, 2007a)
and a definition of academic literacy in mathematics (Moschkovich, 2015a, 2015b).
These can serve as frameworks for recognizing oral mathematical contributions by
students and shift the focus from looking for deficits to identifying the mathematics
evident in student contributions (e.g., Moschkovich, 1999).
For example, Brenner (1994) provides useful distinctions among different kinds
of communication in mathematics classrooms and describes three components:
“Communication about mathematics” involves describing one’s thinking, “Com-
munication in mathematics” involves mathematical symbols, and “Communica-
tion with mathematics” involves applying mathematics to meaningful problems
(p. 241). Herbel-Eisenmann et al. (2013) also remind us that not all mathematical
talk is formal, and whether students use more or less formal ways of talking depends
on the setting. They provide a useful framework that highlights the variety of oral
communication students can produce in the classroom, depending on the different
communication settings. They describe how students may use more informal talk
that involves pointing and deictic terms (Why did you do that? When I did this,
I got the wrong answer) when talking in a small group with writing or computa-
tions in front of them. That talk may become less deictic and a bit more formal
when presenting a solution at the board (When I multiplied by seven, I got the
wrong answer). And, finally, when presenting a final solution in writing, that talk
would then become even more formal and begin to “sound” more like a textbook
(My calculation was initially wrong, but I changed the operation from multiplica-
tion to division and then the result made more sense).
To summarize, mathematics instruction needs to support EL students both to
reason mathematically and to express that mathematical reasoning orally. However,
it is important to note that, for students learning mathematics, informal language is
important, especially when students are exploring a mathematical concept or first
learning a new concept or discussing a math problem in small groups. Informal lan-
guage can be used by students (and teachers) during exploratory talk (Barnes, 1992;
Barnes & Todd, 1995) or when working in a small group communication context
(Herbel-Eisenmann et al., 2013). Such informal language can reflect important stu-
dent mathematical thinking (for examples, see Moschkovich, 1996, 1999). In other
situations, for example, when making a presentation, developing a written account
of a solution, using more formal academic mathematical language becomes more
important.
Best Practices
How can teachers plan lessons for EL students that balance attention to concep-
tual understanding, mathematical practices, and the language demands of talking
to learn mathematics with understanding? In order to tackle the complex issue of
mathematics instruction for this student population, lesson design needs to draw on
exemplary and high-quality practices and tools that are based on current research
in relevant fields and use a complex view of classroom mathematical language.
In this section, I point to several resources that teachers can use to design lessons
that include attention to both conceptual understanding and student talk. These
resources include research on mathematical discussions (Smith & Stein, 2011; Stein,
Engle, Smith, & Hughes, 2008) and Chapin et al.’s teacher talk moves (2009). I then
briefly describe the Framework for English Language Proficiency Development
Standards3 and several open-source sample student activities that can be used to
plan lessons that attend to oral language.
Step 1. Helping individual students clarify and share their own thoughts
Step 2. Helping students orient to the thinking of other students
Step 3. Helping students deepen their reasoning
Step 4. Helping students to engage with the reasoning of others
Several teacher moves (Michaels & O’Connor, 2015) have been described that
can support student participation in a discussion: revoicing, asking for clarification,
accepting and building on what students say, probing what students mean, and using
students’ own ways of talking. Teachers can use multiple ways to scaffold and sup-
port more formal language, including revoicing student statements (Moschkovich,
2015c).
Revoicing (O’Connor & Michaels, 1993) is a teacher move describing how
an adult, typically a teacher, rephrases a student’s contribution during a discussion,
expanding or recasting the original utterance (Forman et al., 1997). Revoicing has
been used to describe teacher talk moves in several studies (for example, Enyedy
et al., 2008; Herbel-Eisenmann et al., 2009). A teacher’s revoicing can support stu-
dent participation in a discussion as well as introduce more formal language. First,
it can facilitate student participation in general by accepting a student’s response,
using it to make an inference, and allowing the student to evaluate the accuracy of
the teacher’s interpretation of the student contribution (O’Connor and Michaels,
1993). This teacher move allows for further student contributions in a way that the
standard classroom initiation-response-evaluation (IRE) pattern (Mehan, 1979;
Sinclair & Coulthard, 1975) does not. Revoicing can build on students’ own use
of mathematical practices, or a student contribution can be revoiced to reflect new
mathematical practices. Revoicing also provides opportunities for students to hear
and then use more formal mathematical language.
TABLE 2.1 Productive Language Functions for Math Practice 1 “Make sense of problems
and persevere in solving them”
• Create, label, describe, and use multiple written representations of a problem in presenting
solutions to a math problem;
• Explain in words orally or in writing relationships between quantities and multiple
representations of problem solutions;
• Present information, description of solutions, explanations, and arguments to others;
• Respond to questions or critiques from others; and
• Ask questions about others’ solutions, strategies, and procedures for solving problems.
7. Teachers need to select tasks that provide students opportunities for talking to
learn with understanding and use teacher talk moves that support both talk
and understanding.
Notes
1 For more details, see www.ccsstoolbox.com/
2 Other strands of mathematical proficiency, for example strategic competence and adaptive
reasoning, also require opportunities for students to engage in mathematical discussions.
However, this chapter focuses on conceptual understanding as a contrast with procedural fluency.
References
American Educational Research Association. (2004). Closing the gap: High achievement for
students of color. Research Points, 2(3), 1–4.
American Educational Research Association. (2006). Do the math: Cognitive demand makes
a difference. Research Points, 4(2), 1–4.
Barnes, D. (1992). From communication to curriculum. Portsmouth, NH: Boynton/Cook Pub-
lishers Heinemann.
Barnes, D. R., & Todd, F. (1995). Communication and learning revisited: Making meaning through
talk. Portsmouth, NH: Boynton/Cook Publishers Heinemann.
Blachowicz, C., & Fisher, P. (2000). Vocabulary instruction. In M. Kamil, P. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research (Vol. 3, pp. 503–523). Mahwah, NJ: Lawrence Erlbaum Associates.
Boaler, J., & Staples, M. (2008). Creating mathematical futures through an equitable teaching
approach: The case of Railside School. Teachers College Record, 110(3), 608–645.
Bransford, J., Brown, A., & Cocking, R. (2000). How people learn: Brain, mind, experience, and
school. Washington, DC: National Academy Press.
Brenner, M. (1994). A communication framework for mathematics: Exemplary instruction
for culturally and linguistically diverse students. In B. McLeod (Ed.), Language and learning:
Educating linguistically diverse students (pp. 233–268). Albany: SUNY Press.
Chapin, S., O’Connor, C., & Anderson, N. (2009). Classroom discussions: Using math talk to help
students learn, Grades K-6. Sausalito, CA: Math Solutions.
Cohen, E. G., & Lotan, R. A. (2014). Designing groupwork: Strategies for the heterogeneous classroom (3rd ed.). New York: Teachers College Press.
Common Core State Standards Initiative. (2010a). Common Core State Standards for Mathematical Practice. Retrieved from www.corestandards.org/Math/Practice
Common Core State Standards Initiative. (2010b). Common Core State Standards for Math-
ematics. Washington, DC: National Governors Association Center for Best Practices and
the Council of Chief State School Officers.
Council of Chief State School Officers. (2012). Framework for English Language Proficiency Development Standards corresponding to the Common Core State Standards and the Next Generation Science Standards. Washington, DC: CCSSO.
Focus Points
• Science classroom communities of practice can be rich environments for both
science and oral language learning with English learners (EL students), as they
use language to do science in socially mediated activity.
• The NGSS present key instructional shifts by promoting: (a) a focus on explain-
ing phenomena in the natural world or designing solutions to problems in the
designed world; (b) three-dimensional learning by blending science and engi-
neering practices, crosscutting concepts, and disciplinary core ideas; and (c)
learning progressions of student understanding over the course of instruction.
• Socially oriented perspectives in second language acquisition offer key instruc-
tional shifts by promoting: (a) use of language for purposeful communica-
tion in the science classroom; (b) meaningful participation of all EL students,
regardless of their English proficiency levels, in rigorous science learning; and
(c) a conceptualization of talk in the science classroom that considers registers,
modalities, and interactions.
• Science instructional shifts promote language learning with EL students, while
language instructional shifts promote science learning with EL students. Rec-
ognizing these shifts as mutually supportive can lead to better and more coher-
ent instructional approaches that promote both science and language learning
for all students, especially EL students.
Chapter Purpose
The new wave of standards in recent years has raised the bar for learning with
the goal of preparing all students for college and career readiness. With regard
36 Okhee Lee et al.
the practices of the community and supported in developing the language as well
as other meaning-making resources needed to carry out those practices and express
their emerging ideas and understandings. In short, as students use language to do
science in socially mediated activity, they develop their science and language profi-
ciency in tandem.
relevant crosscutting concept (CCC) to frame the question (for example, a question
involving the crosscutting concept of cause and effect to explain a phenomenon)
and engage in a relevant science and engineering practice (SEP) (for example,
developing a model to explain the cause and effect), which results in an understanding
of one or more disciplinary core ideas (DCIs). This new understanding, in turn, generates a new
subquestion toward answering the driving question. Over the course of instruction,
students develop deeper and more sophisticated understanding of science to make
sense of the anchoring phenomenon for the unit of science instruction.
As students develop deeper and more sophisticated science understanding, their
language use becomes more precise (NRC, 2014; Quinn, Lee, & Valdés, 2012). Students
learn that the level of precision needed to engage in SEPs demands a comparable level of
precision in language use. This demand for precision goes beyond the meaning of
technical vocabulary to the logic of connecting cause and effect and the validity
of claims and evidence. As students develop deeper and more sophisticated science
understanding, their language use also becomes more explicit (NRC, 2014; Quinn
et al., 2012). Science often involves communicating about objects and events not
immediately present, and explicitness makes language use more effective with “dis-
tant” audiences. In the next section (language instructional shift 3), we expand upon
the importance of precise and explicit language use in the NGSS science classroom.
With the social turn in SLA, structuralist views of language learning have been
called into question, and a variety of more dynamic and socially situated concep-
tualizations have emerged. One well-known example is Ellis and Larsen-Freeman’s
(2009) view of language as a complex adaptive system, which recognizes the fun-
damentally social function of language use. This view proposes that the structural
elements of language are acquired not as a precursor to, but as a result of, engage-
ment in cooperative activity and social interaction. In another line of work, Lantolf
and colleagues (e.g., Lantolf & Poehner, 2014) have taken a more sociocultural
approach, emphasizing the mediating role of language in all human activities. This
work builds on Vygotskian sociocultural theory to examine how learners appropri-
ate and internalize ways of using language through participation in carefully scaf-
folded interactions. In a similar vein, research on language socialization has offered
valuable insights into how learners become competent members of their commu-
nities and the role of language in this process (see Duff & Talmy, 2011 for a review).
This body of research has drawn attention to how interactions with more proficient
community members facilitate not only language learning but also socialization
into the values, identities, norms, and practices of the community. Collectively, these
conceptualizations converge in their common belief that language is learned in use.
In other words, language learning occurs as a product of purposeful communica-
tion in the context of joint activity.
In classrooms that embrace socially situated conceptualizations, teachers gener-
ate opportunities for learners to do things with language in pursuit of a common goal.
The NGSS science classroom offers fertile ground for generating such opportuni-
ties. By anchoring learning in phenomena, teachers provide students with a reason
to communicate and a compelling context in which to express their ideas and
emerging understandings. As new understandings lead to new questions, students
are motivated to use language to plan and carry out investigations, analyze and
interpret data, and argue based on evidence to refine their explanation of the phe-
nomenon under study. In both small group and whole-class discussions, they build
on each other’s ideas and collaboratively co-construct meaning, with the goal of
advancing the shared knowledge and resources of the science classroom commu-
nity. Unlike traditional language classrooms, where language is primarily the object
of study, language use in these classrooms is best understood as a form of action
(Walqui & van Lier, 2010). The role of the teacher is no longer to “teach” language
but to “instigate” opportunities for purposeful communication and joint action that
are supportive of both science and language learning.
Lave and Wenger’s (1991) legitimate peripheral participation view of learning strongly rejects this practice, focusing instead on what learners can do, even with limited proficiency, when provided adequate and appropriate support. This
perspective recognizes EL students as capable of participating meaningfully in the
science classroom community with less than perfect English (Lee et al., 2013). By
adopting this inclusive stance, teachers facilitate their students’ access to opportu-
nities for apprenticeship and interaction that support both science and language
learning.
In the NGSS science classroom, EL students carry out sophisticated SEPs, such
as constructing explanations and arguing based on evidence, through their emerg-
ing English. Importantly, their contributions to the community are valued for their
meaning and substance rather than their linguistic accuracy. EL students are per-
fectly capable of engaging in the type of cognitively demanding instruction called
for by the NGSS despite needing varying degrees of support in order to demon-
strate what they know and can do. They also bring with them to the science class-
room a vast array of cultural and community resources that help them make sense
of the natural and designed world (González, Moll, & Amanti, 2005).
TABLE 3.1 Registers, Modalities, and Interactions Typical of the NGSS Science Classroom
Below, we provide an overview of the unit and then an analysis of the unit with regard to the science instructional shifts and language instructional shifts described in the previous section. The names of students are pseudonyms.
Overview of Unit
Implemented over the course of 2 weeks, the unit revolves around the Channeled
Scablands, a barren landscape in eastern Washington State formed by a series of
megafloods thousands of years ago. Though it is now widely accepted that the
landscape was eroded by mass flooding, this was not always thought to be the case.
In 1928, a rebel geologist by the name of J Harlen Bretz proposed the theory of
the megaflood but was met with harsh criticism and opposition. Bretz’s theory
sparked an intense debate among geologists that lasted nearly four decades and
became one of the most well-known controversies in the history of earth science.
Eventually, Bretz’s once outlandish theory was accepted by the scientific commu-
nity, and he was credited with solving the elusive mystery of the Scablands. Because
the Scablands are located in a neighboring state on the West Coast and Whitten’s
students are interested in learning more about their origins, she decides to make
this mystery the anchoring phenomenon of the unit.
Whitten launches the unit by showing the class a short video clip about the
Scablands and Bretz’s theory of how the landscape was formed. She stops the video
before the mystery is revealed and asks students what they observed and what
questions they have about the landscape. In small groups, students discuss their
observations, record their questions on sticky notes, and then place those sticky
notes on the board in the front of the room. As each group shares their questions
with the class, Whitten encourages students to think about how each new ques-
tion relates to the other questions. The class organizes the questions into different
categories on the board (known as the “Driving Question Board”) and soon comes
to a consensus that all of the questions are related to one larger question: How were
the Scablands formed? This question becomes the unit’s Driving Question. The class
agrees that their work as scientists will be to answer this question. At the end of the
unit, students will watch the rest of the video to find out whether they’ve reached
the same conclusion as Bretz.
Having identified the anchoring phenomenon and established the driving ques-
tion of the unit, Whitten asks the class how they might go about solving the mys-
tery of the Scablands. One student suggests testing Bretz’s theory that water played
a role in shaping the landscape, and the class agrees this is a good place to start. Over
the course of the two weeks, students engage in three investigations to answer the
Driving Question.
Students carry out the first investigation to determine what happens when rocks are exposed to water. They place rocks in a jar and shake it for 3 minutes. They record the size of the rocks before and after shaking and notice that some of the rocks have worn away. Then, they add water to the jar and shake for another 3 minutes. They observe that adding water makes the rocks wear away faster, which Whitten identifies as an example of erosion.
Using evidence from the investigation, students work collaboratively in small
groups to develop initial models of how the Scablands were formed. These models
Talk in the Science Classroom 45
represent students’ current thinking and will be revised and refined as new evidence
is collected and new understandings emerge over the course of the unit. Once
students have completed their models in groups, Whitten convenes what she calls a
“board meeting” where students share their group model, ask clarifying questions
of other groups, and argue based on evidence for their explanation of the phenom-
enon. Camila shares her group model with the class:
Camila: (pointing to the group model) Lot of water. Rocks get smaller.
Whitten: Interesting. So, Camila’s model shows how water wore down, or eroded,
the rocks in the Scablands. It passed over the rocks again and again and
again (gesturing with her hands) and made the rocks smaller. It eroded
the rocks. What does everybody else think? Did other groups also
include that in their models?
The discussion continues and, at the end of the board meeting, the class agrees
that water is powerful enough to shape the landscape, but there are still gaps in their
models. They will need to continue collecting evidence in order to figure out how
the Scablands were formed. Whitten guides students to consider what other factors
related to water might have affected the erosion of the landscape.
Over the next week, students engage in two follow-up investigations to examine
the erosive effects of water. Students build stream tables to test their ideas about
erosion. While the procedures and instructions for the first investigation of the unit
were mostly determined in advance, Whitten asks students to take a more active
role in planning and carrying out these follow-up investigations.
In the second investigation of the unit, students test whether the angle of the
stream table affects the rate of erosion as water is poured into the table. For example,
when the table is positioned at a 30-degree angle versus a 10-degree angle, will the
water create a canyon more quickly? To test this idea, students plan an investigation in which they identify variables to control (e.g., amount of sand, amount of water, pouring speed). Then, they test how the angle of the table affects the amount of time it takes for a canyon to form. For different trials of the investigation, they decide on increments for adjusting the angle of the table. All of these decisions
are made collaboratively within each group and under the guidance of Whitten.
While holding the table at various angles, students observe how long it takes for the canyon to form. They record their observations and plot their results on graphs.
The graphs show that as the angle of the table increases, the canyon is created more
quickly. Students conclude that elevating the angle of the table increases the speed
of the water traveling through the table and the rate of erosion.
Intrigued by these results, Whitten’s students wonder what other factors, in
addition to the angle of the table, might also affect the rate of erosion. Some stu-
dents noticed during the previous investigation that each group poured a different
amount of water into the stream table and that this might have affected the rate of
erosion. For the third investigation, the class agrees to test this possibility. As in the
previous investigation, students ensure a fair test by holding all variables constant
except the amount of water poured into the table. They record their observations
and plot their results on graphs. The graphs show that as the amount of water
increases, the canyon is created more quickly (i.e., the rate of erosion increases).
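The fair-test logic of these two follow-up investigations, varying a single factor while holding all others constant and then checking how the outcome changes, can be sketched in code. This is an illustrative sketch only: apart from the 10-degree (4 seconds) and 55-degree (1 second) times quoted in the class discussion, the trial data and control-variable names below are invented.

```python
# A minimal sketch of the students' "fair test": vary one factor (stream-table
# angle) while holding every other variable constant, then check whether
# canyon-formation time falls as the angle rises.
#
# NOTE: only the 10-degree (4 s) and 55-degree (1 s) times appear in the
# chapter; the 30-degree trial and the control values are invented.

trials = [
    (10, 4.0),  # (angle in degrees, seconds until a canyon forms)
    (30, 2.5),  # hypothetical intermediate trial
    (55, 1.0),
]

# Controlled variables stay fixed across every trial (the "fair test").
controls = {"sand_cups": 6, "water_liters": 2.0, "pour_speed": "steady"}

# Sort trials by angle and extract the formation times.
times = [seconds for _, seconds in sorted(trials)]

# The rate of erosion increases with angle exactly when formation time
# strictly decreases from one angle to the next.
rate_increases_with_angle = all(a > b for a, b in zip(times, times[1:]))
print(rate_increases_with_angle)  # True for the data above
```

The same check applies to the third investigation by swapping the varied factor (amount of water) for angle while the remaining variables move into the fixed controls.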
46 Okhee Lee et al.
Working in groups, students use data from the two follow-up investigations to
construct an argument with a claim, evidence, and reasoning about factors that
affect the rate of erosion. In the whole-class discussion that follows, Dylan argues
that increasing the angle of the table increases the rate of erosion. When asked to
support his claim with evidence, Dylan responds, “The water kept going faster like
when it was 10 degrees, it took 4 seconds, but when it was 55 degrees, it took 1
second.” The class agrees with Dylan’s claim and finds his evidence compelling. As
the discussion comes to a close, Whitten congratulates her students on uncovering
important science ideas and encourages them to think about how they can apply
their new understandings to solving the mystery of the Scablands.
In the final lesson of the unit, each group revises its initial model of the Scablands
and develops a final group model. As they work in small groups, students use evi-
dence from the investigations to arrive at the most complete and coherent explana-
tion of the phenomenon. The following discussion takes place among a group of
three students collaborating to develop their final model:
Brian: Those holes are from those rocks and then the water moved those rocks
to a different area and then made multiple. That’s what I think.
Alonso: What do you think, Phuong?
Phuong: I go with Brian’s idea.
Alonso: Experiment 1 which, uh, most represents the landforms.
Brian: I think it was number 3 [experiment on the amount of water] because
number 3 added more water, and it was like more efficient. Maybe there
was more water than less water. I think number 3 was more efficient
because there was like a little delta, and the whole thing almost collapsed!
Chen: We think that a flood caused the canyon and carried the boulders to
random places . . . and the stream table experiment supports our ideas
because when the water hit the land, it made a canyon and caused
erosion.
Whitten: Any questions for this group?
Jasmine: How did the bits of rocks make the ash?
Chen: The water chipped away bits of rock.
To close the unit, students finish watching the video clip from the beginning of the unit and are excited to learn that they, along with J Harlen Bretz, have solved the mystery of the Scablands.
Analysis of Unit
Foundational research demonstrates that all children come to school with rich knowledge of the natural and designed world and the ability to think and reason scientifically in both school settings and informal environments (NRC, 2007). Despite this foundational research, learning science according to the vision of the NGSS is new. Learning language while it is being used to learn science is even more so. Realizing this vision will
require innovative approaches to classroom teaching, curriculum design, assessment,
and teacher preparation and professional development. As the education system
embarks on this new vision, a new research agenda to promote rigorous science
learning and rich language use is needed with the exciting potential to meet the
needs of all students in our increasingly diverse classrooms.
Science and language instructional shifts also offer implications for classroom prac-
tice. The classroom video described in this chapter illustrates the synergistic relation-
ship between science learning and language learning with EL students. As science
and language instructional shifts are new to many teachers and may represent sig-
nificant departures from current classroom practice, case studies and vignettes play an
important role in offering concrete and accessible examples of how these shifts can
be enacted in the classroom and, in particular, how EL students at varying levels of
proficiency can participate in rigorous science learning through their emerging Eng-
lish. Importantly, teachers should not be expected to change their classroom practice
all at once or on their own. Implementing these shifts is an ongoing process that will
require support at multiple levels of the education system. In the case of Whitten, she
has developed NGSS-aligned science instruction through her participation in two
professional development projects over the past three years.
One commonly asked question about classroom practice is whether good
teaching with EL students is any different from good teaching with all students.
While teaching practices aimed at supporting EL students are likely to benefit
all students, the NGSS instructional shifts and language instructional shifts dis-
cussed in this chapter provide specific affordances with EL students where many
traditional approaches have failed. As EL students make sense of phenomena or
problems in local contexts, they leverage their cultural and linguistic resources.
As they use language to do science, their contributions are valued not for their
linguistic accuracy but for their ideas in the discourse. In this way, inclusive class-
rooms that respect diversity and value the cultural and linguistic resources of EL
students fundamentally change a deficit view to an asset view.
Findings and insights from research and classroom practice will influence edu-
cational policy, since the NGSS are principally a policy initiative. As the NGSS
require key instructional shifts for both science and language learning with EL stu-
dents, it will take time for classroom practice to take shape. In addition to providing
resources for research and classroom practice, educational policy should be patient
with the time it takes for implementation in the education system. Moreover, in
the EL student policy context, federal legislation requires that English language
proficiency (ELP) standards align to content standards. In other words, ELP stand-
ards must reflect the language needed to master college- and career-ready standards
such as the NGSS. This policy highlights increasing recognition of the close rela-
tionship between content and language learning.
Implementing the instructional shifts presented here requires understanding of
disciplinary knowledge and practices as well as theories and practices of second
language acquisition. As teachers are asked to assume a variety of roles and respon-
sibilities in their work, this may seem to be a tall order. What we have attempted
to illustrate in this chapter is that synergy across these shifts can help teachers enact
more conceptually sound and practically feasible instruction at the intersection of
content and language learning. As the rapidly growing EL student population is
expected to achieve rigorous college- and career-ready standards, this chapter high-
lights the need for more substantive collaboration between content areas, such as science, and the field of EL student education. By providing conceptual grounding
for how science and language instructional shifts are mutually supportive, we hope
to invite researchers, practitioners, and policy makers from both areas into produc-
tive dialogue with the shared goal of supporting all students, especially EL students,
in developing science and language proficiency.
Note
1 Expansion of K-6 NGSS Instructional Specialists Program (William Beck and Carol
Biskupic Knight, Principal Investigator and Co-Principal Investigator), Portland State
University Center for Science Education, funded through the Oregon Mathematics and
Science Partnership Program.
References
Avery, L. M. (2013). Rural science education: Valuing local knowledge. Theory Into Practice,
52(1), 28–35.
Biber, D., & Conrad, S. (2009). Register, genre, and style. Cambridge: Cambridge University
Press.
Duff, P. A., & Talmy, S. (2011). Language socialization approaches to second language acquisi-
tion: Social, cultural, and linguistic development in additional languages. In D. Atkinson
(Ed.), Alternative approaches to second language acquisition (pp. 95–116). New York: Routledge.
Ellis, N., & Larsen-Freeman, D. (Eds.) (2009). Language as a complex adaptive system. Oxford:
Wiley-Blackwell.
Gándara, P., & Hopkins, M. (2010). Forbidden language: English learners and restrictive language
policies. New York: Teachers College Press.
González, N., Moll, L. C., & Amanti, C. (2005). Funds of knowledge: Theorizing practices in
households, communities, and classrooms. Mahwah, NJ: Erlbaum.
Kopriva, R., Gabel, D., & Cameron, C. (2011). Designing dynamic and interactive assessments for English learners that directly measure targeted science constructs. Evanston, IL: Society for Research on Educational Effectiveness.
Krajcik, J. S. (2015). Three-dimensional instruction: Using a new type of teaching in the sci-
ence classroom. Science and Children, 53(3), 6–8.
Krajcik, J. S., Codere, S., Dahsah, C., Bayer, R., & Mun, K. (2014). Planning instruction to
meet the intent of the Next Generation Science Standards. Journal of Science Teacher Educa-
tion, 25, 157–175.
Krajcik, J. S., & Czerniak, C. (2013). Teaching science in elementary and middle school classrooms:
A project-based approach (4th ed.). London: Routledge.
Krajcik, J. S., McNeil, K. L., & Reiser, B. (2008). Learning-goals-driven design model: Devel-
oping curriculum materials that align with national standards and incorporate project-
based pedagogy. Science Education, 92(1), 1–32.
Lantolf, J. P., & Poehner, M. E. (2014). Sociocultural theory and the pedagogical imperative in L2
education: Vygotskian praxis and the research/practice divide. London: Routledge.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge:
Cambridge University Press.
Lee, O., & Miller, E. (2016). Engaging in phenomena from project-based learning in a place-
based context in science. In L. C. de Oliveira (Ed.), The Common Core state standards in
literacy in history/social studies, science, and technical subjects for English language learners: Grades
6–12 (pp. 59–73). Alexandria, VA: Teaching English to Speakers of Other Languages.
Lee, O., Quinn, H., & Valdés, G. (2013). Science and language for English language learners
in relation to Next Generation Science Standards and with implications for Common
Core State Standards for English language arts and mathematics. Educational Researcher,
42(4), 223–233.
National Center for Education Statistics. (2016). The condition of education 2016 (NCES
2016–2144). Washington, DC: U.S. Department of Education.
National Research Council. (1996). National science education standards. Washington, DC:
National Academy Press.
National Research Council. (2007). Taking science to school: Learning and teaching science in
grades K-8. Washington, DC: National Academies Press.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting
concepts, and core ideas. Washington, DC: National Academies Press.
National Research Council. (2014). Literacy for science: Exploring the intersection of the Next
Generation Science Standards and Common Core for ELA standards: A workshop summary.
Washington, DC: National Academies Press.
Next Generation Science Standards Lead States. (2013a). Next Generation Science Standards:
For states, by states. Washington, DC: National Academies Press.
Next Generation Science Standards Lead States. (2013b). Appendix D—“All standards, all
students”: Making Next Generation Science Standards accessible to all students.Washington, DC:
National Academies Press.
Quinn, H., Lee, O., & Valdés, G. (2012). Language demands and opportunities in relation to Next
Generation Science Standards for English language learners:What teachers need to know. Stanford,
CA: Stanford University, Understanding Language Initiative (ell.stanford.edu).
Reiser, B. J., Michaels, S., Moon, J., Bell, T., Dyer, E., Edwards, K. D., McGill, T. A. W., Novak,
M., & Park, A. (2017). Scaling up three-dimensional learning through teacher-led study
groups across a state. Journal of Teacher Education, 23, 280–298.
Smith, G. (2002). Place-based education: Learning to be where we are. Phi Delta Kappan, 83,
584–594.
U.S. Census Bureau. (2012). Statistical abstract of the United States, 2012. Washington, DC:
Government Printing Office. Retrieved from www.census.gov/compendia/statab/cats/education.html
Valdés, G. (2015). Latin@s and the intergenerational continuity of Spanish: The challenges of curricularizing language. International Multilingual Research Journal, 9(4), 253–273.
van Lier, L. (2004). The ecology and semiotics of language learning: A sociocultural perspective. Boston, MA: Kluwer Academic.
Walqui, A., & van Lier, L. (2010). Scaffolding the academic success of adolescent English language learners: A pedagogy of promise. San Francisco, CA: WestEd.
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge: Cam-
bridge University Press.
Zuengler, J., & Miller, E. (2006). Cognitive and sociocultural perspectives: Two parallel SLA
worlds? TESOL Quarterly, 40(1), 35–58.
PART II
Focus Points
• Many English learners (EL students) achieve at lower levels than most of their
peers in mathematics, and typically, they do not enroll in advanced secondary-
mathematics classes.
• A better understanding of how EL students and lower-achieving students
approach reading semiotics or sign systems in mathematics (i.e., language,
symbols and notation, visual representations) should lead to a better under-
standing of how these discipline-specific features may create “entry points”
or “exit points” for their problem solving, thereby helping low-achieving
students improve.
• The language of mathematics is complex and creates interrelated meanings; it
incorporates technical terms specific to mathematics, everyday language with
mathematical meaning, synonymous words and phrases, and complex strings of
words. Symbols and notation are used in mathematics to enable precise commu-
nication of mathematical meaning and content; many students struggle with
understanding mathematics symbols and notation.
• We report results from a small qualitative study in which we explore how
higher and lower proficient EL students and non-EL students read mathemat-
ics semiotics when thinking aloud while solving two problems to determine
what the students thought were helpful “entry points” or unhelpful “exit
points” during problem solving.
• We provide suggestions for best practices that promote and engage EL students
with opportunities to flexibly use semiotics and conceptual understanding to
solve mathematics problems.
56 Mary A. Avalos et al.
Chapter Purpose
EL students consistently score lower than their white, English-speaking peers on
mathematics assessments (U.S. Department of Education, 2015). Problem solving and
mathematics practices (e.g., perseverance, reasoning, and communication skills) are
key to mathematics achievement in school, as advocated by learning standards (e.g.,
National Council of Teachers of Mathematics [NCTM], 1991, 2001; Common Core
State Standards [CCSS], National Governors Association Center for Best Practices,
Council of Chief State School Officers, 2010). Additionally, mathematics achieve-
ment is necessary for advanced educational opportunities (Morgan, Farkas, & Wu,
2011; Murnane, Willett, Braatz, & Duhaldeborde, 2001). Students may do well with computation-type exercises yet struggle with word problems (Kintsch, 1987); for EL students this difficulty is even more evident (Abedi, 2011; Lager, 2006; Wiest, 2008).
There is little published research that investigates students’ use of semiotics or
meaning-making systems for mathematics (i.e., language, symbols, notation, and
visual representations) while problem solving, and how they may assist or hinder
reading and solving mathematics word problems (Avalos, Bengochea, & Secada,
2015; Schleppegrell, 2010). This chapter looks closely at the extent to which middle school EL students and English proficient (EP) students with varying mathematics proficiency used the problems’ semiotics to read, understand, and solve two mathematics problems. We begin by providing an overview of challenges for EL students’
mathematics problem solving based on research investigating how semiotics inhibit
or promote access to mathematical understanding. We then present the methods
and findings of a small study we conducted with middle school EL and EP students.
Practical implications to facilitate EL students’ reading of mathematical semiotics
to make meaning of and solve mathematics word problems conclude the chapter.
In one study, EL students’ performance on mathematics word problems was poorer when the problems were written with dense, complex sentences than on the same problems written with less complex language. Problems that were both linguistically and mathematically complex were the
most challenging, indicating that EL students face additional cognitive demands when
solving word problems in a second (nondominant) language. In another study, the
combined demands of both academic language and content knowledge in analyzed
word problems seemed to be even more challenging for EL students, whereas non-
EL students appeared to have more advanced consolidated meaning-making systems
to meet the interpretive demands of word problems with more and varied semiotics
than EL students (Solano-Flores, Barnett-Clarke, & Kachchaf, 2013). Previous work
has also identified syntactic, lexical, cultural, and test or text layout features of word problems as challenging for EL students (Martiniello, 2009; Schleppegrell, 2007; Secada, 1991).
Language
Importantly, according to semiotic approaches, language is more than a tool for
representation and communication; it is a tool for thinking and making meaning
(Schleppegrell, 2010). As shown in Table 4.1, mathematics uses language that is
commonly found in everyday use and contexts, along with specialized language,
such as technical terms and distinct grammatical patterns, to make meaning (see
Schleppegrell, 2004 for a detailed discussion).
While everyday and mathematical language are intertwined, school-mathematics
language may draw on everyday language to make meaning in ways that are dif-
ferent than what is typically found when completing outside-of-school tasks (Bar-
well, 2013; Halliday, 1978). This borrowing of everyday language for mathematical
purposes goes beyond the word level; it requires students to learn, to understand,
and to use what has been called the mathematics register (Halliday, 1978). A reg-
ister is the configuration of lexical (vocabulary) and grammatical resources that
are appropriate for particular language use within a particular discourse context
(Schleppegrell, 2004). The mathematics register allows for discursive practices, or
patterns of language use, to be established for multiple purposes, including profes-
sional and educational mathematics.
The acquisition of the mathematics register often takes place within the math-
ematics classroom, while students are engaged in mathematical learning; however,
simply experiencing mathematics instruction does not necessarily prompt the
acquisition of the mathematical register (Halliday, 1978; Huang & Normandia,
2008; Moschkovich, 2015). Sfard and Lavie (2005) demonstrate that learning the
Reading Mathematics Problems 59
mathematics register requires the use of language while engaging with concepts,
and it is realized by what they call objectification. Objectification takes place when
terms like number words are used as nouns (e.g., six is less than seven) rather than
as determiners (e.g., five cubes). EL students need to acquire concepts prior to
owning and using the mathematics register (Chapman, 2003; Sfard, 2000; Sigley &
Wilkinson, 2015). Uptegrove (2015) found that the uptake of the mathematics
register and abstraction come with increased conceptual understanding and experi-
ence/exposure; students in her longitudinal study increasingly used abstract, discur-
sive patterns as they progressed through the grades, and especially in high school.
Research investigating how the language of word problems might be modi-
fied to make problem solving more accessible to EL students has found mixed
results. Abedi and Lord (2001) modified national test items to explore how fewer
nominal (noun) phrases, more explicit conditional relationships, simpler questions
with active (rather than passive) voice, and more familiar vocabulary would assist in
comprehending and solving the problems. Most participating eighth-grade students
selected the modified problems as those they would do first. Moreover, the students
were more successful solving modified word problems than solving the originals.
EL students and students of lower socioeconomic status (SES) benefitted more
from the linguistic modifications than English speakers and students from middle
to higher SES backgrounds. In another study, three opportunity-to-learn variables
(i.e., students’ report of content coverage in class, their teacher’s content knowledge,
and students’ previous mathematics ability) were compared to see which better
predicted EL student mathematics achievement (Abedi, Courtney, Leon, Kao, &
Azzam, 2006). Although all three were significant predictors of students’ math-
ematics performance, previous mathematics ability and teacher content knowledge
showed greater effects on EL students’ achievement than modified word problems.
The syntax of mathematics word problems presents several challenges for EL students, including the use of comparatives (such as “x times as much” or “is greater than”) and literal, left-to-right translation of a problem’s text (so that “There are 3 times as many boys (b) as girls (g) in a class” is incorrectly translated as 3b = g, or 15/4 is translated by Spanish speakers as “Dividir 15 entre 4” or “Divide 15 into 4”) (Celedón-Pattichis, 1999). Castellanos (1980) provides other
examples of syntax challenges for EL students, such as when the mathematics symbols of a given word problem do not have a one-to-one correspondence with the words they represent (i.e., 15/4 is read as “4 goes into 15”) or in instances involving substitution (such as when “Substitute 3 for y,” meaning to use 3 anytime y is used, is literally translated into Spanish as “Sustituya 3 por y,” which instructs to use y anytime 3 is used, the complete opposite of the English instructions). Even minor syntactic variations on a semantic structure can pose cognitive challenges for EL students, such as
understanding and distinguishing the question being asked in “How many were there
altogether?” “How many were left?” “How many fewer were left?” or “How many
more were there before?” (Celedón-Pattichis, 1999). Finally, the logical connectives used to link the propositions in reasoning arguments, such as “either . . . or,” “if . . . then,” and “only if,” can also create difficulties for EL students (Celedón-Pattichis, 1999).
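The left-to-right reversal error in the “boys and girls” example can be made explicit with a short worked comparison (a clarifying sketch, using the chapter’s own variable names):

```latex
\text{Let } b = \text{number of boys and } g = \text{number of girls.}
\\[4pt]
\text{``3 times as many boys as girls''} \;\Rightarrow\; b = 3g
\\[4pt]
\text{Literal left-to-right translation} \;\Rightarrow\; 3b = g
\\[4pt]
\text{Check with } g = 5:\quad b = 3g \text{ gives } b = 15,
\quad\text{while } 3b = g \text{ gives } b = \tfrac{5}{3}.
```

Only b = 3g preserves the intended meaning that boys outnumber girls; the literal translation asserts the opposite relationship.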
Another possible challenge for EL students’ word problem solving is related to
how the problems are contextualized. Word problem contexts are created to help
students link mathematics to the real world; however, these adult-contrived contexts
60 Mary A. Avalos et al.
can create confusion for comprehension and limit accessibility to successful problem
solving for students who struggle with mathematics (Jackson, Garrison, Wilson,
Gibbons, & Shahan, 2013; Staub & Reusser, 1995). Thus, as an example, contexts
that situate problems in determining the most cost-effective rate for cellular phones
are likely to be foreign for students who lack the experience of and/or responsibili-
ties for making such decisions, even though the use of cellular phones lies within
their realm of experience (and expertise). Additionally, EL students may be unfa-
miliar with cultural contexts commonly used and understood by their native Eng-
lish-speaking peers, such as “spelling bee” competitions (Martiniello, 2008, p. 28).
Visual Representations
Students use visual representations (VRs) as tools that assist their understanding
and application of mathematical knowledge (Swanson, 2010). With so many types of
mathematical VRs (Table 4.3), EL students may require explicit instruction in order
to help them use and better understand how VRs represent mathematics content
(Avalos, Bengochea, & Secada, 2015). EL students benefit from seeing and hearing
teachers and peers model the language necessary to name and precisely describe the
numbers, shapes, and symbols, and to reason through the properties and explanations of
solutions (Battey, Llamas-Flores, Burke, Guerra, Kang, & Kim, 2013; Moschkovich,
1999). In turn, VRs familiar to EL students from prior instruction in their first lan-
guage may serve as the basis upon which EL students contribute to mathematical
ideas and problem solving while they acquire a second language (Moschkovich,
2007). EL students have taken greater academic risks when interacting with new
content in class and have used drawings outside of class as a result of lessons that
incorporated VRs such as geometric figures, number lines, clocks, and place-value
charts (Young & Marroquin, 2008). Teachers who use VRs consistently reach more
of their diverse learners where they are, use experiences they bring into the class-
room, use real-life objects that help validate, recall, and transfer mathematics topics
being taught, and are able to more seamlessly infuse higher-level thinking into les-
sons (Young & Marroquin, 2008).
Several studies have found that encouraging students to create independent
drawings and represent aspects of word problems they view to be key, and explicitly
teaching students about specific diagram types, resulted in higher student achieve-
ment on problem solving (Lewis, 1989; Van Essen & Hamaker, 1990; Wolters, 1983;
Zweng, Geraghty, & Turner, 1979). Other studies have looked at the types of rep-
resentations that students create on assessments. Schematic drawings reflect a rela-
tively accurate appearance, structure, or workings of an object or a space by using
proportional spacing between the components represented in the diagram. Non-
schematic drawings, on the other hand, may contain aspects that are unrelated to
TABLE 4.3 Description of Visual Images
[Table 4.3 reproduces example visual images, including graphs, drawn horizontally or
vertically, of data for world regions (Africa, Asia & Oceania, Central & South America,
Eurasia, Europe, North America) across the years 1980, 2000, and 2014; the graphics
are not reproduced here.]
Reading Mathematics Problems 63
or visually dissimilar from that which was originally represented. Edens and Potter
(2007) have found schematic diagrams to be more effective in assisting students’ rep-
resentational understanding than nonschematic drawings. Martiniello (2009) found
that schematic VRs mitigate the challenges of linguistic complexity for mathematics
word problems by helping EL students make meaning of the texts that they read.
Summary
The semiotics specific to mathematics comprise highly technical and semitechnical
terms, dense noun phrases, mathematics-specific and precise conjunctions, and complex
subordinated logical relationships (Schleppegrell, 2007). The complex, interconnected, and
constantly evolving nature of mathematics symbols, technical and natural language, and
visual, nonlinguistic representations (O’Halloran, 2005) gives rise to the difficulties that
most students have in learning to communicate their mathematics understanding in
both classroom and assessment settings. In order to consistently and successfully evidence
higher-level secondary-mathematics understandings and practices, students need to
become adept at using the mathematics register (Schleppegrell, 2007; Wilkinson, 2015)
and semiotics, such as symbols, notation, and VRs (O’Halloran, 2005, 2011; Schleppe-
grell, 2010). Some mastery of the mathematics register and semiotics seems to be nec-
essary for students to efficiently read and solve mathematics problems (Barbu & Beal,
2010; Solano-Flores, Barnett-Clarke, & Kachchaf, 2013; Uptegrove, 2015).
Our Study
This small, secondary study was part of a larger research and development effort to
integrate language teaching within mathematics education for EL students (Secada &
Avalos, 2010). The Language in Math project included a comprehensive professional
development (Avalos, Zisselsberger, Langer-Osuna, & Secada, 2015) and classroom
intervention (Avalos, Medina, & Secada, 2015) with the overarching goal of increas-
ing teacher and EL student awareness of mathematical language and literacy. In this
secondary study, we examined middle school students’ audio-recorded think-aloud
interviews to explore how three semiotic features found in mathematics problems
facilitated or hindered higher- and lower-achieving EL and FEP students’ problem
solving. We built on Sigley and Wilkinson’s (2015) work to explore how middle school
students’ “critical events,” or verbalized mathematical ideas and/or shifts in thinking,
indicate a resolution of the problem (p. 80). We use the term “critical reading events” to
highlight students’ think-aloud statements that were related to reading the semiotics
found in the problems. Statements that were mathematically, conceptually, or factually
correct, reflected the information given in the original problem, and/or contributed
to the student’s progress toward a correct answer were deemed to be helpful
problem-solving “entry points.” Statements that were mathematically, conceptu-
ally, or factually incorrect, that did not reflect the information given in the original
problem, and/or did not contribute to the student’s progress toward a correct answer
were deemed to be not-helpful problem-solving “exit points,” as they did not
successfully advance students’ problem solving. As such, we address the research ques-
tion, What semiotic systems (e.g., language, symbols, visual representations) facilitate EL and EP
students’ problem-solving capacity when reading and solving mathematics word problems?
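The entry/exit coding rule described above can be sketched as a small classifier. This is only an illustrative sketch: the class, field, and function names below are hypothetical, not the authors' actual coding instrument, and the "and/or" wording in the published criteria is rendered here as an any-of test:

```python
from dataclasses import dataclass

@dataclass
class CRE:
    """One critical reading event from a think-aloud transcript (hypothetical fields)."""
    correct: bool            # mathematically, conceptually, or factually correct
    reflects_problem: bool   # reflects the information given in the original problem
    advances_solution: bool  # contributes to progress toward a correct answer

def code_cre(event: CRE) -> str:
    """Code an event as an 'entry point' if any helpful criterion holds,
    otherwise as an 'exit point'."""
    if event.correct or event.reflects_problem or event.advances_solution:
        return "entry point"
    return "exit point"

print(code_cre(CRE(True, True, True)))     # entry point
print(code_cre(CRE(False, False, False)))  # exit point
```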
Participants
Twenty seventh-grade (n = 9) and eighth-grade (n = 11) students participated in this study.
Of the 20 students, five were seventh-grade EL students (three Spanish, one Spanish/
Russian, and one Urdu home language backgrounds) with intermediate English pro-
ficiency. The remaining 15 students were considered English proficient (EP). Fourteen
of the 15 EP students were from homes where a language other than English was spo-
ken (Spanish, Creole, Urdu, and Hindi). An equal number of boys and girls participated
in this study overall. All 20 participating students completed standardized assessments
that were used as diagnostic indicators, including the Test of Mathematical Abilities-
2nd Edition (TOMA-2; Brown, Cronin, & McEntire, 1994). For this study, we used
TOMA-2 Math Quotient scores to select and group the participating students by
their mathematics-proficiency levels into two groups, as determined by the TOMA-2’s
mean score of 100: the Higher Proficient/Average group (n = 10; High Mathematics
Proficiency) scored between 90 and 120; and the Below Average/Poor group (n = 10;
Low Mathematics Proficiency) scored between 70 and 89. All five of the EL students
were in the Low Mathematics Proficiency group.
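The grouping rule can be restated as a short function. This is only a sketch of the cut points reported above, not the authors' code; scores outside 70–120 did not occur in this sample:

```python
def proficiency_group(math_quotient: int) -> str:
    """Group a student by TOMA-2 Math Quotient (test mean = 100),
    using the cut points reported in the study."""
    if 90 <= math_quotient <= 120:
        return "High Mathematics Proficiency"
    if 70 <= math_quotient <= 89:
        return "Low Mathematics Proficiency"
    raise ValueError("score outside the ranges observed in this sample")

print(proficiency_group(105))  # High Mathematics Proficiency
print(proficiency_group(85))   # Low Mathematics Proficiency
```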
Think-Aloud Interviews
Students solved four grade-appropriate multiple-choice mathematics word-and-
computation problems during an individual, audio-recorded cognitive interview.
We report on students’ thinking for two of the four problems in this chapter. Stu-
dents were given pencils and interview packets comprised of a practice problem
and four problems, with ample workspace to be used as necessary. After view-
ing a brief video of how to “think aloud,” students were provided with a sample
problem and given time to practice thinking aloud. Once the practice problem
was completed and the students’ questions were answered, they proceeded to the
four problems, reading each problem aloud before solving it and thinking aloud
[FIGURE 4.1 Problem 1: a coordinate graph showing a ramp, with elevation in feet
(0–10) on the y-axis and distance in feet (0–10) on the x-axis.]
The length of the base of the ramp is 10 feet and the height at the end
of the ramp is 2.5 feet. Which of the following is the slope of the ramp?
A. 1/8  B. 1/4  C. 4  D. 8
FIGURE 4.2 Problem 2: Mrs. Rafferty Put Four Expressions on the Board and Asked
Her Students to Simplify Them
This sample FCAT/FCAT 2.0 question appears by permission of the Florida Department of Education,
Office of Assessment, Tallahassee, Florida 32399–0400.
while solving it. If a student stopped verbalizing his/her thinking while problem
solving, interviewers prompted, “What are you thinking?” after about five seconds
of silence.
The two problems were standardized test questions released verbatim by the Flor-
ida Department of Education as sample items from its annual high-stakes assessment
at the time of this study. “Problem #1” pertained to Ratios and Proportions1 (Fig-
ure 4.1). “Problem #2” is linked to Number and Operations and to Expressions and
Equations2 (Figure 4.2). All the participating students read (decoded) the problems
with fluency and accuracy. We selected these two problems for this chapter because
they include semiotic resources commonly found in mathematics word problems
that, according to previous work discussed earlier, could be confusing for EL students.
Data Analysis
We analyzed students’ audio-recorded transcripts and extracted the “critical reading
events” (CREs) related to the problems’ text (i.e., language and numbers including
alpha-numeric aspects of the problem and multiple-choice answer options); graph (a
Cartesian coordinate system graph), and symbols (subtraction, radical or square root,
exponent or power, and absolute value bars symbols) to explore how these were
used (or not used) during students’ problem-solving processes. To determine the
CREs, transcripts were analyzed starting from the interviewer’s prompt for students
to begin (or continue on to) each problem. A student’s new statement (often with
a subject and predicate) typically identified the beginning of a CRE (and the end
of the previous CRE). Once the CREs were extracted, we analyzed the data using
grounded theory and constant comparison methods (Charmaz, 1995; Strauss &
Corbin, 1990) to iteratively examine students’ CREs linked to the mathematics
semiotics explained previously. We then looked for common patterns within stu-
dent mathematics- and language proficiency-groups.
Results
Eight of the ten High Mathematics Proficient students solved both problems correctly,
while three of the ten Low Mathematics Proficient students correctly solved both prob-
lems. Three of five EL students and 12 of 15 EP students answered Problem #1 correctly;
only one of five EL students and 11 of 15 EP students answered Problem #2 correctly.
The majority of the higher proficient students solved the interview problems
mentally; they used paper/pencil to illustrate their justification and explanations of
their problem solving. In contrast, most of the lower proficient EP and EL students
used paper/pencil to work out solutions. Interestingly, the EL students who were
unable to solve Problem #2 did not use the paper/pencil at all. This, in addition to
evidence from their think-aloud transcripts, indicates that the EL students had little
interaction with the text while solving the problem.
Students in the higher proficient group responded to language and semiot-
ics in ways similar to their lower mathematics-proficiency peers; however, unlike
their peers they were able to anticipate elements (formulae, data points, technical
terms, prerequisite information, mathematical confidence, and perseverance) neces-
sary to predict, validate, and synthesize mathematics concepts, patterns, inferences,
and solutions. In other words, their “entry points” included and then extended beyond
those of the lower proficiency group. For example, students in the Below Aver-
age/Poor group would read the word “slope” or would see an exponent, and they
would ask themselves if they knew the term, or if they could recall the formula or
the procedure, or if they could recite the phrase their teacher told them as a mne-
monic tool (“rise over run”).
For example, occasionally after long pauses of silence or explicitly rereading the
question, students stated:
“Well, like the slope, so you have to put things, the coordinates.”
“I forgot how to do slope.”
“Negative three times—negative three . . . you have to multiply three times
negative three.”
Students in the higher proficient group read the word “slope” and introduced
synonyms such as the term “steepness;” and/or they described the expected sign of
the slope based on its shape; and/or they narrated the slope as both a rate of change
and a predictor for where the line in question will intersect with more convenient
coordinates on the Cartesian graph (such as whole numbers); and/or they would
recall the mnemonic “rise over run,” the slope formula, and make connections
between these and the patterns seen on the graph. For example:
“So since the elevation is the y, since the line is going diagonally to the left to the
right, it means the slope would be positive. That means at the end of the graph, 10
feet equals 2 and a half feet so Y with a change of X. So for 1 foot the elevation
doesn’t equal to 1 foot so it would have to be less than 1, so C and D would not
be the answer.”
“. . . well, I learned that the slope, as a fraction, is rise over run. So rise will
be—in the picture it shows—in the graph it shows that the rise is 2.5, because
height is the same thing as rise. So 20, I mean 2.5 over 10 is the base, so that’s
the same thing as run. So 2.5 over 10 will be the slope, but now I have to find
it in simplest form, because I see there’s no answers equivalent to mine, so I have
to simplify it . . . .”
Students in the lower proficient group would read discrete, contextualized val-
ues from a word problem (such as a base of 10 feet and an elevation of 2.5 feet)
and try to use those numbers and only those numbers in the slope formula, if they
could successfully recall that formula. Higher proficient students would take this
same information and foreground these data in the context of a graph, thereby
making these discrete values part of a continuous line with many useful values.
Comments by students in the lower proficiency group evidence a focus on num-
bers given or excerpted from the graph’s coordinates:
“Rise is 2.5 feet, and run’s 10 feet, so automatically, you would have to eliminate
A and D since that’s completely out of question. So I would most likely go with B
because of the rise over run concept.”
“Well, like the slope, so you have to put things, the coordinates. 4 and 1. And 8, 2.
Let’s see. B. I guess. I think it’s one-fourth ’cause it’s increasing by one-fourth.”
“I can see that, from the graph, that within—for every 1 foot of elevation in feet, it
moves a distance of 4 feet. So I know that it would be 1 over 4, so that would be one
move up. And four moves to the right. And to check myself, I’m going to move one
move up again, and four units to the right, again, which gets me to the next point on
the graph, which is 2, 8. So the answer for this would be 1/4th.”
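The rise-over-run reasoning these students narrate reduces to a one-line computation, sketched here with exact fractions to show the simplification step the second student describes:

```python
from fractions import Fraction

# Slope of the ramp: rise over run.
rise = Fraction(5, 2)  # height at the end of the ramp, 2.5 feet
run = Fraction(10)     # length of the base, 10 feet

slope = rise / run     # 2.5/10 in simplest form
print(slope)           # 1/4 -- answer choice B
```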
Lower proficient students would read multiple-choice options and try to fit
their rote calculations into at least one choice that fit their intuition. Higher profi-
cient students would assign properties (fraction vs. whole number, positive vs. nega-
tive, etc.) to the multiple-choice options, and then they would use those properties
to reason an option for further consideration or eliminate an option because it
failed to meet preestablished criteria for the correct type of answer. For example, a
student of lower proficiency stated:
“Ten feet equals the length of the base of the ramp; 2.5 feet is the height. We need to
find out which slope—which is the slope of the ramp. I don’t know . . . . It’s 4 feet.
Is it B?”
On the other hand, a higher proficiency student would reason through using gen-
eral properties of the specific arithmetic results:
“I know the answer wouldn’t be 6 or 3 because they’re both positive and I have two
numbers, negative 27 and another negative number, so I would cancel out expression
two and three so I can remember expression two and three on my answers so I would
put an X next to them. I would now estimate 4 minus 5 would be 1 so it would be
negative 1 point something. I know that negative 27 is way lower than a negative 1
point something, so the answer would be D.”
“I’m gonna subtract . . . . Let me write it, again . . . . Because I thought the 5 was big-
ger, but I didn’t see that the 4.6507 was greater than 5.196 . . . . Because there’s more
number values. And I don’t get Expression 2, so I’ll move onto Expression 3 . . . it
says negative 2 minus 1, so it’s got negative 1, but I forgot what the bars mean . . . . I
think the answer is Expression 3.”
Study Conclusions
Although all 20 students used similar problem-solving strategies when initially
reading the two problems, higher proficient students were able to capitalize on
their strong understanding of the problems’ mathematics concepts and to use the
mathematics register to advance their CREs to a level of abstraction and efficiency
not demonstrated by students in the lower proficient group. Students in the lower
proficiency group attempted to solve problems in similar ways as the higher pro-
ficient students, but their lack of mathematical understanding and basic number
sense often served as the exit point that hindered their problem solving during the
interviews. This was demonstrated by the lower proficient EL students and non-
EL students in our sample.
Best Practices
Instruction to promote EL students’ problem solving has called for explicit teach-
ing of mathematics semiotics (i.e., language, symbols and notation, visual representations)
while constructing multiple meanings through social interactions (Avalos et al.,
2015; Battey et al., 2013; Morgan, 2004, 2006; Solano-Flores et al., 2013). With an
established and appropriate learning environment, classroom instruction utilizes
discourse and social interaction to focus on semiotics so as to allow students to
construct understanding and to learn content; therefore, learners play dual roles
as active participants and engaged observers communicating mathematical under-
standing, ultimately learning to explore and manipulate mathematics semiotics in
novel or different ways (Seeger, 2004).
To promote and foster EL students’ participation and flexible use of semiotics
during mathematics lessons, instruction should focus on conceptual understand-
ing and exploration of semiotics with discussion that builds on sense making
and shared meaning. Teachers can realize this goal for EL students by making
the content relevant to life outside of school (language, culture, and community)
and building on the language resources EL students bring with them to school
(Moschkovich, 2011). Teachers can also shift their discourse to support EL stu-
dents’ use of the math register by incorporating talk moves that revoice (Chapin,
O’Connor, & Anderson, 2009), clarify (Moschkovich, 1999), or probe (Herbel-
Eisenmann, Steel, & Cirillo, 2013) students’ understandings and pose questions
that promote thinking and reasoning during problem solving (Avalos & Jones,
2017; Herbel-Eisenmann et al., 2013; Moschkovich, 1999). It is also benefi-
cial and important for teachers to understand what EL students know about
only (Young & Marroquin, 2008). We would like to point out that the use of VRs
during math instruction in a participatory mathematics classroom may or may not
have an explicit role when students are assessed. This is because the social supports
and the peer discussions that are present in classrooms are absent during assessment
settings. One of the few constants EL students can bring to an assessment from
their instructional environment are some of the VR tools that they have learned
in class; therefore, incorporating VR instruction in meaningful ways (as suggested
earlier) and effectively scaffolding to minimal teacher/peer support is important for
preparing EL students to solve mathematics problems independently.
Acknowledgments
This work was made possible due to a grant by The Institute of Educational Sci-
ences, Award Number R305A100862. The content of this chapter does not nec-
essarily reflect the views or policies of IES or the U.S. Department of Education,
nor does mention of trade names, commercial products, or organizations imply
endorsement by the U.S. Government. We would like to acknowledge the collabo-
rative effort that made this work possible, including the teachers, students, and dis-
trict administrators who contributed invaluably to this project, as well as Margarita
Zisselsberger, Alain Bengochea, Naomi Iuhasz, Kristen Doorn, and Robin Shane.
Notes
1 C-PALMS State Standard MA.7.A.1.4—“Graph proportional relationships and identify
the unit rate as the slope of the related linear function” (Florida State University, 2007).
2 C-PALMS State Standards: MAFS.6.NS.3.7 (“Understand the absolute value of a rational
number as its distance from 0 on the number line; interpret absolute value as magnitude
for a positive or negative quantity in a real-world situation”), MAFS.6.EE.1.1 (“Write and
evaluate numerical expressions involving whole-number exponents”), MAFS.6.NS.2.3
(“Fluently add, subtract, multiply, and divide multidigit decimals using the standard algo-
rithm for each operation”), and MAFS.8.EE.1.2 (“. . . Evaluate square roots of small perfect
squares and cube roots of small perfect cubes . . .”) (Florida State University, 2007).
References
Abedi, J. (2004). The No Child Left Behind Act and English language learners: Assessment and
accountability issues. Educational Researcher, 33, 4–14.
Abedi, J. (2011). Assessing English language learners: Critical issues. In M. R. Basterra, E.
Trumbull, & G. Solano-Flores (Eds.), Cultural validity in assessment: Addressing linguistic and
cultural diversity (pp. 49–71). New York: Routledge.
Abedi, J., Courtney, M., Leon, S., Kao, J., & Azzam, T. (2006, November). English language
learners and math achievement: A study of opportunity to learn and language accommodation.
Technical Report 702, National Center for Research on Evaluation, Standards, and Stu-
dent Testing (CRESST), Center for the Study of Evaluation, Graduate School of Educa-
tion and Information Studies, Los Angeles, CA: University of California.
Abedi, J., & Lord, C. (2001). The language factor in mathematics. Applied Measurement in
Education, 14, 219–234.
Arcavi, A. (1994). Symbol sense: Informal sense-making in formal mathematics. For the
Learning of Mathematics, 14(3), 24–35.
Arcavi, A. (2003). The role of visual representations in the learning of mathematics. Educa-
tional Studies in Mathematics, 52(3), 215–241.
Arcavi, A. (2005). Developing and using symbol sense in mathematics. For the Learning of
Mathematics, 25(2), 42–47.
Avalos, M. A. (2017, March). Communication in whole class mathematics teaching: Shifting discourse
during problem-solving discussions. Paper presented at the Annual Meeting of the American
Association of Applied Linguistics, Portland, OR.
Avalos, M. A., Bengochea, A., & Secada, W. G. (2015). Reading mathematics: More than
words and clauses, more than numbers and symbols on a page. In K. Santi & D. Reed
(Eds.), Improving comprehension for middle and high school students (pp. 49–74). Cham, Swit-
zerland: Springer International Publishing.
Avalos, M. A., & Jones, L. D. (2017, March). Facilitating diverse students’ discourse during math-
ematics discussions:What do teacher questions have to do with it? Paper presented at the Annual
Meeting of the American Association of Applied Linguistics, Portland, OR.
Avalos, M. A., Medina, E., & Secada, W. G. (2015). Planning for instruction: Increas-
ing multilingual learners’ access to algebraic word problems and visual graphics. In
A. Bright, H. Hansen-Thomas, & L. C. de Oliveira (Eds.), The Common Core state
standards in math for English language learners: High school (pp. 5–28). Alexandria, VA:
TESOL.
Avalos, M. A., Zisselsberger, M., Langer-Osuna, J., & Secada, W. G. (2015). Building teacher
knowledge of academic literacy and language acquisition: A framework for cross-
disciplinary professional development. In D. Molle, T. Boals, E. Sato, & C. A. Hedgspeth
(Eds.), Sociocultural context of academic literacy development for adolescent English language learn-
ers (pp. 255–276). New York: Taylor & Francis/Routledge Publishers.
Barbu, O., & Beal, C. R. (2010). Effects of linguistic complexity and math difficulty on word
problem solving by English learners. International Journal of Education, 2(2), 1–19.
Bartolini Bussi, M. G., & Mariotti, M. A. (2008). Semiotic mediation in the mathematics
classroom: Artifacts and signs after a Vygotskian perspective. In L. English, M. Bartolini
Bussi, G. Jones, R. Lesh, & D. Tirosh (Eds.), Handbook of international research in mathematics
education (2nd ed., pp. 746–805). Mahwah: Lawrence Erlbaum.
Barwell, R. (2009). Mathematical word problems and bilingual learners in England. In R.
Barwell (Ed.), Multilingualism in math classrooms: Global perspectives (pp. 63–77). Bilingual
Education and Bilingualism, No. 73. Bristol: Multilingual Matters.
Barwell, R. (2013). The academic and the everyday in mathematicians’ talk: The case of the
hyper-bagel. Language and Education, 27(3), 207–222.
Battey, D., Llamas-Flores, S., Burke, M., Guerra, P., Kang, H. J., & Kim, S. H. (2013). ELL
policy and math professional development colliding: Placing teacher experimentation
within a sociopolitical context. Teachers College Record, 115(6), 1–44.
Brown, V. L., Cronin, M. E., & McEntire, E. (1994). Test of mathematical abilities (2nd ed.).
Austin, TX: Pro-Ed.
Carpenter, T. P., Fennema, E., & Franke, M. L. (1996). Cognitively guided instruction:
A knowledge base for reform in primary mathematics instruction. The Elementary School
Journal, 97(1), 3–20.
Castellanos, G. G. (1980). Mathematics and the Spanish-speaking student. Arithmetic Teacher,
28(3), 16.
Celedón-Pattichis, S. (1999). Constructing meaning:Think-aloud protocols of ELLs on English and
Spanish word problems. Paper presented at the annual Meeting of the American Educa-
tional Research Association, Montreal, Canada.
Chapin, S., O’Connor, C., & Anderson, N. (2009). Classroom discussions: Using math talk to help
students learn, grades K-6 (2nd ed.). Sausalito, CA: Math Solutions Publications.
Chapman, A. P. (2003). A social semiotic of language and learning in school mathematics. In
M. Anderson, A. Sáenz-Ludlow, S. Zellweger, & V. V. Cifarelli (Eds.), Educational perspectives
on math as semiosis: From thinking to interpreting to knowing (pp. 129–148). Brooklyn, NY
and Ottawa, ON: Legas.
Charmaz, K. (1995). Grounded theory. In J. A. Smith, R. Harre, & L. Van Langenhove (Eds.),
Rethinking methods in psychology (pp. 27–49). London: Sage.
Cipriano, J. (2011, August). 32 tips for ELLs. Scholastic Instructor, 121(1), 36–38.
Clarke, S. N. (2015). The right to speak. In L. B. Resnick, C. Asterhan, & S. Clarke (Eds.),
Socializing intelligence through academic talk and dialogue (pp. 167–180). Washington, DC:
American Educational Research Association.
Kress, G., & Van Leeuwen, T. (2001). Multimodal discourse: The modes and media of contemporary
communication. London: Arnold.
Kuhn, D., & Zillmer, N. (2015). Developing norms of discourse. In L. B. Resnick, C. Aster-
han, & S. Clarke (Eds.), Socializing intelligence through academic talk and dialogue (pp. 77–86).
Washington, DC: American Educational Research Association.
Lager, C. A. (2006). Types of mathematics-language reading interactions that unnecessarily
hinder algebra learning and assessment. Reading Psychology, 27(2–3), 165–204.
Lampert, M., & Cobb, P. (2003). Communication and language. In J. Kilpatrick, W.
G. Martin, & D. Schifter (Eds.), A research companion to principles and standards for
school mathematics (pp. 237–249). Reston, VA: The National Council of Teachers of
Mathematics.
Lemke, J. L. (1995). Textual politics. London: Taylor & Francis.
Lemke, J. L. (1998). Multiplying meaning: Visual and verbal semiotics in scientific text. In
J. R. Martin & R.Veel (Eds.), Reading science: Critical and functional perspectives on discourses
of science (pp. 87–113). New York: Routledge.
Lemke, J. L. (2003). Math in the middle: Measure, picture, gesture, sign and word. In M.
Anderson, A. Sáenz-Ludlow, S. Zellweger, & V. V. Cifarelli (Eds.), Educational perspectives
5
READING AND UNDERSTANDING
SCIENCE TEXTS
Gina N. Cervetti and P. David Pearson
Focus Points
• Understanding what we read involves the simultaneous extraction and con-
struction of meaning in response to written text.
• The traditional texts and textbooks of classroom science learning are replete
with challenging linguistic features, which may present problems for all learn-
ers but especially for English learners (ELs).
• There are many well-documented instructional routines and scaffolding tools
that teachers can use to promote greater understanding of science text on the
way to deeper learning of science content and inquiry practices.
• Teaching EL students to analyze and unpack science texts may support their
comprehension of science content.
• Modifying science texts to include more transparent links among ideas and
more elaborations may support comprehension for EL students and all readers,
but modifying texts presents a risk of simplification at the cost of cohesion.
• Alternative texts may provide opportunities for students to engage in more
authentic approximations of scientific reading, reasoning, and practice and may
be more accessible for EL students and all readers.
Chapter Purpose
The purpose of this chapter is to provide a valid and useful account of reading in
science, one that will equip teachers, especially teachers of EL students, with the
knowledge and conviction to champion the inclusion of reading (and writing and
discourse) as important allies in helping students learn science. In carrying out our
persuasive purpose, we begin by traversing several facets of theory and research:
(a) the nature of reading, both in general and in science, (b) the nature and role of
text in science reading, and (c) the efficacy of curriculum and pedagogy designed
to improve reading in science as a means of improving science learning, inquiry,
and achievement.
The RAND panel added that comprehension always occurs in a surrounding socio-
cultural context “that shapes and is shaped by the reader and that interacts with each
of the three elements” (p. 11).
Reader and text factors are not new. They have been a part of reading theory since
humans first set eyes on print (or stone or picture) to make meaning of symbols.
Along with context, these three have been a staple of theories of reading
comprehension since the cognitive revolution in the early 1970s (Pearson & Cer-
vetti, 2015). Reader variables include cognitive capacity, knowledge, language, and
motivational dispositions. Text variables entail genre, structure, media, sentence and
word complexity, content, and engagingness.
The activity factor, while implicit in earlier work, signals a unique contribu-
tion to the field by emphasizing all of the “things we do with text”—before, as,
and after we read. For example, we summarize, answer questions, and write essays
to prove to others that we have understood what we read, and we use reading to
accomplish personal and social goals. Most important, activities tend to differ from
one setting and discipline to another.
Context includes physical locations and the expectations they entail. These expectations
often include implicit or explicit purposes that can be found in classrooms,
workplaces, or community settings. Context includes disciplinary settings, such as
science, literature, and history. Context also includes other people, especially teach-
ers in school settings, and the support or interference they provide. Finally, context
includes broad cultural, political, and historical practices that shape what counts as
legitimate ways of making sense of texts. Of all of the elements of school context,
none is more influential than teachers, for they engage in a range of moves that
shape student success—selecting texts, designing activities, and scaffolding their
reading (see Duke, Pearson, Strachan, & Billman, 2011).
There is little reason to believe the general processes of the RAND model do
not operate in explaining reading texts in STEM domains as they do in literature
or history. Similarities aside, there are clear contrasts across disciplines. Something
as basic as getting the main point of a piece depends upon the discipline: In sci-
ence, main points are often the most superordinate idea in an explanation, but in
literature they are often more like themes of human experience.
What most clearly distinguishes reading in science from reading in other disci-
plines are the peculiar characteristics of the discourse we use to “do” science—how
we talk science, how we argue science, how we marshal evidence in science, how
we write science, and, of course, how we read science (see, for example, Cervetti,
Pearson, Moje, & Greenleaf, 2013). Shanahan, Shanahan, and Misischia (2011), in
an investigation of reading across disciplines, argue that despite shared reading
practices across all disciplines (mainly decoding words and figuring out what the
text says), what really counts in each discipline are its discourse practices. If it is true
that the traditions of discourse in a discipline like science are primary drivers of
reading and writing practices, then the nature of texts assumes a large role in shap-
ing reading practices. Thus, to understand reading in science, we must understand
how science texts embody these discourse traditions.
Technical Language
Science textbooks tend to include a high proportion of specialized vocabulary terms,
such as names (e.g., mineral), adjectives (e.g., nocturnal), and process verbs (e.g., con-
dense), and verb phrases that define and categorize information (e.g., belong to) (Fang,
2005). Further complicating the language load is the number of these specialized
words that are orthographically identical to words that have everyday meanings
(e.g., volume) or have flexible parts of speech (e.g., record, result) (Fang, 2006).
Complex Sentences
Science textbooks are often filled with complex sentences that have multiple
dependent clauses that result in complicated syntactic structures. Fang (2006)
provides the following example: “Stars shine with their own light, while Venus
shines because it is reflecting light from the sun, just as the other planets and moons
do” (Fang, 2006, p. 503). The logical links in sentences like this one can be chal-
lenging for all students to unpack.
Implicit Relationships
In an examination of the logical relations in middle school science textbooks,
Roman, Jones, Basaraba, and Hironaka (2016) found that only about one-quarter
of the clauses in these textbooks included logical connectives. Thus, in most
cases, students need to infer the relations among ideas based on their knowledge
of how concepts are introduced and discussed in science texts. Unsworth (2001)
demonstrates how relationships that could be made explicit in science textbook
explanations are often left implicit, creating some ambiguity in the text. For exam-
ple, Unsworth points out how temporal markers such as “then” or “next” are often
omitted, as in this example, in which the addition of “then” at the beginning of the
second sentence would make the temporal sequence more explicit: “As the object
moved to the right, it pushes or compresses the air particles next to it. The com-
pressed air particles push on the particles to their right . . .” (p. 589). As Davison and
Kantor (1982) and Pearson (1974–75) pointed out long ago, textbook authors often
omit these connectives in order to reduce sentence length and readability scores.
For example, Davison and Kantor (1982) demonstrate how simplifications aimed
at lowering readability, such as replacing unfamiliar vocabulary (e.g., Hippocrates
becomes “One of the most famous Greek doctors”) and pulling apart complex sen-
tences (e.g.,“. . . tree will heal its own wound by growing” becomes “. . . tree will heal
its own wounds. It will grow”) can actually result in more difficult-to-understand
texts (pp. 204, 192). As yet, we do not know a great deal about how specific features
of these texts impact students’ comprehension, though it seems that lexical com-
plexity (the presence of technical science words) is a particular challenge for both
EL and non-EL students (Arya, Hiebert, & Pearson, 2011).
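The readability mechanics behind that caution can be sketched in code. Below is a rough Flesch–Kincaid grade-level estimate; the formula is the standard one, but the one-line syllable heuristic is our own simplifying assumption, used only to show that splitting Davison and Kantor's tree-wound sentence lowers its readability score even though the split version drops the causal link "by growing":

```python
# Illustrative sketch (not from the chapter): a rough Flesch-Kincaid
# grade estimate, showing why splitting sentences lowers readability
# scores even when the split obscures logical connections.

import re

def count_syllables(word: str) -> int:
    """Crude heuristic (an assumption): count groups of vowel letters."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text: str) -> float:
    """Standard Flesch-Kincaid grade:
    0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    return 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59

original = "The tree will heal its own wound by growing."
split = "The tree will heal its own wounds. It will grow."

print(round(fk_grade(original), 2))  # the one-sentence, connective-bearing version
print(round(fk_grade(split), 2))     # the "simplified" version scores lower
```

Run as written, the split version scores roughly three grade levels "easier" than the original, which is precisely the incentive that leads textbook authors to strip out connectives.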
It is logically possible that linguistic complexity and conceptual complexity may
not be orthogonal; that is, authors may need complex syntax and vocabulary to
do justice to the complexity of the ideas they are trying to convey. To attempt
to explain those ideas with simple syntax and common words may introduce
process to provide supports in the text (altering aspects of the text to render
them more comprehensible), in the instruction provided by the teacher (things
teachers DO before, during, or after reading), through the actual tasks students
do to demonstrate their understanding (e.g., answering factual questions, sum-
marizing, or using text-based ideas to solve a problem), or in the contextual
surround (e.g., changing the social nature of the reading through collaboration).
In searching for approaches that support EL students’ comprehension of science
text, and following the lead of Goldenberg (2008, 2011, 2013), we gathered
evidence from several potentially relevant lines of inquiry; namely, practices that
promote the reading comprehension of:
Even though the fourth area (science texts for EL students) is the ideal target
for this chapter in this volume, we wanted to leave no relevant stone unturned, so
we chose to examine the three other areas to mine insights that can be applied to
the EL students’ reading of science texts.
• Cognates (words with shared meanings that have common etymological roots,
such as geography and geografía);
• Brief explanations in the home language (not direct concurrent translations,
which can cause students to “tune out” while English is being spoken);
• Lesson preview and review (lesson content is previewed in students’ home
language to provide some degree of familiarity when the lesson is taught; fol-
lowing the lesson, there is a review in the home language to solidify and check
for understanding); and
• Strategies taught in the home language (reading, writing, and study strategies
are taught in the home language but then applied to academic content in
English) (p. 10).
linguistic features of science texts. Turkan, de Oliveira, Lee, and Phelps (2014)
argue that teachers should be prepared to help students, and especially EL stu-
dents, participate in disciplinary discourses and understand how textual meanings
are constructed in different disciplinary texts. They suggest that teachers need to
know what linguistic features (vocabulary, grammatical structures, syntactic struc-
tures, etc.) are associated with texts in different disciplines and how these might
challenge EL students’ comprehension and communication of disciplinary ideas.
This kind of knowledge enables teachers to model reasoning with texts, moving
back and forth between everyday language and the academic registers. In particular,
they should be able to unpack form-meaning relationships in text and make these
explicit for students in ways that support comprehension. Turkan et al. also dem-
onstrate this knowledge in action as they describe ways that teachers can help EL
students participate in the disciplinary discourse through supported writing experi-
ences, noting that “participation in the ways of conveying meaning could make a
significant difference in students’ opportunity to learn the content and the valued
ways of learning disciplinary knowledge” (p. 21).
Turkan et al. offer images of this kind of teaching in a middle school science
classroom. In one example segment, the teacher, Mr. Meadows, is working with an
EL student to revise her writing about a science experiment about the impact of
temperature on the chemical reaction between yeast, water, and sugar. The student
has written, “Every 10 minutes, we measured the water that the bags pushed out.”
Meadows uses questioning and rephrasing of language choices to guide the student
toward the use of more precise scientific terminology, e.g., “Did you measure the
water or the volume of the water?” “Do you mean displaced [with respect to “pushed
out”]?” (p. 19). As Meadows questions and rephrases the students’ writing, he pro-
vides guidance for increasing the scientific precision of the text.
Fang and Schleppegrell (2010) offer language-oriented strategies for supporting
students’ close reading of disciplinary texts. Their approach, functional language
analysis, uses the examination of the language in texts to help students understand
and discuss the patterns of language and associated meanings that are characteristic
of particular disciplines, such as science. The intent is to help students understand
how language works in the discipline, so they can “comprehend and critique” the
texts they encounter in secondary content area study (p. 588). Fang and Schleppe-
grell describe ways teachers can model how to read dense science texts. This
involves first teaching students a set of constructs for naming language features in a
text, such as processes (usually the verbs), participants (actors or goals), and types of
processes (e.g., being processes and doing processes). The teacher then leads students
in a process of identifying and reworking the text in the interest of meaning making
and critique. This includes unpacking dense phrases, identifying the processes and
associated participants/actors in the text, and rearranging passive sentences to make
clear the actors and goals. Ideally, students engage in active conversations about
the meaning of sentences and, through this process, gain insight into the construc-
tion of texts—e.g., that definitions often become long noun phrases that condense
information. The analysis also includes attention to the overall organization of the
text. Students often use visual displays to highlight the organization and flow of
information and identify the important themes in the text. Finally, the teacher leads
students in exploring choices in mood, modality, and the selection of words that
reveal something about the perspective of the author.
Others operating in a functional language analysis tradition have offered a set
of related strategies for helping students uncover and make sense of the organization
of scientific language in school texts. Fang (2006) and others have recommended
helping students develop an awareness of the specialized use of keywords and
phrases (e.g., prepositions) in the development of scientific arguments (p. 514).
This would include things like understanding how words and phrases such as like-
wise, unlike, in contrast, and similarly are used to introduce comparisons in science
(Fang, 2006, p. 515). It may also involve teaching students specific morphologi-
cal features (e.g., affixes, combining forms, and roots—especially Latin and Greek)
to build students’ knowledge of science vocabulary and their ability to infer word
meaning from word-internal (i.e., morphological) resources.
Unsworth (2001) and Fang (2006) recommend both unpacking or “talking out”
highly nominalized science texts to create more familiar noun-verb structures and
also modeling how to build nominalized forms and noun phrases. Unsworth points
to progressive cloze tasks that demonstrate how verbs become nominalized in sci-
ence texts, for example, how “the object . . . compresses” becomes “the compression
travels” later in a paragraph (p. 606).
Studies are beginning to demonstrate how language mediation can support EL
students in appropriating science registers in talk and writing (e.g., Gibbons, 2003).
In addition, using these approaches with EL students to support comprehension of
complex disciplinary texts has preliminary support in disciplinary-based research,
though not yet in science. For example, in the context of high school history, Achu-
gar and Carpenter (2012) used a design experiment to investigate how language
analysis could be used to support EL students’ comprehension of historical docu-
ments. The teacher in the study guided students through close readings of primary
texts, attending to how linguistic choices made by authors worked to make mean-
ing. The researchers were able to document changes in students’ comprehension
of historical documents. They were also able to document changes in students’
appropriation of features of academic language (e.g., grammatical intricacy and use
of technical vocabulary) in writing.
Though not in the tradition of systemic functional linguistics (SFL), some schol-
ars have demonstrated that teaching students to analyze the features and struc-
tures of scientific texts can support their comprehension and their science learning.
Romance and Vitale (2001) have developed rich routines for guiding students’
understanding of complex science texts. The routines involve text analysis in which
students learn to identify the various linguistic and visual features common to
science texts and how these are used to represent science ideas. Following the
text analysis, the teacher and students work together to develop concept maps
that represent the ideas graphically. These maps are then used as the basis for stu-
dents’ writing about the science ideas. These analysis-mapping-writing sequences
not only helped students make sense of the concepts in the texts but also helped
students come to understand how these texts are organized. Experimental studies
of this approach showed positive impacts on students’ science learning and reading
comprehension.
We have bolded the six NAS recommendations that overlap with the insights we
have garnered about reading instruction from our journey through this literature.
It is interesting to note the degree of congruence observed; moreover, had we
reviewed some other bodies of literature beyond just reading comprehension, we
might have incorporated even more, perhaps all, of the NAS practices.
examples from Davison and Kantor so aptly illustrate. Fillmore and Snow (2002)
have also cautioned that text simplifications for EL students often result in the kinds
of short, choppy sentences that compromise, rather than support, comprehension.
Crossley and McNamara (2016) assessed L2 high school and college students’
recall and comprehension of authentic and simplified texts (beginning and inter-
mediate level) on a variety of topics, including some science topics. The simplified
texts were designed for stronger cohesion (“more semantic similarity, noun overlap,
word repetition, syntactic similarity, and causality”) and simplified vocabulary. The
participants recalled more text-based propositions from the simplified texts. How-
ever, participants who read the intermediate and authentic text produced more
relevant elaborations in their retellings—i.e., possibly an indication that the partici-
pants were forming connections to prior knowledge and forming more coherent
mental representations of the text. The authors suggest that simplified texts may
be useful if the instructional goal is recall of information. However, if the goal is
learning from text, the more challenging, authentic texts appear to better support
this goal.
Rahimi and Rezaei (2011) examined the effects of two kinds of text modifications—
syntactic elaboration and syntactic simplification—on EL college students’ under-
standing of passages related to civil engineering. Syntactic elaboration involved
adding redundancy and paraphrasing to signal the thematic structure of the text.
For syntactic simplification, the researchers applied techniques such as replacing
complex sentences with two simple sentences and shifting sentences from passive to
active ordering. Both types of modifications resulted in improved comprehension
over the original textbook passages, though there was some suggestive evidence
that students benefited more from the elaborations.
Hall et al. (2015) found that secondary school students (12–13 years old) had
better comprehension of science textbook passages that were designed to be high
cohesion compared with low cohesion texts. Text cohesion is the degree to which a
text supports the reader in establishing a coherent mental model of the material by
including inference-supportive language features, such as connectives and overlap-
ping words and clear pronoun references (often other nouns to “rename” a concept
rather than pronouns). In addition, higher cohesion texts include elaborations on
concepts that increase explicitness. By way of example: In the Hall et al. study, a
high cohesion text contained the sentence “When you heat a solid the particles
within it start to vibrate faster” (p. 128). The parallel low cohesion version read,
“Heated solid particles vibrate faster” (p. 128). The lower cohesion version of the
text requires the reader to generate inferences that include information that is not
explicitly provided in the text in order to achieve meaning. In addition to recom-
mending the use of more cohesive texts, Hall et al. suggest pedagogical approaches
similar to those described earlier: Teachers lead students through question-driven
inquiries of how meaning is established and built in the texts. Manavathu and Zhou
(2012) developed a set of differentiated instructional texts for high school students
engaging in laboratory exercises in chemistry and physics. EL students read regu-
lar or modified instructions for the laboratory experiments and were interviewed
about their comprehension. The modified instructions were of two types. The first
included the same text but used illustrations to support understanding of the
procedures. The second involved language modifications such as simplified English and
the addition of simple definitions for new vocabulary words. The researchers found
that the students had better task comprehension when reading both kinds of modi-
fied instructions. They also preferred the modified handouts.
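The surface features these cohesion analyses track (noun overlap, word repetition) can be illustrated with a toy index. The two-sentence passages and the stopword list below are our own illustrative assumptions, not materials from Hall et al. or Crossley and McNamara:

```python
# Illustrative sketch (an assumption, not from the studies cited): a crude
# cohesion index based on content-word overlap between adjacent sentences,
# one of the surface features automated cohesion metrics rely on.

import re

# Minimal stopword list, assumed for this toy example.
STOPWORDS = {"the", "a", "an", "it", "its", "they", "to", "of", "and",
             "in", "on", "when", "you", "this", "that", "as", "get"}

def content_words(sentence: str) -> set:
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    return words - STOPWORDS

def overlap_index(text: str) -> float:
    """Mean proportion of each sentence's content words that also
    appear in the immediately preceding sentence."""
    sents = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    scores = []
    for prev, cur in zip(sents, sents[1:]):
        cur_words = content_words(cur)
        if cur_words:
            scores.append(len(cur_words & content_words(prev)) / len(cur_words))
    return sum(scores) / len(scores) if scores else 0.0

high = ("When you heat a solid, the particles within it vibrate faster. "
        "The vibrating particles push against neighboring particles.")
low = ("Heated solid particles vibrate faster. "
       "Neighbors get pushed as movement spreads.")

print(overlap_index(high) > overlap_index(low))  # prints: True
```

The high-cohesion passage repeats "particles," so each sentence shares content words with its predecessor; the low-cohesion version forces the reader to supply those links, which is exactly the inference burden Hall et al. describe.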
The small body of research on text modification for EL students in science
seems to suggest that modification can be helpful, but simplification may not be
the best approach. Most important is the finding that high cohesion and well-
elaborated texts may do more to support students’ learning from science text and
participation in science than attempts to simplify texts to the point where they no
longer carry key information. Visual representations and other elaborations appear
to achieve greater cohesion, leading to more coherent mental representations.
for students to engage with current science and to better understand the process
of conducting and reporting research. Baram-Tsabari and Yarden (2005) found
that, compared with popular secondary science literature, using adapted primary
literature helped high school students take a more agential stance toward the sci-
ence described in the article, raising more criticism of the researchers’ work and
methodology, and suggesting more future applications of the technology described
in the article. Hearkening back to our earlier claim that oversimplification may be
a mistake, it is important to note that in this work, the resulting “simplified” texts
were still rather sophisticated in comparison to conventional science texts; what
made them “alternative” was their adherence to the structure of reporting scientific
results from empirical investigations. Other studies have similarly shown that stu-
dents think more critically about the information presented in APL texts than in
more conventional texts (e.g., Norris, Stelnicki, & de Vries, 2012).
A second type of alternative text presents science concepts as the product of
scientific experiments. For example, Kloser (2013) engaged high school students in
reading traditional textbook accounts and more “epistemologically considerate”
(EC) accounts of biology concepts. The EC texts were presented as narratives of
historical experiments that led to scientific claims about the biological concepts
addressed in the texts. In think alouds and interviews, some students described chal-
lenges in comprehending the EC texts in part because the genre was unfamiliar.
In general, however, students expressed stronger interest in the EC accounts and
higher levels of trust in these texts. Similarly, Arya and Maul (2012) compared mid-
dle school students’ recall and understanding of science concepts after reading one
of two versions of a text on two science topics. The first version was a traditional
expository text. The second was a “discovery narrative,” which presented the same
information as a narrative description of how scientists made the discoveries—
e.g., the process that led Marie Curie to her theory of radioactivity. Students who
read the discovery narratives had higher scores on the outcome measure for one of
the two texts immediately after reading and for both texts on a delayed posttest.
A third approach to alternative texts is to use trade books and trade-type texts.
Fang (2013) argues that trade books can be used to expose students to a wide variety
of reading materials in disciplinary learning. Fang suggests that, “compared to tradi-
tional textbooks, trade books are better able to accommodate the needs of students
with diverse backgrounds, interests, needs, and reading levels. They also provide more
contextualized, focused, in-depth, and up-to-date coverage of content” (p. 274). Fang
and Wei (2010) augmented sixth-graders' inquiry-based science curriculum with a
reading component involving explicit strategy instruction and the use of trade texts.
The strategies taught to students included predicting, questioning, concept mapping,
paraphrasing, and morphemic analysis. The trade books were selected from award-
winning science books on a wide variety of topics and at a wide variety of levels. The
sixth-graders who received the augmented curriculum made greater gains over six
months in reading comprehension (on a standardized measure) and knowledge of the
science content. Trade-type texts have also been used in a number of science inter-
ventions at the elementary level that have demonstrated positive impacts on students’
science (and literacy) learning (e.g., Guthrie, McRae, & Klauda, 2007).
In our own work in an integrated science-literacy curriculum, Seeds of Sci-
ence/Roots of Reading (Seeds/Roots), we developed texts that helped students to
94 Gina N. Cervetti and P. David Pearson
engage in inquiry and enhance their conceptual understandings while also learning
engage in inquiry and enhance their conceptual understandings while also learning
and applying strategies to read challenging texts. That is, the activity of science should
shape the nature of the texts rather than the other way around. One of the main
principles underlying the Seeds/Roots model is that texts can and should be used in
a variety of roles in support of students’ inquiries.
We developed a framework for the roles that text can play in supporting stu-
dents’ involvement in inquiry-based science (Cervetti & Barber, 2008). We pro-
posed, for example, that texts can provide models of students’ investigations, of the
kinds of writing scientists and student-scientists do, and of the dispositions that
scientists bring to their work. Texts can also provide information to directly fuel
students’ firsthand experiences. This idea that texts can and should be used in a
variety of roles in support of students’ inquiries became one of the main principles
underlying the Seeds/Roots instructional model. In line with this approach, students
read texts that model the inquiry processes and writing genres that they will use
in their own investigations, they search handbooks for information that they can
leverage as they investigate in a firsthand way, and they read books that situate their
classroom-based inquiries in the wider natural world. The text roles framework
demonstrates how text and firsthand experiences can work together in support of
students’ learning. Using the text roles framework helped us to design texts that
students could approach as tools for their ongoing investigations—and could use
in ways that more authentically mirror scientists’ use of texts. It also enabled us to
design texts that were accessible for young students, including young EL students,
without the kinds of simplifications that compromise cohesion. Bravo and Cer-
vetti (2014) found that fourth- and fifth-grade EL students who used the Seeds/
Roots instructional materials made greater growth in their science understanding
and vocabulary than students using content-comparable curricula with a hands-on
focus. Although the differences in growth cannot be attributed to the texts alone,
the use of the texts was one key difference between the two approaches.
In sum, the research reviewed here suggests that teachers can support students'
reading of science texts by:
• Emphasizing routines that enhance students' ability to identify, unpack, and
understand the complex features of science textbooks;
• Modifying texts in ways that create greater elaboration and cohesion; and
• Choosing alternatives to traditional science textbooks.
Note
1 We could review this literature in several of our subsections because there is work on text
structure for EL students and for all students and in all disciplines, including science, social
studies, and literature. Because of the focus of this volume, we have chosen to focus our
review on the literature most relevant to EL students reading science texts.
References
Achugar, M., & Carpenter, B. D. (2012). Developing disciplinary literacy in a multilingual
history classroom. Linguistics and Education, 23, 262–276.
Arya, D. J., Hiebert, E. H., & Pearson, P. D. (2011). The effects of syntactic and lexical com-
plexity on the comprehension of elementary science texts. International Electronic Journal
of Elementary Education, 4(1), 107–125.
Arya, D. J., & Maul, A. (2012). The role of the scientific discovery narrative in middle
school science education: An experimental study. Journal of Educational Psychology, 104,
1022–1032.
Baram-Tsabari, A., & Yarden, A. (2005). Text genre as a factor in the formation of scientific
literacy. Journal of Research in Science Teaching, 42, 403–428.
Bernhardt, E. B. (2011). Understanding advanced second-language reading. London: Routledge.
Bravo, M. A., & Cervetti, G. N. (2014). Attending to the language and literacy needs of
English Learners in science. Equity & Excellence in Education, 47, 230–245.
Cervetti, G. N. (2013). Integration of literacy and science. In B. M. Taylor & K. Duke (Eds.),
Handbook on effective literacy instruction (pp. 371–393). New York: Guilford Press.
Cervetti, G. N., & Barber, J. (2008). Text in hands-on science. In E. H. Hiebert & M. Sailors
(Eds.), Finding the right texts: What works for beginning and struggling readers (pp. 89–108).
New York: Guilford Press.
Cervetti, G. N., Barber, J., Dorph, R., Pearson, P. D., & Goldschmidt, P. (2012). The impact of
an integrated approach to science and literacy in elementary school classrooms. Journal of
Research in Science Teaching, 49, 631–658.
Cervetti, G. N., Pearson, P. D., Greenleaf, C., & Moje, E. B. (2013). Science? Literacy? Syn-
ergy! In W. Banko, M. L. Grant, M. E. Jabot, A. J. McCormack, & T. O’Brien (Eds.), Sci-
ence for the next generation: Preparing for the new standards (pp. 99–124). Washington, DC:
NSTA & STANYS.
Crossley, S. A., & McNamara, D. S. (2016). Text-based recall and extra-textual generations
resulting from simplified and authentic texts. Reading in a Foreign Language, 28, 1–19.
Davison, A., & Kantor, R. (1982). On the failure of readability formulas to define readable
texts: A case study from adaptations. Reading Research Quarterly, 17, 187–209.
Dole, J. A., Nokes, J. D., & Drits, D. (2009). Cognitive strategy instruction. In G. G. Duffy &
S. E. Israel (Eds.), Handbook of research on reading comprehension (pp. 347–372). Mahwah,
NJ: Erlbaum.
Duke, N. K., Pearson, P. D., Strachan, S. L., & Billman, A. K. (2011). Essential elements of fos-
tering and teaching reading comprehension. In S. J. Samuels & A. E. Farstrup (Eds.), What
research has to say about reading instruction (4th ed., pp. 51–93). Newark, DE: International
Reading Association.
Fang, Z. (2005). Scientific literacy: A systemic functional linguistics perspective. Science Educa-
tion, 89, 335–347.
Fang, Z. (2006). The language demands of science reading in middle school. International
Journal of Science Education, 28, 491–520.
Fang, Z. (2013). Disciplinary literacy in science: Developing science literacy through trade
books. Journal of Adolescent & Adult Literacy, 57, 274–278.
Fang, Z., & Schleppegrell, M. J. (2010). Disciplinary literacies across content areas: Support-
ing secondary reading through functional language analysis. Journal of Adolescent & Adult
Literacy, 53, 587–597.
Fang, Z., & Wei, Y. (2010). Improving middle school students' science literacy through reading infusion. The Journal of Educational Research, 103, 262–273.
Fillmore, L. W., & Snow, C. E. (2002). What teachers need to know about language. In C. T.
Adger, C. E. Snow, & D. Christian (Eds.), What teachers need to know about language (pp.
7–53). Washington, DC and McHenry, IL: Center for Applied Linguistics and Delta Sys-
tems Co., Inc.
Ford, D. J. (2009). Promises and challenges for the use of adapted primary literature in science curricula: Commentary. Research in Science Education, 39, 385–390.
Reading and Understanding Science Texts 99
Gallagher, M. C., & Pearson, P. D. (1982, December). The role of reading in content area instruc-
tion. Paper presented at the National Reading Conference, Clearwater, FL.
García, O. (2009). Education, multilingualism and translanguaging in the 21st century. In A.
Mohanty, M. Panda, R. Phillipson, & T. Skutnabb-Kangas (Eds.), Multilingual education for
social justice: Globalising the local (pp. 140–158). New Delhi: Orient Blackswan.
Gibbons, P. (2003). Mediating language learning: Teacher interactions with ESL students in a
content-based classroom. TESOL Quarterly, 37, 247–273.
Goldenberg, C. (2008). Teaching English language learners: What the research does—and does
not—say. American Educator, 32(2), 8–44.
Goldenberg, C. (2011). Reading instruction for English language learners. In M. Kamil, P. D.
Pearson, E. Moje, & P. Afflerbach (Eds.), Handbook of reading research (Vol. 4, pp. 684–710).
New York: Routledge.
Goldenberg, C. (2013). Unlocking the research on English learners: What we know—and
don’t yet know—about effective instruction. American Educator, 37(2), 4–11.
Graves, M. F., August, D., & Mancilla-Martinez, J. (2013). Teaching vocabulary to English lan-
guage learners. New York: Teachers College.
Greenleaf, C. L., Litman, C., Hanson, T. L., Rosen, R., Boscardin, C. K., Herman, J., . . . &
Jones, B. (2011). Integrating literacy and science in biology: Teaching and learning impacts
of Reading Apprenticeship Professional Development. American Educational Research Jour-
nal, 48, 647–717.
Greenleaf, C., & Valencia, S. W. (2017). Missing in action: Learning from texts in subject-
matter classrooms. In K. Hinchman & D. Appleman (Eds.), Handbook of adolescent literacy
(pp. 235–256). New York: Guilford Press.
Guthrie, J. T., McRae, A., & Klauda, S. L. (2007). Contributions of concept-oriented read-
ing instruction to knowledge about interventions for motivations in reading. Educational
Psychologist, 42, 237–250.
Guthrie, J. T., Wigfield, A., & You, W. (2012). Instructional contexts for engagement and
achievement in reading. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook
of research on student engagement (pp. 601–634). New York: Springer.
Hall, S. S., Kowalski, R., Paterson, K. B., Basran, J., Filik, R., & Maltby, J. (2015). Local text
cohesion, reading ability and individual science aspirations: Key factors influencing com-
prehension in science classes. British Educational Research Journal, 41(1), 122–142.
Kloser, M. (2013). Exploring high school biology students’ engagement with more and
less epistemologically considerate texts. Journal of Research in Science Teaching, 50(10),
1232–1257.
Manavathu, M., & Zhou, G. (2012). The impact of differentiated instructional materials on
English Language Learner (ELL) students’ comprehension of science laboratory tasks.
Canadian Journal of Science, Mathematics, and Technology Education, 12, 334–349.
National Academies of Sciences, Engineering, and Medicine. (2017). Promoting the educa-
tional success of children and youth learning English: Promising futures. Washington, DC: The
National Academies Press.
National Governors Association Center for Best Practices and Council of the Chief State
School Officers. (2010). Common Core state standards for English language arts and literacy in
history/social studies, science, and technical subjects. Washington, DC: Author. Retrieved from
www.corestandards.org/wp-content/uploads/ELA_Standards.pdf
National Reading Panel. (2000). Report of the National Reading Panel: Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups. Washington, DC: National Institute of Child Health
and Human Development, National Institutes of Health.
NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. Washington,
DC: The National Academies Press.
Norris, S. P., Stelnicki, N., & de Vries, G. (2012). Teaching mathematical biology in high
school using adapted primary literature. Research in Science Education, 42(4), 633–649.
Palincsar, A. S., & Magnusson, S. J. (2001). The interplay of first-hand and text-based inves-
tigations to model and support the development of scientific knowledge and reasoning.
In S. Carver & D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp.
151–193). Mahwah, NJ: Lawrence Erlbaum Associates.
Paris, S. G., & Paris, A. H. (2001). Classroom applications of research on self-regulated learn-
ing. Educational Psychologist, 36, 89–101.
Pearson, P. D. (1974–1975).The effects of grammatical complexity on children’s comprehen-
sion, recall and conception of semantic relations. Reading Research Quarterly, 10, 155–192.
Pearson, P. D., & Cervetti, G. N. (2015). Fifty years of reading comprehension theory and
practice. In P. D. Pearson & E. H. Hiebert (Eds.), Research-based practices for Common Core
literacy (pp. 1–24). New York: Teachers College Press.
Pearson, P. D., Moje, E., & Greenleaf, C. (2010). Science and literacy: Each in the service of
the other. Science, 328, 459–463.
Puzio, K., Keyes, C., & Jiménez, R. T. (2017). Let’s translate! Teaching literary concepts with
English language learners. In K. A. Hinchman & D. A. Appleman (Eds.), Adolescent litera-
cies: A handbook of practice-based research (pp. 276–291). New York: Guilford Press.
Rahimi, M. A., & Rezaei, A. (2011). Use of syntactic elaboration techniques to enhance
comprehensibility of EST texts. English Language Teaching, 4, 11–17.
RAND Reading Study Group. (2002). Reading for understanding: Toward a research and development program in reading comprehension. Santa Monica, CA: Office of Education Research
and Improvement.
Roman, D., Jones, F., Basaraba, D., & Hironaka, S. (2016). Helping students bridge infer-
ences in science texts using graphic organizers. Journal of Adolescent & Adult Literacy, 60,
121–130.
Romance, N. R., & Vitale, M. R. (2001). Implementing an in-depth expanded science model
in elementary schools: Multi-year findings, research issues, and policy implications. Inter-
national Journal of Science Education, 23, 272–304.
Shanahan, C., Shanahan, T., & Misischia, C. (2011). Analysis of expert readers in three disci-
plines: History, mathematics, and chemistry. Journal of Literacy Research, 43, 393–429.
Taboada, A., Bianco, S., & Bowerman, V. (2012). Text-based questioning: A comprehension
strategy to build English language learners’ content knowledge. Literacy Research and
Instruction, 51, 87–109.
Taboada, A., & Rutherford, V. (2011). Developing reading comprehension and academic
vocabulary for English language learners through science content: A formative experi-
ment. Reading Psychology, 32(2), 113–157.
Turkan, S., de Oliveira, L. C., Lee, O., & Phelps, G. (2014). Proposing a knowledge base for
teaching academic content to English language learners: Disciplinary linguistic knowl-
edge. Teachers College Record, 116, 1–30.
Unsworth, L. (2001). Evaluating the language of different types of explanations in junior
high school science texts. International Journal of Science Education, 23(6), 585–609.
Verhoeven, L. (2011). Second language reading acquisition. In M. L. Kamil, P. D. Pearson,
E. B. Moje, & P. Afflerbach (Eds.), Handbook of reading research (Vol. 4, pp. 661–683). New
York: Routledge.
Wallace, C. S., Tsoi, M. Y., Calkin, J., & Darley, W. M. (2003). Learning from inquiry-based
laboratories in nonmajor biology: An interpretive study of the relationships among
inquiry experience, epistemologies, and conceptual growth. Journal of Research in Science
Teaching, 40, 986–1102.
Wilkinson, I. A. G., & Son, E. H. (2011). A dialogic turn in research on learning and teaching
to comprehend. In M. L. Kamil, P. B. Rosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook
of reading research (Vol. 4, pp. 359–387). New York: Routledge.
Wong-Fillmore, L., & Fillmore, C. J. (2012). What does text complexity mean for English
learners and language minority students? Proceedings from the Understanding Language
Conference: Understanding language: Language, literacy, and learning in the content areas. Stan-
ford, CA: Stanford University.
6
WRITING IN MATHEMATICS
CLASSROOMS
Richard Barwell
Focus Points
• Writing in mathematics involves considerable complexity, including the inte-
gration of natural language, symbols and images, and the coordination of dif-
ferent overlapping functions of written language (i.e., to communicate ideas,
to relate to the reader, and to organize the text).
• There are multiple purposes for writing in mathematics, including writing for
oneself or writing for others, writing to advance mathematical thinking, and
writing for assessment or evaluation. Writing in mathematics can help students
to organize and deepen their mathematical thinking.
• English learners (ELs) may draw on a wide range of linguistic or semiotic
resources to make meaning through writing in mathematics by drawing on a
variety of discourses, voices, and languages.
• Learning to write mathematically and teaching students to write mathematically entail negotiating a tension between the imposition of the conventions of formal mathematical discourse on the one hand and allowing students to draw on other resources in order to write in ways that are meaningful to them on the other.
Teachers therefore need to adopt a responsive instructional approach to sup-
port students’ learning of the conventions of written mathematics.
Chapter Purpose
In this chapter, I provide an overview of research on writing in mathematics class-
rooms, including some of the features of formal written mathematics, with particular attention to EL students. I introduce a Bakhtinian perspective on language,
including an emphasis on the inherent tensions between standard or conventional
forms and students’ diverse expressions of mathematical meaning. I highlight the dif-
ferent kinds of resources EL students may use in their development of mathematical
102 Richard Barwell
writing. I illustrate my discussion with examples of the role of writing from the
literature and from my own research into second language learners of mathematics.
Curtis and Alex are in a combined Grade 5/6 mathematics class for EL students.
They are both from an indigenous Canadian background and speak Cree and Eng-
lish.1 They are working on the following problem:
What do you think about their writing? What do you think it shows about Curtis’s
and Alex’s mathematical thinking? Do you think the errors of grammar and punc-
tuation matter? How might you help the two students to develop their mathemati-
cal writing? What does good mathematical writing look like?
I spent a year conducting ethnographic fieldwork in Curtis’s and Alex’s math-
ematics classes (as well as in several other classrooms in different schools). I regularly
observed how students were able to read and solve mathematics problems with
relative ease, but then struggled to formulate written explanations of their solutions.
Why do we ask students to write such explanations? How do they contribute to
students’ learning? How does students’ mathematical writing help teachers? What
challenges do students and teachers face? What might some of the drawbacks be?
Research addressing these questions is not extensive. Nevertheless, there is suffi-
cient research to develop some general observations about writing in mathematics.
In this chapter, I discuss the nature and role of writing in mathematics class-
rooms, with particular attention to EL students like Curtis and Alex. I consider the
Writing in Mathematics Classrooms 103
nature of mathematical writing and some possible purposes for writing in math-
ematics classrooms, as well as ways in which writing in mathematics contributes to
learning mathematics. My focus is particularly on more extended writing, such as
Curtis’s and Alex’s explanations, or students’ justifications of their solutions to prob-
lems, rather than on the traces of students’ calculations or the minimal responses to
a page of exercises, so common in many classrooms. I do not discuss students’ writ-
ing about their experiences of mathematics, such as journal writing (e.g., Borasi &
Rose, 1989; Clarke, Waywood, & Stephens, 1993), preferring to focus on students’
writing as part of doing mathematics.
Curtis’s writing, for example, lexical items like “pattern” or “2,” actions like “add-
ing,” and qualifiers like “each” all contribute to the ideational aspect of his text.
The interpersonal metafunction organizes the identities of the author and readers
of a piece of writing, as well as the relationship between them. In Curtis’s writing,
he uses passive forms quite common in mathematics: That is, he writes "the pattern
develops” rather than “I developed the pattern” or “you develop the pattern.” Like
any writing, mathematical writing also constructs its reader through, for example,
the assumptions it makes about what the reader knows, as well as in the degree of
formality adopted (see also Pimm & Sinclair, 2009).
The textual metafunction deals with the organization of material within a text.
In mathematics, the textual metafunction is important in, among other things, the
construction of a mathematical argument or explanation, by making connections
across the text, at sentence, paragraph, and discourse levels. In Curtis’s text, for
example, he uses the word “like,” as well as the sequencing of his writing, to link
two instances of the method he used (based on 8 and 16) to his initial general state-
ment of that method (“adding 2 in each number”). Indeed, without his examples,
it would be difficult to interpret “adding 2 in each number” in the way Curtis
intends.
Morgan (1998) uses these ideas to identify some common features of formal
mathematical text, through a comparison of a published mathematics article with
written mathematical explanations of secondary school students. Some preferred
features in the social context of mathematics are shown in Table 6.1.
Morgan (2001) also identifies some general features of mathematics writing
that arise from specific combinations of the three metafunctions, such as the avoidance
of personal forms of expression. A published mathematics article, for example, is quite
different from informal notes or a mathematics textbook, although they all have
much in common (see Burton & Morgan, 2000, for an analysis of writing in aca-
demic mathematics journals). The norms of mathematical writing have a purpose:
A shared sense of how mathematics should be written eases communication within
a mathematical community. But these norms are not set in stone and vary over time
and context. Good mathematical writing reflects these norms to effectively com-
municate and contribute to mathematical thinking. Effective communication is a
goal reflected in, among other things, the NCTM process standards, which include
reference to coherence, precision, and organizing thinking.
Writing for . . . self
• Mathematicians: trying out and verifying ideas; recording information for later use.
• Students: trying out and verifying ideas, including making sense of a mathematics problem, trying some cases, completing calculations, exploring a pattern, checking a conjecture; recording information for later use, such as from one class to the next.
Writing for . . . others
• Mathematicians: communicating with collaborators; writing up a paper reporting the work.
• Students: communicating ideas to peers during the problem-solving process, e.g., when working in groups or sharing ideas with the class; writing up work, such as a solution and its justification, a project report, etc., often for the purposes of assessment. Writing for all of these purposes may also be interpreted by teachers for formative assessment or summative evaluation.
Every time you start a new triangle you add the number that comes after the
bottom one. For example, when you had the second triangle the bottom had
2 dots. The next number after two is 3. So when you add three to it plus the
one on top you get 6. I am going to do this for all the numbers. I am check-
ing over the problem now. My correct answer for the tenth triangle is 55.
(Pugalee, 2001, p. 242)
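The arithmetic this student describes can be summarized as a recurrence for the triangular numbers; the notation T_n is introduced here only for clarity and is not part of the student's response:

```latex
% T_n: the nth triangular number (a triangle with n dots on the bottom row)
% "add the number that comes after the bottom one":
T_{n+1} = T_n + (n + 1), \qquad T_1 = 1
% The student's step from the second to the third triangle:
% T_3 = T_2 + 3 = 3 + 3 = 6
% Iterating to the tenth triangle gives the closed form and the answer:
T_n = \frac{n(n+1)}{2}, \qquad T_{10} = \frac{10 \times 11}{2} = 55
```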
This student’s response displays evidence of interpreting the problem (she notices
a pattern in the diagram), making a plan ("I am going to do this . . ."), performing calculations, and reviewing her work ("I am checking over the problem now"). The value of these kinds of mathematical writing behaviors has been observed in several other studies (e.g., Steele, 2005; Martin, 2015). From a Vygotskian perspective, this
work suggests that writing in mathematics helps students to organize their think-
ing within the problem-solving process. Asking students to write explanations, for
example, is not simply a useful way for teachers to learn about students’ thinking;
the writing process is itself an important way of developing mathematical thinking.
Students, of course, do not naturally write in a formal mathematical style. Learn-
ing to write mathematically goes hand in hand with learning mathematics. Santos
and Semana (2015) investigated how students can be guided to develop a more
formal style, with a focus on developing students’ written mathematical justifi-
cations. Groups of Portuguese secondary school students were given a series of
writing tasks to work on, related to complex mathematical problems. Each group
submitted a first version of their writing, received feedback from their teacher, and
then reworked their text. Much of the teacher’s feedback encouraged students to
be more explicit and detailed in their justifications, such as explaining their choice
of calculations and clarifying their ideas. Feedback also prompted students to make
use of all three semiotic systems (natural language, symbols, and visual images) and
make links between them. For example, in the first draft for one problem, stu-
dents presented diagrams with some symbols. The teacher prompted the students
to elaborate their writing by also including explanatory verbal text to clarify what
different values in the diagrams represented. The second version of their writing
was, as a result, more comprehensive and precise than the first. This research sug-
gests that students can be supported to develop more mathematically appropriate
forms of written explanation (in this case, more explicit and detailed), through a
process of drafting and redrafting.
Best Practices
So far, I have reviewed some key ideas arising in research on writing in math-
ematics classrooms in general. For EL students, however, writing in mathematics
may pose particular challenges. Most research on the learning and teaching of
mathematics by EL students has focused on spoken interaction. Among other
things, this work has shown how EL students draw on multiple resources, includ-
ing other languages, everyday language, and diagrams, to think about mathemat-
ics (see, for example, Moschkovich, 2009). These ideas are also useful in thinking
about writing in mathematics.
To think about the notion of resource more carefully, I draw on a dialogic
perspective on language developed by Mikhail Bakhtin. As a literary theorist, his
work focused on understanding the language of novels and poetry, through which
he developed a multifaceted theory of language. All language use reflects multiple
ways of speaking and writing. We can think, for example, of the language of math-
ematics, of the curriculum, of teachers, of the schoolyard, and so on. Language-
as-it-is-used is always diverse and always changing, with new forms emerging and
others disappearing. As noted earlier, for example, the language of mathematics has
changed over time. This constantly shifting diversity is called heteroglossia (Bakhtin,
1981), and without it, it would not be possible to ever say anything new. Bakhtin-
ian scholars further divide heteroglossia into three dimensions: multiple discourses,
multiple voices, and multiple languages (Busch, 2014). At the same time, language
has a certain stability, often thought of in terms of rules and structures, and of cor-
rect and incorrect usage. This underlying stability, or at least the idea of it, is known
as unitary language (Bakhtin, 1981). Heteroglossia and unitary language are both
present in all speech or writing and pull in opposite directions, a bit like centripetal
and centrifugal forces (Bakhtin, 1981).
The various language resources on which EL students may draw to think about
mathematics can be organized according to the three dimensions of heteroglossia
(Barwell, 2015):
In each dimension, students and teachers will be influenced by both centripetal and
centrifugal forces: That is, there is both diversity (heteroglossia) and the idea of cor-
rect or preferred forms (unitary language). Unitary language often reflects the posi-
tion of authority, while less favored forms find themselves at the margins. Learning
to write in mathematics requires students to learn to use or “appropriate” (Bakhtin,
1981, p. 293) forms of writing initially introduced by their teacher, their textbooks,
and other authoritative sources. Nevertheless, the centrifugal and centripetal forces
are constantly at work, and teachers find themselves mediating the tension that
arises between them (Barwell, Chapsam, Nkambule, & Setati Phakeng, 2016).
These three kinds of resources can all be seen in the example of Curtis and Alex.
They draw on multiple discourses, including mathematical discourses (vocabulary,
forms of explanation, use of symbols) and general everyday discourses of explana-
tion (e.g., using the word “like”).They also draw on multiple voices, including their
own meanings and ideas, ideas derived from the word problem, and specific words
suggested by me (I wrote down some useful words, such as “pattern”). For Alex,
I also offered a sentence starter: “The pattern is. . . ” Finally, the two students make
use of both English and Cree at different times, such as when discussing with each
other, or with me, what they should write in their explanation.
In earlier research conducted in the UK, I recorded pairs of upper elementary
school EL students as they worked on the task of writing and solving word problems of their own. Their word problems and the discussions they had while writing
them were fascinating and provide further evidence for the three forms of resources
mentioned above. For example, Farida and Parveen, two students from a Pakistani
background (Panjabi/Urdu speakers), wrote the following word problem together:
My analysis of their discussion (Barwell, 2005b) showed that they drew on a variety
of resources (although I did not use that term at the time of the research). They
drew on the discourse of word problems, paying particular attention to the generic
form of word problems. For example, they were aware that their problems needed
to have a scenario, some mathematical information, and a question. They also used
mathematical discourse as a resource, through their reference to the mathemati-
cal structure of the problem. They made use of multiple voices, including their
110 Richard Barwell
negotiation of their different individual ideas and their use of narrative accounts
of their experiences to make sense of what they were writing. And they made use
of multiple languages to make meaning, including explicit discussion of grammar
and spelling, as well as forms of English influenced by their home language. More
specifically, my analysis showed how attention to written English and attention to
the mathematical structure of the problem each informed the other.
In each example, centripetal and centrifugal forces simultaneously shaped
students’ use of the various resources. The students are constantly influenced by
expected norms of the use of English, spelling, punctuation, and mathematical dis-
course. At the same time, their writing reflects their own unique and emergent
versions of these norms, such as in Curtis’s and Alex’s somewhat colloquial expla-
nations, or Farida and Parveen’s rather idiosyncratic orthography. In each case, it is
worth remembering that a different set of norms might result in rather different
writing, such as if the students were working in their home languages.
So far, in this section, I have highlighted some of the resources available to stu-
dents to make meaning while writing mathematics. Nevertheless, EL students’ math-
ematical writing is likely, at first, to be idiosyncratic and reflect informal or everyday
oral discourses. How, then, can EL students be supported to develop more appro-
priate forms of mathematical writing, both in terms of reflecting aspects of mathe-
matical discourse like precision, or in terms of reflecting the conventions of written
English? Supporting students to develop mathematical writing ideally should not
be through imposing a set of rules (reflecting a formal model of writing) but instead
should leave space for students’ voices. What strategies, then, might be appropriate?
Chval and Khisty (2009) conducted an ethnographic study of one Grade 5
class of Latino students in the U.S., in which the teacher included an explicit focus
on writing in mathematics and in which students’ mathematical writing visibly
improved over the course of the school year. The focus of Chval and Khisty's analysis was on how the students, over time, came to appropriate written mathematical
discourse, as a result of the conditions created by the teacher. They identified three
patterns that seemed to be significant.
First, they noticed that the teacher created a culture that valued writing in math-
ematics. The teacher used a variety of strategies to do this, including the use of
writing tasks every day in mathematics; the use of meaningful writing assignments
to communicate mathematical thinking and understanding; the use of a drafting
process to improve the quality of mathematical writing; public discussion of aspects
of the writing process; and clear expectations and evaluation criteria (pp. 135–136).
Second, the classroom was rich in mathematical language and meaningful dis-
cussion of mathematics in which students played an active part. This observation is
consistent with the idea that it is not enough for students to read and hear math-
ematical discourse in order to appropriate it for themselves; they need multiple,
meaningful opportunities to talk about and write about mathematics. In the class
observed by Chval and Khisty, the teacher used several strategies to create such an
environment for her students. She used rich mathematical language herself; she
ensured writing in mathematics had a clear purpose; and she ensured that students
“experienced and used rich words and language in context” (p. 136). The students
all spoke Spanish as well as English, and so the teacher made use of her own knowl-
edge of Spanish to support students’ writing in English: For example, she would
Writing in Mathematics Classrooms 111
use English words that have similar forms in Spanish, such as combine (combinar)
or clarify (clarificar).
Third, through the drafting and redrafting process, the teacher engaged in writ-
ten dialogue with her students. She would annotate drafts with questions and sug-
gestions. Much like the drafting process that might be used in creative writing,
the annotations initially focused on clarifying meaning, developing precision, and
organizing ideas, while later annotations focused more on conventions of written
English, such as spelling and punctuation. For early drafts, the teacher used written
questions to elicit clarification, such as “why do you need to build a congruent
triangle?” (p. 14). And she offered general guidance, such as “You would have a bet-
ter explanation if you: 1) reread your work; 2) add details; 3) draw a sketch [. . .]”
(p. 141). Some students worked on as many as eight drafts before a piece of writing
was considered to be ready for final evaluation based on a rubric.
The patterns observed by Chval and Khisty combined to enable students to
make clear progress in writing mathematics and, as a result, in their achievement in
mathematics. The teacher’s strategies made use of various resources, including mul-
tiple discourses (mathematical vocabulary, informal explanations), multiple voices
(such as the students’ voices and the teacher’s voice in the dialogic interaction of the
drafting process), and multiple languages (the teacher and students regularly used
Spanish mixed in with English when discussing their work). The teacher seems
to have found a good balance between centripetal and centrifugal forces, allow-
ing students to use multiple discourses, voices and languages (reflecting centrifugal
forces), while also introducing conventional ways of writing about mathematics
(centripetal forces).
each get?” Through such interaction, the two students and the teacher will learn
more about each other and the different resources each uses. A dialogic approach
gives value to students’ experiences and the resources they bring to their writ-
ing, allowing teachers to work with them to expand their repertoires of ways of
writing about mathematics, without negating what they bring to that writing.
In this way, teachers can support students to negotiate the centripetal and cen-
trifugal forces they encounter in their mathematics classroom.
There is a clear link between writing and assessment in mathematics, and, in
particular, summative evaluation (Morgan, 1998). The majority of evaluation relies
on students’ written work. In reform classrooms, students are often asked to work
on complex problems over extended time periods and write up their work for
evaluation. The example of Curtis and Alex shows that students may be capable of solving a problem yet struggle to write up an explanation, particularly if they are EL
students. It is therefore crucial that explicit attention is given to writing in math-
ematics and that students have opportunities to develop their proficiency in writing
mathematics, so that evaluation more fairly reflects students’ progress.
There is insufficient research on writing in mathematics. It is particularly notice-
able that there is much more research on spoken interaction in mathematics class-
rooms. We know little about how students experience writing in mathematics,
beyond anecdotal evidence that they often find it challenging. Writing is a cogni-
tively demanding activity and, as teachers have long recognized, students’ writing
in mathematics offers fascinating insights into students’ thinking. Researchers have
not yet explored these insights in sufficient depth.
Note
1 There is unfortunately a history of marginalization of indigenous peoples and languages
in Canadian education, including the removal of indigenous students to residential schools
in the nineteenth and twentieth centuries and the lack of status of indigenous languages,
many of which are endangered, as well as divergent varieties of English or French.
References
Bakhtin, M. M. (1981). The dialogic imagination: Four essays (M. Holquist, Ed.; C. Emerson & M. Holquist, Trans.). Austin, TX: University of Texas Press.
Barwell, R. (2005a). Ambiguity in the mathematics classroom. Language and Education, 19(2),
118–126.
Barwell, R. (2005b). Integrating language and content: Issues from the mathematics class-
room. Linguistics and Education, 16(2), 205–218.
Barwell, R. (2015). Language as a resource: Multiple languages, discourses and voices in
mathematics classrooms. In K. Beswick, T. Muir, & J. Wells (Eds.), Proceedings of the 39th
conference of the international group for the psychology of mathematics education (Vol. 2, pp.
97–104). Hobart, Australia: University of Tasmania.
Barwell, R. (2016). Formal and informal mathematical discourses: Bakhtin and Vygotsky,
dialogue and dialectic. Educational Studies in Mathematics, 92(3), 331–345.
Barwell, R., Chapsam, L., Nkambule, T., & Setati Phakeng, M. (2016). Tensions in teaching mathematics in contexts of language diversity. In R. Barwell, P. Clarkson, A. Halai, M. Kazima, J. Moschkovich, N. Planas, M. Setati Phakeng, P. Valero, & M. Villavicencio (Eds.), Mathematics education and language diversity: The 21st ICMI study (pp. 175–192). New York: Springer.
Borasi, R., & Rose, B. (1989). Journal writing and mathematics instruction. Educational Stud-
ies in Mathematics, 20(4), 347–365.
Burton, L., & Morgan, C. (2000). Mathematicians writing. Journal for Research in Mathematics
Education, 31(4), 429–453.
Busch, B. (2014). Building on heteroglossia and heterogeneity:The experience of a multilin-
gual classroom. In A. Blackledge & A. Creese (Eds.), Heteroglossia as practice and pedagogy
(pp. 21–40). Dordrecht, The Netherlands: Springer.
Chval, K. B., & Khisty, L. L. (2009). Bilingual Latino students, writing and mathematics:
A case study of successful teaching and learning. In R. Barwell (Ed.), Multilingualism in
mathematics classrooms: Global perspectives (pp. 128–144). Bristol: Multilingual Matters.
Clarke, D. J., Waywood, A., & Stephens, M. (1993). Probing the structure of mathematical
writing. Educational Studies in Mathematics, 25(3), 235–250.
Gray, E. M., & Tall, D. O. (1994). Duality, ambiguity, and flexibility: A “proceptual” view of
simple arithmetic. Journal for Research in Mathematics Education, 25(2), 116–140.
Halliday, M. A. K., & Matthiessen, C. (2015). An introduction to functional grammar (3rd ed.).
Abingdon, UK: Routledge.
Lea, M. R., & Street, B.V. (2006). The “academic literacies” model: Theory and applications.
Theory into Practice, 45(4), 368–377.
Martin, C. L. (2015). Writing as a tool to demonstrate mathematical understanding. School
Science and Mathematics, 115(6), 302–313.
Misfeldt, M. (2005). Media in mathematical writing: Can teaching learn from research prac-
tice? For the Learning of Mathematics, 25(2), 36–41.
Morgan, C. (1998). Writing mathematically: The discourse of investigation. London: Falmer Press.
Morgan, C. (2001). Mathematics and human activity: Representation in mathematical writ-
ing. Research in Mathematics Education, 3(1), 169–182.
Moschkovich, J. (2002). A situated and sociocultural perspective on bilingual mathematics
learners. Mathematical Thinking and Learning, 4(2&3), 189–212.
Moschkovich, J. (2009). How language and graphs support conversation in a bilingual math-
ematics classroom. In R. Barwell (Ed.), Multilingualism in mathematics classrooms: Global
perspectives (pp. 78–96). Bristol: Multilingual Matters.
O’Halloran, K. (2005). Mathematical discourse: Language, symbolism and visual images. London:
Continuum.
Pimm, D. (1987). Speaking mathematically: Communication in mathematics classrooms. London:
Routledge.
Pimm, D., & Sinclair, N. (2009). Audience, style and criticism. For the Learning of Mathematics,
29(2), 23–27.
Pugalee, D. K. (2001). Writing, mathematics, and metacognition: Looking for connections
through students’ work in mathematical problem solving. School Science and Mathematics,
101(5), 236–245.
Santos, L., & Semana, S. (2015). Developing mathematics written communication through
expository writing supported by assessment strategies. Educational Studies in Mathematics,
88(1), 65–87.
Sfard, A. (2008). Thinking as communicating: Human development, the growth of discourses, and
mathematizing. Cambridge: Cambridge University Press.
Steele, D. (2005). Using writing to access students’ schemata knowledge for algebraic think-
ing. School Science and Mathematics, 105(3), 142–154.
Valdés, G. (2004). The teaching of academic language to minority second language learners.
In A. F. Ball & S.W. Freedman (Eds.), Bakhtinian perspectives on language, literacy, and learning
(pp. 66–98). Cambridge: Cambridge University Press.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.). Cambridge, MA: Harvard University Press.
7
WRITING THE SCIENCE
REGISTER AND MULTIPLE
LEVELS OF LANGUAGE
Implications for English Learners
Focus Points
• The Next Generation Science Standards were developed with the perspective
that “doing” science depends on competent language processing in the four
language systems of listening, speaking, reading, and writing.
• As a group, Hispanic students, including those who are English learners (ELs),
continue to display an achievement gap in science at grades 4, 8, and 12.
• Foundational literacy, much less the academic language and science registers,
cannot be separated from the larger context of the multiple levels of language
within and across the four language systems and their supporting cognitive
infrastructures.
• Science is a blending of natural language, science symbolism, and visual display;
specific language and discourse characteristics of the science register allow
insights into what may make it potentially difficult for all students, but more
so for EL students, to acquire science knowledge or present their scientific
ideas.
• A writing model that focuses on cognitive processes and includes the multi-
ple levels of language is uncommon in the teaching of science content to EL
students.
• Promoting EL student engagement with science requires asking deeper ques-
tions about science vocabulary, incorporating multiple language levels into the
explicit teaching of writing, and providing multiple opportunities to develop
and demonstrate scientific ideas.
Chapter Purpose
This chapter highlights the critical function that the multiple levels of language
play in science writing in general and, specifically, in the thinking, development,
and expression of science literacy by all students, and, in particular for this chapter’s
116 Elaine R. Silliman et al.
knowledge, of the listener, speaker, reader, or writer and the particular situation in
which he/she engages. Thus, foundational literacy, much less the academic language
and science registers, cannot be dissociated from the larger context of the multiple
levels of language within and across the four language systems and their supporting
infrastructures (Wilkinson & Silliman, 2000).
Writing Science
On an individual level, science literacy is an understanding of scientific practices,
the knowledge of disciplinary content, and the perspective that science is a social
process whereby others with membership in a scientific group validate and assign
expertise. Science literacy can extend beyond the individual to communities when
people who are not scientists in an academic sense collaborate to resolve a science-
related problem (NASEM, 2016), such as the source of polluted drinking water.
At the core of science literacy, whether in the form of creating science or utilizing
science information, is foundational literacy, “the ability to access text, construct
meaning, and evaluate newly encountered information in the specific domain of
science” (NASEM, 2016, pp. 15–16). We extend the case for foundational literacy
further with the interrelated premises that the science register, a specialized sub-
category of the academic language register, and its productive engagement with
multiple levels of language are keystones for science literacy.
modes that are acceptable for representing scientific reasoning in writing, such as
providing explanations. An overriding feature of both the disciplinary-general aca-
demic register and the more specialized science register is the linguistic/discourse
complexity of multiple language levels, a feature that increases the likelihood that
EL students will encounter major obstacles in formulating well-knitted explana-
tions. For example, think about activities EL students must initiate to “unpack”
complicated instructions for crafting an explanation, such as the following sentence
devised for the chapter: Explain why Einstein, who was a renowned German physicist who published influential papers over multiple decades, never claimed that he was omniscient.
First, they must have available the requisite concepts and background knowledge
(inferring who and what Einstein represented in the history of physics). Second, if
this information is accessible, they must apply a flexible repertoire of cognitive and
linguistic/discourse resources to hold in mind and efficiently manage the syntactic
challenge of interpreting a sentence with a long-distance dependency. This dependency has two embedded relative clauses marked with who and a series of elaborated noun phrases (ENPs) (in brackets) (who was [a renowned German physicist] who published [influential papers] over [multiple decades]). Any or all of these constructions can sidetrack comprehension (Scott & Koonce, 2014). At the same time, students must uncover the lexical and semantic relationships embedded in the main and subordinate clauses, for example, the complex morphological derivation omniscient. Here, students must (a) deconstruct the word into its base (science); (b) know its general meaning; (c) be familiar with the Latin origin of the prefix omni- (in Spanish and English the prefix is identical; hence EL students who are Spanish speakers may be able to draw on cognate knowledge [omnisciente] to aid in interpretation of the word meaning); and (d) lastly, overcome the opaqueness of the semantic relationship between science and omniscient through understanding that the derivation
represents a subword dual shift: in pronunciation (phonological level) and spelling
(orthographic level). Finally, and at the same time, EL students must continuously
reassemble these language levels into topic units at the text level, if they are to write
a coherent explanation.
Lastly, an effective instructional model for developing the academic language
proficiency of EL students does not exist (August, Goldenberg, Saunders, &
Dressler, 2010). For many EL students, who may be recent immigrants and do
not bring grade-level background knowledge and a repertoire of English aca-
demic language to the science classroom (Goodrich & Lonigan, 2017), learn-
ing to write science well may require a triple border crossing. They must cross
over from their home language and everyday conversational register to steer-
ing through the linguistic/discourse complexities of the academic register to
navigating the multileveled nuances of a disciplinary-specific science register
(Yore & Treagust, 2006). For students crossing these borders, Lee, Quinn, and
Valdés (2013) make the case that initially embedding the language of the science
classroom in the everyday conversational register, a resource that all EL students
possess, may serve as a bridge for crossing over to writing the language of the
science register (see also Buxton, Cardozo-Gaibisso, Xia, & Li, 2017). In sum,
our view is that writing is an essential conceptual medium for learning how to
communicate science content.
Writing the Science Register 119
The cascading nature of the writing model (Hayes & Berninger, 2014; Hayes &
Olinghouse, 2015) suggests three conclusions for all students, including EL stu-
dents. First, for students to learn to write in a variety of science genres, including
their development of academic content knowledge, teacher education at elemen-
tary and secondary levels must incorporate explicit and systematic instruction on
composing in the academic and science registers. Second, varied and in-depth experiences with writing should begin early. One reason is that writing reorganizes
“existing reading and writing systems to create a new functional system . . . that
requires switching back and forth across the roles of reader and writer in creating
a new text” (Berninger & Chanquoy, 2012, p. 65). Third, multiple language levels
are intertwined in oral and written language. Research findings indicate that the
oral and written systems concurrently employ distinctive and common processes
(Berninger & Abbott, 2010).
The next section offers an example of how the writing model applies to
construction of a chemistry lab report, produced by a 15-year-old female who
is a high school sophomore and a monolingual English speaker. The example
illustrates how learning to write the science register requires knowing how to
layer multiple language levels, as demonstrated by a student still in the process of
mastering these levels. The requirements of the science register are a matter of degree of linguistic precision and of alignment with the task at hand. This is the case for EL students and monolingual English speakers alike. In both cases, knowledge of the conversational English register may provide a foundation upon which to build the science register.
In this lab, I predicted that it would have similar results to the oxygen lab because
the procedures were the same. The importance of this lab is for the students to be
familiarized with metal-acid reaction that produces hydrogen gas, water displacement,
and several properties of hydrogen gas. Safety precautions in this lab consist of wearing
goggles and aprons to ensure that no material gets on our clothes or in our eyes; make
sure that hair is tied back so it does not get caught in the experiment; and make sure that
when the graduated cylinder is not in use that it is laying down on the table because it
may break.
To begin the lab my lab partner and I filled the pneumatic trough with water only up to
overflow spout. We placed the over flow spout of the sink, so if water were to pour out
it would go in the sink. Next, we took the gas collecting bottle and filled it with water
from the sink until it was overflowing. Then we took the square glass plate and placed
it on top of the collecting bottle. Then, we placed the bottle and the glass in trough;
being careful not to let any air in the bottle. Making sure that the neck of the bottle
was submerged in water, I slid the glass out from underneath of the bottle. I placed the
bottle above the indentation on the trough so the tubing would be allowed to enter the
bottle. I then obtained the Erlenmeyer flask and placed it on the ring stand. In order to
keep it in place I tightened the clamp. Then I collected some hydrochloric acid, trying
to measure it as close to 30mL as possible using a graduated cylinder. I poured the HCl
into the flask and then took the tubing that was attached to the stopper and placed it
into the collecting bottle through the indentation. We had to be careful that the tubing
was not pinched because then material would not be able to be transferred through
the tubing. I then made my way over to the side counter to get 3 pieces of mossy zinc
and placed it into the small plastic beaker. I quickly and carefully poured the mossy
zinc into the flask containing the HCl and then placed the rubber stopper on the flask
immediately. When the bottle was filled with gas, bubbles started coming out of
the collecting bottle. This was our sign to remove the tubing from the bottle
and then we slid the square glass plate under the collecting bottle and took
it out of the trough being careful not to let any air out. We placed the bottle on
the table with the neck facing down and the glass lying on the table. Then to test the
gas in the bottle we lit a wood stick on fire and lifted up the bottle and placed the stick
underneath it. This caused a loud popping sound to occur.
The single replacement reaction occurred with the burning of metals (corrode) inside of
the flask. The equation for this reaction is HCl(aq) + Zn(s) → ZnCl2(aq) + H2(g). The reactants
are hydrochloric acid and zinc metal and the products is Zinc Chloride and hydrogen
gas. The zinc pushes ion hydrogen atom out of the compound and replaces it in this
reaction. The second equation used in this lab is synthesis H2(g) + O2(g) → H2O + heat.
The reactants in this equation are the hydrogen gas and oxygen from the atmosphere
that yields water and heat. This is an exothermic reaction because the products needed
heat to be added in order to equal the reactants. Some traits that we acquired through
this lab was that hydrogen is colorless, ordorless, tasteless, and is less dense than the
atmosphere. Hydrogen doesn’t combust but acts violently with heat which causes an
explosion. The excess heat increases kinetic energy and the molecules slam against the
side causing a noise. This shows that hydrogen reacts easily when given a little bit of
activation energy.
After completing the hydrogen lab, I found that my prediction was wrong. The result
of the hydrogen lab was far different from the oxygen lab. One difference is that when
we took the collecting bottle out of the trough we placed it faced down whereas in
the oxygen lab we faced the neck of the bottle upwards. This was because hydrogen
has a smaller mass and more molecules would be able to escape out of the tiny spaces
(effusion). Placing the neck of the bottle downwards would minimize the amount of
effusion because the weight would help seal the bottle. Also, in this lab the hydrogen
did not combust, but it did in the oxygen lab. The hydrogen was very reactive to the
heat and caused an explosion within the bottle, while the oxygen allowed for the stick
to start fire and burn. Lastly, in the end of the oxygen lab there was oxygen left in the
gas collecting bottle, while in the hydrogen lab there was hydrogen left in the collecting
bottle. This was because there were different reactants in the lab causing a different these
different elements to be produced in the gas collecting bottle. In this lab, I learned
the difference between how hydrogen and oxygen react with heat and what
material produce them. Also, another example of water displacement is used
when measuring the volume of a substance. By subtracting the final volume of the
water by the starting volume the water displacement is calculated. Sources of error in
this lab may have consisted of contaminated HCl or Zinc. Faulty tubing that may have
affected the transferring of materials, and hydrogen escaping from the collecting bottle
due to effusion.
Bolded text indicates NGSS key features
ideas to include in a text will not likely culminate in any visible outcome until
combined with the process of translating ideas into language (Hayes & Oling-
house, 2015). Hence, Jenna’s familiarity with the necessary science concepts seems
interconnected with her ability to generate the relevant ideas and translate them
into conceptual relationships, which are then transformed into language with an
emerging knowledge building approach to expository text production, a hallmark
of the more literate writer. As these authors observe, learning through knowledge
building “like scientific discovery and theorizing—is a process of working toward
more complete and coherent understanding” (p. 38), in this case through writing.
TABLE 7.2 (Continued)

Level: Text (Clause Package)

Thematic Introduction
• In this lab I predicted that it would have similar results to the oxygen lab because the procedures were the same. The importance of this lab is for the students to be familiarized with metal-acid reaction that produces hydrogen gas, water displacement, and several properties of hydrogen gas.

Topic Shift (Move-on)
• To begin the lab my lab partner and I filled the pneumatic trough with water only up to overflow spout.

Elaboration
• We placed the over flow spout of the sink, so if water were to pour out it would go in the sink.

Conclusion
• Lastly in the end of the oxygen lab there was oxygen left in the gas collecting bottle, while in the hydrogen lab there was hydrogen left in the collecting bottle. This was because there were different reactants in the lab causing . . . these different elements to be produced in the gas collecting bottle.
a. Word/lexical-semantic level—Words in parentheses are included only to provide a linguistic context
for the general or derivational meaning displayed in the columns. Disciplinary-general items represent
lower frequency words “that are common across various content-area texts” (Beck, McKeown, &
Kucan, 2013, p. 132) and support but are not critical for the text topics in which they appear (Cox-
head, 2000, p. 214), such as stopper, graduated, and plastic, among others. In contrast, academic vocabulary
that is technical or specialized and may have German, Latin, and Greek origins, like Erlenmeyer flask,
kinetic, hydrochloric, and exothermic, occur within a specific content area, and their meaning often can be
variable depending on the disciplinary domain (Greene & Coxhead, 2015). For example, compound, as
applied in Jenna’s report, has a meaning that differs from the legal meaning of compound.
b. Lexical-syntactic level (ENPs)—ENPs provide important information about the sequence of permissible multiword relationships among English words (Arnon, McCauley, & Christiansen, 2017). At a
minimum, the ENP consists of a determiner + head noun, such as the procedures. However, because
this form is high frequency, ENP types were identified as at least a modifier either preceding the
head noun (a premodification) or following the head noun (a postmodification). Additionally, ENPs
can assume highly dense architecture when pre-and postmodifications co-occur (modifier(s) precede
and follow the head noun), as illustrated in the third column, which increases the depth of head
noun-modifier relationships. Of note, relative clauses can function as postmodifiers; however, their
occurrence in written English accounts for only 10–15% of all postmodifiers (Biber & Clark, 2002).
c. Syntactic level (subordinate clauses)—Main (independent) clauses are obligatory in any sentence construction for meaning to take place. In contrast, subordinate (dependent) clauses cannot stand alone and must be connected to a main clause. The three types of subordinate clauses are (Justice & Ezell,
2016): (1) noun (or nominal) clauses that serve as nouns and may function as subjects, direct or indirect
objects, or a complement; (2) relative (or adjective) clauses, which follow the noun or pronoun that
they modify in the main clause; and (3) adverb clauses, which give information about time, place, man-
ner, reason, and conditions. Subordinate clauses are notated with brackets. For economy, subordinate
clauses from the report’s last paragraph are not listed.
d. Syntactic level (embedding chain)—The function of an e-chain is to incorporate ideas at increasing
depths (Karlsson, 2007).
e. Text level (clause package)—A clause package is a text (discourse) unit that consists of several clauses
linked by thematic, syntactic, or topical criteria (Katzenberger, 2004). Its primary function is to pro-
vide coherence or thematic continuity. A clause package consists of an introduction/move-on, which
serves to introduce a new topic; elaboration/expansion that develops a previously introduced topic
with more detail; and a conclusion that summarizes the information presented and must be linked to
a move-on or elaboration.
Writing the Science Register 127
Nagy, & Berninger, 2018). Transcription is also bound up with the subword level, as the lexical-syntactic "strings" produced as the output of translation must be honed
into spelling patterns at the word, syntactic, and text levels (Hayes & Berninger,
2014). Finally, to some extent, motivation depends on the degree of ease with the
transcription process, since more problems with transcription can impede writing
fluency (Hayes, 2011).
Linguistic Levels
The linguistic levels represent the local context of writing: the words, phrases, and
syntax that are stitched and re-stitched to “hang together” and provide cohesion.
How Jenna managed these local processes offers a window into her efforts to con-
struct complexity and cohesiveness.
• The subword and word levels are intertwined in Jenna’s complexity building
and point to the relevance of rich lexical representations underlying word
knowledge in the science register. To begin, all of her words are transcribed
correctly, an outcome that may be related to a combination of spell-checker use
and the lexical scope of disciplinary-general and disciplinary-specific mean-
ings, particularly as represented in derivations (see Table 7.2). Derivations are
morphological units that convey new meanings through attachment of affixes,
whether prefixes and/or suffixes (e.g., replacement), to the root word (place).
They are critical to academic vocabulary learning for at least two reasons. First,
morphologically complex derivations often alter syntactic roles as occurs with
dent/indent/indentation; hence, morphemic units unite meaning and form (Car-
lisle, 2007), and students must learn how to manage both dimensions. Second,
metalinguistic awareness of these meaning-form relationships correlates with
vocabulary and spelling knowledge (Nagy, Carlisle, & Goodwin, 2014). This
means that rich experiences with translation of science meanings and their
transcription are essential for strengthening connections among the subword,
word, and syntactic levels. These experiences seem necessary if students are
to attain “tighter links between the representations of sounds, spellings, and
meanings of words and morphemes (whether roots or affixes) and the spelling
of words in morphemic chunks” (Nagy et al., 2014, p. 5), such as the chunked
spelling of the morphemic units in re-act-ive.
• ENPs have a critical function: to unlock “meaning in text because they often
serve the purpose of identifying and further specifying complex concepts”
(DiCerbo et al., 2014, p. 454). As Table 7.2 shows, ENPs are the primary lin-
guistic device that Jenna employs at the phrase level to compact more precise
conceptual information. At the same time, these complex ENPs contributed to
increased length of expression, for example ENPs with nested relative clauses.
The condensing and expansion operations support two interrelated assump-
tions. One is that "the complexity of academic writing is phrasal" (Biber,
Gray, & Poonpon, 2011, p. 22) rather than clausal. The other is that, when
combined with academic vocabulary, ENPs are a major device for constructing
informational density in academic texts (Biber et al., 2011; Ravid & Berman,
128 Elaine R. Silliman et al.
2010) due to their function in uniting lexical and syntactic complexity (Silli-
man, Brea-Spahn, & Danzak, 2016). Of note, although some data are available
on the development of more complex ENPs in the writing of particular top-
ics from age 9 years upward (e.g., Ravid & Berman, 2010), there is minimal
research on the production of multifaceted ENPs, such as postmodifications
and the even more complex pre- and postmodifications found in Jenna's science writing (see Table 7.2, lexical-syntactic level, ENPs).
• Regarding the larger but still local level of syntax, complex sentences are, as a rule, longer (Balthazar & Scott, 2017), echoing the point just made about ENPs: condensing and increased length frequently co-exist. Condensing takes place through embedding, the structural insertion of
ideas via noun, relative, and adverbial subordinate clauses resulting in a vertical
(hierarchical) relationship that yields a sense of depth to the content being
expressed. Apparent from Table 7.2 is that Jenna engages in an intricate weav-
ing of subordinate clauses. The depth of this interweaving is displayed most
prominently in the 64-word example of an embedding chain (e-chain; Karls-
son, 2007), which illuminates the meaning of complexity. Here, the e-chain
that Jenna builds consists of: (a) a “less deep” main clause (Safety precautions in
this lab consist of wearing goggles and aprons); (b) center-embedded subordinate
clauses ([that hair is tied back] and [so it does not get caught in the experiment]);
and, in relation to the main clause, (c) increasingly “deeper” multiple center-
embedded clauses ([when the graduated cylinder is not in use] [that it is laying down
on the table] [because it may break]). E-chains rarely have been identified in the
school-based writing literature; however, they are important for learning “syn-
tactic control” or the metalinguistic ability to generate “a variety of sentences
that clearly express an intended meaning” (Saddler, 2012, p. 9).
next one, conveying “I am moving-on.” These move-ons can stand on their own,
but expansions and conclusions must be linked to another move-on or expansion.
The coherence of an academic composition therefore depends on the degree of connection among these three components as the text evolves.
Well-developed clause packages begin to emerge during the high school years
(Katzenberger, 2005), and this emergence correlates with the frequency and quality
of writers’ experiences with expository texts throughout their schooling (Gra-
ham, Harris, & Chambers, 2016), a possible reason why EL students and their
non-EL counterparts may continue to produce underdeveloped academic texts.
Even skilled adult writers have difficulty with conclusions, perhaps because they
require a generalized synthesis of the text written so far (Hayes & Olinghouse,
2015). Jenna’s framing of her chemistry report (see Table 7.1) can be character-
ized as fully hierarchical (both the global and local levels are linked to the larger
theme), but it does not yet meet criteria for a fully developed expository text.
That is, she produces instances of top-down organization through clause packages,
which by definition are nested (move-on-expand-conclusion or a more truncated
move-on-expand). However, her global text organization and information flow are
not always signaled by explicit transitions (Katzenberger, 2004). These transitions
vary from clear-cut (In this lab, I predicted, to begin, lastly, after completing the hydrogen
lab, In this lab I learned) to ambiguous (e.g., the use of then as a general discourse signal to mark the sequence of procedures) to implicit, where the reader must infer that a
topic transition is occurring.
Best Practices
This section describes best practices, which: (1) are consistent with the conceptual
frame presented in this chapter; (2) offer potential curriculum effectiveness (Gra-
ham et al., 2016); and (3) align with the NGSS, CCSS, and the reauthorized ESSA
regulations for progress monitoring of EL students.
1. The word’s academic role, such as how often it occurs in the science register
being taught.
2. The word’s relationship to other words that the student knows or needs to
know; for example, word families are easier for students to infer meanings
from, such as place-replacing-replaced-replacement (see Greene & Coxhead (2015)
for science word families), than are words with limited family relationships,
such as theme-thematic. Teachers can make word-family relationships explicit, thus supporting students' development of metalinguistic awareness
(see Snow, 2010).
3. The extent to which the student can use the word in a variety of situations and,
if not, the extent to which the word can be explained in a “student-friendly”
way (see Beck et al., 2013) to make meaning conceptually transparent.
4. The extent to which the student needs to know the word for the specific writ-
ing/reading activity and the frequency with which the word will be encoun-
tered again.
the register and its use in both texts and tests and whether they provide opportu-
nities for students to be aware of multiple representations of science knowledge,
including the science register.
At any given point in science learning, students may not be in full command—
for production and comprehension—of the science register until they understand
the science. Teachers might find it beneficial to construct tasks for which students
deploy all of their diverse repertoires of science knowledge and skills. Since science
proficiency includes a blend of understanding about how and when to use scien-
tific symbolism, natural language, and visual displays, students can make connec-
tions among all three semiotic systems; each has its own conventions and each poses
specific challenges (Lemke, 1990; O’Halloran, 2005). EL students can be encour-
aged to mix and mesh elements of their oral and written language, thus utiliz-
ing all of their linguistic and cultural resources as tools to support their academic
language development and literacy learning (Cummins, 2014). That is, while con-
structing their scientific understanding, students should be encouraged to commu-
nicate those understandings about the multiple levels of language by employing all
of their communicative resources—linguistic, symbolic, and gestural. For example,
writing individual science journal entries, drawing scientific procedures, or play-
ing charades to enact meanings of words learned might be beneficial (Ardasheva &
Tretter, 2017).
Best practices for science instruction require that, to encourage engagement,
EL students focus first and foremost on grasping and solving the scientific prob-
lem at hand, using all of their resources—language and nonlanguage. The practice
of segregating EL students in one location during science instruction, so that the
material can be “watered down” and focused primarily on rote memorization and
procedures, should be avoided for all learners.
and for varied purposes must be presented, as these are positively related to writing achievement (Mo & Troia, 2017).
In summary, as Jenna's chemistry report illuminates, integrating coherence and cohesion when generating expository texts in the science register requires advance planning combined with an array of complex writing processes. Writers must also constantly evaluate how adequately they are achieving their writing goal and revise in flexibly adaptive ways, an ability that can be explicitly taught (see Berninger & Chanquoy, 2012).
transcribed into written language. Writing plays a central role in the communica-
tion of ideas, a point highlighted by the NGSS (2013). Nonetheless, writing has
received minimal attention by researchers investigating how students learn to do
science through literacy and even less notice in how EL students learn to employ
writing effectively in scientific activities. Effective writing in any disciplinary domain
requires sensitivity to the multiple levels of language. In this chapter we focused on
five interrelated topics: (1) the implications of the NGSS (2013) for learning to write
in the science register; (2) the multiple dimensions of language (Abbott, Berninger, & Fayol, 2010) as central to our understanding of the varied purposes of science literacy;
(3) an approach to science as a specialized academic language register that all students,
but especially EL students, must master to attain scientific literacy; (4) a writing
model that interfaces with the multiple language levels of the science register, further
illustrated through a chemistry report drafted by a 15-year-old monolingual English-
speaking female; and (5) a discussion of best practices that have the potential to meet
the individual writing needs of EL students in doing science. The chapter suggests
instructional directions for teachers to consider as they orchestrate multiple language
levels in science writing and address the needs of EL students who are learning to
navigate writing within the science register.
Implications
Educational reform is in a period of uncertainty. The political polarization in the
United States is contributing to questions about what will be possible to accom-
plish with the new ESSA, the CCSS, and the NGSS. For example, it is unknown
whether the best intentions of ESSA, CCSS, and NGSS will be implemented in
ways that will narrow the existing access–achievement gap and lead to
fulfillment of the vision in which American students excel internationally. Even if
the ESSA is implemented fully by states, educational reform will take more than a
decade, perhaps even a generation; and this time line assumes that the policy is not
dramatically altered with each new presidency and Congress.
We do not have the crystal ball that would allow us to see into the future. We
anticipate that our colleagues in science, literacy, and EL education will continue
to appreciate the centrality of academic language learning for academic success. In
real-life contexts, language operates as a synergetic system of multiple levels, always
in the service of communication; therefore, it is not possible to separate language
and communication from their social and academic discourse functions.
References
Abbott, R. D., Berninger, V. W., & Fayol, M. (2010). Longitudinal relationships of levels of
language in writing and between writing and reading in grades 1 to 7. Journal of Educa-
tional Psychology, 102, 281–298.
Ardasheva, Y., & Tretter, T. (2017). Developing science-specific, technical vocabulary of high
school newcomer English learners. International Journal of Bilingual Education and Bilingual-
ism, 20, 252–271.
Arnon, A., McCauley, S., & Christiansen, M. (2017). Digging up the building blocks of language: Age-of-acquisition effects for multiword phrases. Journal of Memory and Language, 92, 265–280.
August, D., Goldenberg, C., Saunders, W. M., & Dressler, C. (2010). Recent research on
English language and literacy instruction. In M. Shatz & L. C. Wilkinson (Eds.), The edu-
cation of English language learners: Research to practice (pp. 272–297). New York: Guilford.
Avenia-Tapper, B., & Llosa, L. (2015). Construct relevant or irrelevant? The role of linguistic
complexity in the assessment of English language learners’ science knowledge. Educational
Assessment, 20, 95–111.
Bahr, R. H., Silliman, E. R., & Berninger, V. W. (2009). What spelling errors have to tell about
vocabulary learning. In C. Wood & V. Connelly (Eds.), Contemporary perspectives on reading
and spelling (pp. 109–129). New York: Routledge.
Bailey, A. L., & Butler, F. A. (2007). A conceptual framework of academic English language for
broad application to education. In A. L. Bailey (Ed.), The language demands of school: Putting academic English to the test (pp. 68–102). New Haven, CT: Yale University Press.
Balthazar, C. H., & Scott, C. M. (2017). Complex sentence intervention. In R. J. McCauley,
M. E. Fey, & R. B. Gillam (Eds.), Treatment of language disorders in children (2nd ed., pp.
349–386). Baltimore, MD: Paul H. Brookes.
Beck, I. L., McKeown, M. G., & Kucan, L. (2013). Bringing words to life: Robust vocabulary
instruction (2nd ed.). New York: Guilford Press.
Bedore, L., & Peña, E. (2011). Ways with words: Learning a second language vocabulary. In
M. Shatz & L. C. Wilkinson (Eds.), The education of English language learners: Research to
practice (pp. 87–107). New York: Guilford Press.
Berninger, V. W., & Abbott, R. D. (2010). Listening comprehension, oral expression, reading
comprehension, and written expression: Related yet unique language systems in grades
1, 3, 5, and 7. Journal of Educational Psychology, 102, 635–665.
Berninger, V. W., & Chanquoy, L. (2012). What writing is and how it changes across early and
middle childhood development. In E. L. Grigorenko, E. Mambrino, & D. D. Preiss (Eds.),
Writing: A mosaic of new perspectives (pp. 65–84). New York: Psychology Press.
Biber, D., & Clark, V. (2002). Historical shifts in modification patterns with complex noun phrase structures: How long can you go without a verb? In T. Fanego, M. J. Lopez-Couso, & J. Perez-Guerra (Eds.), English historical syntax and morphology (pp. 43–66). Amsterdam:
John Benjamins.
Biber, D., Gray, B., & Poonpon, K. (2011). Should we use characteristics of conversation
to measure grammatical complexity in L2 writing development? TESOL Quarterly, 45,
5–35.
Brown, B. A., & Ryoo, K. (2008). Teaching science as a language: A “content-first” approach
to science teaching. Journal of Research in Science Teaching, 45, 529–553.
Buxton, C. A., Allexsaht-Snider, M., Suriel, R., Kayumova, S., Choi, Y. J., Bouton, B., & Baker,
M. (2013). Using educative assessments to support science teaching for middle school
English-language learners. Journal of Science Teacher Education, 24, 347–366.
Buxton, C., Cardozo-Gaibisso, L., Xia, Y., & Li, J. (2017). How perspectives from linguistically
diverse classrooms can help all students unlock the language of science. In L. Bryan & K.
Tobin (Eds.), 13 questions: Reframing education’s conversation: Science. New York: Peter Lang.
Carlisle, J. F. (2007). Fostering morphological processing, vocabulary development, and read-
ing comprehension. In R. K. Wagner, A. E. Muse, & K. R. Tannenbaum (Eds.), Vocabulary
acquisition: Implications for reading comprehension (pp. 78–103). New York: Guilford Press.
Common Core State Standards Initiative. (2010). Common Core State Standards for English
language arts and literacy in history/social studies, science, and technical subjects. Retrieved from
www.corestandards.org/assets/CCSSI_ELA%20Standards.pdf
Conrad, N. K., Gong, Y., Sipp, L., & Wright, L. (2004). Using text talk as a gateway to cul-
turally responsive teaching. Early Childhood Education Journal, 31, 187–192. https://doi.
org/10.1023/B:ECEJ.0000012137.43147.af
Coxhead, A. (2000). A new academic word list. TESOL Quarterly, 34, 213–238.
doi:10.2307/3587951
Cummins, J. (2014). Beyond language: Academic communication and student success. Lin-
guistics and Education, 26, 145–154.
Davies, M. (2009). The 385+ million word corpus of contemporary American English
(1990-2008+): Design, architecture and linguistic insight. International Journal of Corpus
Linguistics, 14, 159–190.
DiCerbo, P. A., Anstrom, K. A., Baker, L. L., & Rivera, C. (2014). A review of the literature
on teaching academic English to English language learners. Review of Educational Research,
84, 446–482.
Fang, Z. (2006). The language demands of science reading in middle school. International
Journal of Science Education, 28, 491–520.
Fang, Z., Scheppegrell, M. J., & Moore, J. (2014). The linguistic challenge of learning across
academic disciplines. In C. A. Stone, E. R. Silliman, B. J. Ehren, & G. P. Wallach (Eds.),
Handbook of language and literacy: Development and disorders (2nd ed., pp. 302–322). New
York: Guilford Press.
Fayol, M. (2016). From language to text: The development and learning of translation. In
C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (2nd ed.,
pp. 130–143). New York: Guilford Press.
Frantz, R. S., Starr, L. E., & Bailey, A. L. (2015). Syntactic complexity as an aspect of text
complexity. Educational Researcher, 44, 387–393.
Gardner, D., & Davies, M. (2014). A new academic vocabulary list. Applied Linguistics, 35, 305–327.
Gibbons, S. P. (2003). Mediating language learning: Teacher interactions with ESL students in
a content-based classroom. TESOL Quarterly, 37, 247–273. doi:10.2307/3588504
Goodrich, J. M., & Lonigan, C. J. (2017). Language-independent and language-specific aspects
of early literacy: An evaluation of the common underlying proficiency model. Journal of
Educational Psychology, 109(6), 782–793. http://dx.doi.org/10.1037/edu0000179
Graham, S., Harris, K. R., & Chambers, A. B. (2016). Evidence-based practice and writing
instruction: A review of reviews. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.),
Handbook of writing research (2nd ed., pp. 211–226). New York: Guilford Press.
Greene, J. W., & Coxhead, A. (2015). Academic vocabulary for middle school students: Research-
based lists and strategies for key content areas. Baltimore, MD: Paul H. Brooks Publishing.
Gunel, M., Hand, B., & McDermott, M. A. (2009). Writing for different audiences: Effects
on high-school students’ conceptual understanding of biology. Learning and Instruction,
19(4), 354–367.
Hayes, J. R. (2011). Kinds of knowledge-telling: Modeling early writing development. Journal
of Writing Research, 3, 73–92.
Hayes, J. R., & Berninger, V. (2014). Cognitive processes in writing: A framework. In B. Arfé,
J. Dockrell, & V. Berninger (Eds.), Writing development and instruction in children with hearing,
speech, and language disorders (pp. 3–15). New York: Oxford University Press.
Hayes, J. R., & Olinghouse, N. G. (2015). Can cognitive writing models inform the design of
the Common Core State Standards? The Elementary School Journal, 115, 480–497.
James, K. H., Jao, R. J., & Berninger, V. (2016). The development of the multileveled writing
systems of the brain. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of
writing research (2nd ed., pp. 116–129). New York: Guilford Press.
Justice, L., & Ezell, H. (2016). The syntax handbook: Everything you learned about syntax but forgot
(2nd ed.). Austin, TX: Pro-Ed.
Kachchaf, R., Noble, T., Rosebery, A., O'Connor, C., Warren, B., & Wang, Y. (2016). A closer
look at linguistic complexity: Pinpointing individual linguistic features of science multiple-
choice items associated with English language learner performance. Bilingual Research
Journal, 39, 152–166.
Karlsson, F. (2007). Constraints on multiple initial embedding of clauses. International Journal
of Corpus Linguistics, 12, 107–118.
Katzenberger, I. (2004). The development of clause packaging in spoken and written texts.
Journal of Pragmatics, 36, 1921–1948.
Katzenberger, I. (2005). The super-structure of written expository texts: A developmental
perspective. In D. Ravid & B. Shyldkrot (Eds.), Perspectives on language and language development: Essays in honor of Ruth A. Berman (pp. 327–336). New York: Springer.
Lee, O., Mahotiere, M., Salinas, A., Penfield, R. D., & Maerten-Rivera, J. (2009). Science
writing achievement among English language learners: Results of three-year interven-
tion in urban elementary schools. Bilingual Research Journal, 32, 153–167.
Lee, O., Quinn, H., & Valdés, G. (2013). Science and language for English language learners
in relation to Next Generation Science Standards and with implications for Common
Core State Standards for English language arts and mathematics. Educational Researcher,
42, 223–233.
Lemke, J. L. (1990). Talking science: Language, learning, and values. Norwood, NJ: Ablex Publish-
ing Corporation.
Miller, J. (2009). Teaching refugee learners with interrupted education in science: Vocabulary,
literacy and pedagogy. International Journal of Science Education, 31, 571–592.
Mo, Y., & Troia, G. (2017). Predicting students’ writing performance on the NAEP from
student- and state-level variables. Reading and Writing, 30, 739–770.
Moll, L., Amanti, C., Neff, D., & González, N. (1992). Funds of knowledge for teaching:
Using a qualitative approach to connect homes and classrooms. Theory into Practice, 31,
132–141.
Nagy, W. E., Carlisle, J. F., & Goodwin, A. P. (2014). Morphological knowledge and literacy
acquisition. Journal of Learning Disabilities, 47, 3–12.
Nagy, W. E., & Hiebert, E. H. (2011). Towards a theory of word selection. In M. L. Kamil,
P. D. Pearson, E. B. Moje, & P. Afflerbach (Eds.), Handbook of reading research (Vol. IV). New
York: Routledge.
Nagy, W., & Townsend, D. (2012). Words as tools: Learning academic vocabulary as language
acquisition. Reading Research Quarterly, 47, 91–108.
National Academies of Sciences, Engineering, and Medicine. (2016). Science literacy:
Concepts, contexts, and consequences. Washington, DC: The National Academies Press.
doi:10.17226/23595
Next Generation Science Standards. (2013). Practice 8 obtaining, evaluating, and communicating
information. Retrieved from www.nextgenscience.org/sites/default/files/Appendix%20
F%20%20Science%20and%20Engineering%20Practices%20in%20the%20NGSS%20-
%20FINAL%20060513.pdf
Next Generation Science Standards. (2016). Phenomena. Retrieved from www.nextgenscience.
org/resources/phenomena
O’Halloran, K. L. (2005). Mathematical discourse: Language, symbolism and visual images. Lon-
don/New York: Continuum.
O’Halloran, K. L. (2015). The language of learning mathematics: A multimodal perspec-
tive. The Journal of Mathematical Behavior, 40, 63–74. http://dx.doi.org/10.1016/j.
jmathb.2014.09.002
Parkinson, A., Doyle, J., Cowie, B., Otrel-Cass, K., & Glynn, T. (2011). Engaging with chil-
dren’s science learning home learning books. Teaching and Learning, 1, 3–9.
Perin, D., De La Paz, S., Piantedosi, K. W., & Peercy, M. M. (2017). The writing of language
minority students: A literature review on its relation to oral proficiency. Reading and Writing Quarterly, 33(5), 465–483. http://dx.doi.org/10.1080/10573569.2016.1247399
Piolat, A., Olive, T., & Kellogg, R. T. (2005). Cognitive effort during note taking. Applied
Cognitive Psychology, 19, 291–312.
Ravid, D., & Berman, R. A. (2010). Developing noun phrase complexity at school age:
A text-embedded cross-linguistic analysis. First Language, 30, 3–26.
Ravid, D., Dromi, E., & Kotler, P. (2010). Linguistic complexity in school-age text produc-
tion: Expository versus mathematical discourse. In M. A. Nippold & C. M. Scott (Eds.),
Expository discourse in children, adolescents, and adults (pp. 123–154). New York: Psychology
Press.
Ray, A. B., Graham, S., Houston, J. D., & Harris, K. R. (2016). Teachers' use of writing to sup-
port students’ learning in middle school: A national survey in the United States. Reading
and Writing, 29, 1039–1068.
Richards, T. L., Berninger, V. W., & Fayol, M. (2012). The writing brain of normal child writ-
ers and children with writing disabilities: Generating ideas and transcribing them through
the orthographic loop. In E. L. Grigorenko, E. Mambrino, & D. D. Preiss (Eds.), Writing:
A mosaic of new perspectives (pp. 85–105). New York: Psychology Press.
Saddler, B. (2012). Teacher’s guide to effective sentence writing. New York: Guilford Press.
Schleppegrell, M. J., & Go, A. L. (2007). Analyzing the writing of English learners: A func-
tional approach. Language Arts, 84, 529–538.
Scott, C., & Balthazar, C. (2010). The grammar of information: Challenges for older students with language impairments. Topics in Language Disorders, 30, 288–307.
Scott, C. M., & Koonce, N. M. (2014). Syntactic contributions to literacy learning. In C. A.
Stone, E. R. Silliman, G. P. Wallach, & B. J. Ehren (Eds.), Handbook of language and literacy:
Development and disorders (2nd ed., pp. 283–301). New York: Guilford Press.
Silliman, E. R., Bahr, R. H., Nagy, W., & Berninger, V. (2018). Language bases of spelling
in writing during early and middle childhood: Grounding applications to struggling
writers in typical writing development. In B. Miller, P. McCardle, & V. Connelly (Eds.),
Writing development in struggling learners: Understanding the needs of writers across the lifecourse
(pp. 99–119). Leiden, The Netherlands: Brill.
Silliman, E. R., Brea-Spahn, M. R., & Danzak, R. L. (2016, November). Academic writing in
math and social studies by students with LLD: Orchestrating language complexity. Short course
presented at the Annual Convention of the American Speech-Language-Hearing Asso-
ciation, Philadelphia, PA.
Silliman, E., & Wilkinson, L. C. (2015). Challenges of the academic language register for
students with language learning disabilities. In R. Bahr & E. R. Silliman (Eds.), Routledge
handbook of communication disorders (pp. 291–302). New York/Oxford: Routledge Taylor
& Francis.
Snow, C. E. (2010). Academic language and the challenge of learning about science. Science,
328(5977), 450–452.
Snow, C. E., Lawrence, J. F., & White, C. (2009). Generating knowledge of academic language
among urban middle school students. Journal of Research on Educational Effectiveness, 2,
325–344.
Townsend, D., & Kiernan, D. (2015). Selecting academic vocabulary words worth learning.
The Reading Teacher, 69, 113–118.
Wilkinson, L. C., & Silliman, E. (2000). Classroom language and literacy learning. In M. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research (Vol. III,
pp. 337–360). Mahwah, NJ: Lawrence Erlbaum.
Yore, L. D., Hand, B., Goldman, S. R., Hildebrand, G. M., Osborne, J. F., Treagust, D. F., &
Wallace, C. S. (2004). New directions in language and science education research. Reading
Research Quarterly, 39, 347–352.
Yore, L. D., & Treagust, D. F. (2006). Current realities and future possibilities: Language and
science literacy—empowering research and informing instruction. International Journal of
Science Education, 28, 291–314.
PART III
Focus Points
• With the advent of college- and career-ready standards, the language demands
on students and on English learners (ELs) in particular have increased, with
greater focus on explanation discourse practices in mathematics instruction
and assessment.
• Learning progressions trace the development of students’ thinking in relation
to core ideas and principles in a domain from rudimentary to sophisticated
levels. They support teachers’ ability to engage in “interpretive listening,” that
is, to gain insight into student understanding.
• Since the ability to explain one’s mathematical thinking is an important skill
in mathematics standards and explanation more generally is an important lan-
guage development skill, we explored the joint application of two learning
progressions.
• Applying companion progressions of proportional reasoning and explanation,
we found that few students in our sample were able to convey high levels of
mathematical knowledge without corresponding high levels of explanation
abilities.
• Formative assessment is part of classroom practice and focuses on learning as
it is taking place. Using companion progressions during formative assessment,
teachers may be better able to target instructional next steps that are high-
lighted by different configurations of student performance.
• Teachers will need ongoing support to understand progressions and how they
relate to standards.
144 Caroline Wylie et al.
Chapter Purpose
This chapter explores the joint application of the Proportional Reasoning Learning
Progression developed by researchers at Educational Testing Service (ETS) and the
language progressions created via the Dynamic Language Learning Progressions pro-
ject at the University of California–Los Angeles (UCLA). Specifically, it focuses
on attempts to apply both mathematics content and language learning progres-
sions to student mathematics tasks to describe how companion progressions can
be used for the purpose of formative assessment and instructional planning and to
inform how teachers engage students in discussions. Mathematics learning pro-
gressions can support teacher understanding of what strategies students are using,
provide teachers with a broader understanding of the range of strategies that stu-
dents might use, and support their instructional planning around how to deepen
and develop students’ understanding. Language learning progressions can provide
insight into the sophistication of student explanations, which may help teach-
ers understand whether a student’s mathematical explanation is lacking due to
a relatively naïve understanding of the mathematics or the challenge of putting
his or her own thinking into words. While the study we describe includes analyses of
mathematical explanations predominantly written by English-speaking students,
our work illustrates the potential benefits of applying companion progressions for
all students. Specifically, companion progressions hold promise in providing more
targeted feedback and instruction to support improved content and language
learning for EL students.
Common Core State Standards (CCSS; National Governors Association Center
for Best Practices & Council of Chief State School Officers [CCSSO], 2010) have
established clear expectations for mathematical achievement, as well as expectations
for what students need to do with language as they engage in mathematics learn-
ing. Students must acquire specialized mathematics vocabulary and participate in
extended discourse in mathematics to learn and also to communicate their learning
(Moschkovich, 2012). While these language demands are inherent in the standards
for all students, they take on a particular salience for EL students, who are learning
content and language simultaneously (Bailey & Heritage, 2014).
In the context of CCSS, assessment aligned with the mathematics standards
relies to a great degree on the oral and written use of language (Bailey, Blackstock-
Bernstein, & Heritage, 2015). Standards-based assessments are now (1) multipart
rather than multiple-choice; (2) they require students not only to work through word
problems but also to create their own text when providing an answer; and, more specifically,
(3) they require students to display their mathematical understanding and reasoning in written
explanations of their answers (for example, see Smarter Balanced practice test
items, practice.smarterbalanced.org) (Bailey, 2017). In the same way, classroom formative
assessment relies on what students say or write in their explanations to
provide evidence to teachers of how student learning is developing. By employing
the intersection between mathematics and language progressions, we hypothesize
that teachers will be better able to engage in formative assessment, with corre-
sponding benefits for the instruction of all students but with particular relevance to
meeting the needs of EL students. Our goal in the descriptions of learning progres-
sions and their potential to support teaching, learning, and formative assessment is
Formative Assessment: Math and Language 145
to illustrate how teachers could use the companion progressions approach in their
daily classroom practice and to suggest some preliminary thoughts around what
would be necessary to make such a practice accessible and usable.
We begin the chapter with a discussion of formative assessment and the place of
learning progressions in this assessment approach. We then review prior work on
learning progressions in mathematics and language learning in order to motivate
our work on two progressions—one on proportional reasoning in mathematics
and the other on explanation in language learning. We then discuss the use of the
proportional reasoning progression applied to a specific item and sample of student
responses followed by an exploratory analysis connecting the two progressions and
describe the possible added value of using the two together. We present an example
of a mathematical explanation from an EL student in response to a different prompt
to illustrate how an analysis of both the mathematics and the language usage tied
to companion progressions can be integrated to inform next instructional steps.
Finally, we describe potential formative assessment practices that stem from this
early exploration.
Formative Assessment
Formative assessment focuses on eliciting evidence of understanding for students
and teachers while student learning is still in progress. It is assessment that is part
of everyday classroom practice; its purpose is to assist teachers and students to
advance learning. In the 1980s (Crooks, 1988; Natriello, 1987; Sadler, 1989), form-
ative assessment was seen as a way to connect the two assessment activities of
making judgments about student learning and providing feedback to students to
move learning forward. A major landmark in establishing formative assessment as
an explicit domain of practice was Paul Black and Dylan Wiliam’s research synthe-
sis (Black & Wiliam, 1998), which Shepard (2009) characterized as encompassing
“diverse bodies of research, including studies addressing: teachers’ assessment prac-
tices, students’ self-perception and achievement motivation, classroom discourse
practices, quality of assessment tasks and teacher questioning, and the quality of
feedback” (p. 32).
While there are a variety of perspectives on the definition and purpose of learn-
ing progressions, there is a consensus view that, in essence, learning progressions
trace development of students’ thinking in relation to core ideas and principles
in a domain, shifting from rudimentary to increasingly sophisticated (Corcoran,
Mosher, & Rogat, 2009). They reflect the transitions or incremental changes in
student thinking over time. Unlike grade-level standards, progressions are not
prescriptive, but rather they convey a sequence of “expected tendencies” in stu-
dent learning along a continuum of developing expertise (Confrey & Maloney,
2010). Furthermore, standards do not illuminate how partial or naïve understand-
ings might present themselves. Partial or naïve understandings are a prime interest
in formative assessment; teachers need to understand what these are in order to
move students to more complete understandings (Heritage, 2008; CCSSO, 2008;
Sztajn, Confrey, Wilson, & Edgington, 2012). It is important to note, however, that
a learning progression is not intended to be a prescription for how all students
or less sweet but cannot describe the solution using numbers. At a more advanced
level, students recognize the importance of the numbers in the ratio. However,
initially, as understanding develops, a student may apply additive approaches rather
than multiplicative strategies, leading to a misconception known as the constant additive
differences strategy, identified by various researchers (e.g., Baxter & Junker,
2001; Noelting, 1980). This incorrect strategy can lead students to argue that the
ratios 2:3 and 3:4 are equivalent since the difference between the two quantities in
each ratio is the same. Students can use more or less sophisticated approaches to
solve problems, from creating models or representations of the situation (using
objects or diagrams), through build-up (scale-up) strategies, to calculating unit rates.
The build-up strategy is mathematically correct, unlike the additive misconception.
Using a build-up strategy a student is able to reason about a problem such as the
following: If four students can sit around a table, how many tables are needed for
12 students? A student can “build up” a solution, either with a diagram or through
writing an explanation, noting that if four students can sit at one table, then eight
students can sit at two tables, and 12 students can sit at three tables. As students have
more experience with multiple strategies, depending on the problem type and type
of unit rate (integer or not), they come to understand the multiplicative structure of
ratios and then ultimately learn to use more sophisticated strategies such as calculat-
ing unit rates or writing an equation.
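The strategies described above can be sketched in code. The following is an illustrative sketch (not from the chapter), using the tables problem from the text and the 2:3 versus 3:4 ratios mentioned earlier:

```python
from fractions import Fraction

def build_up(per_table: int, students: int) -> int:
    """Build-up (scale-up) strategy: add one table's worth of seats at a time."""
    tables, seated = 0, 0
    while seated < students:
        tables += 1
        seated += per_table  # 4, 8, 12, ...
    return tables

# Tables problem from the text: four students per table, 12 students.
print(build_up(4, 12))  # 3

# The additive misconception: 2:3 and 3:4 have the same difference (1),
# but the ratios themselves are not equivalent.
print(3 - 2 == 4 - 3)                    # True  (equal differences)
print(Fraction(2, 3) == Fraction(3, 4))  # False (unequal ratios)
```

The build-up loop mirrors the diagram-or-prose reasoning described in the text; the `Fraction` comparison makes the multiplicative structure of equivalence explicit.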
In this progression, as in others, there are two types of transitions: concep-
tual transitions and fluency transitions. A conceptual transition or a conceptual
jump involves a distinct shift in understanding, while a fluency transition consists
of establishing a deeper understanding of the ratio concept or the proportional reasoning
situation, which manifests mostly through more flexible procedural skills
and representational fluency. While the transitions from level 1 to level 2 and from
level 2 to level 3 involve a conceptual jump, the transitions from level 3 to level 4
and then to level 5 are more focused on increasingly deeper understanding of the
concepts acquired in level 3 and on gaining broader procedural skills and represen-
tational fluency. The transition from level 1 to level 2 requires a shift in perceiving
the situation from a holistic intuitive way to perceiving the situation as one that can
be modeled mathematically using ratios. The second transition also requires a leap
in understanding from an additive perception to a multiplicative view of the rela-
tionship between the quantities. In level 3, students know how to work with a pair
of ratios (i.e., the full proportional relationship); however, they still cannot apply it
either to complex situations that include more than two ratios or to situations in
which the unit rate is not an integer. Hence the two advanced levels are achieved by acquiring
procedural and representational skills leading to deeper conceptual understanding.
For the analysis presented in this chapter, we focused on the first four levels of this
learning progression.
As part of the development of items mapped to the Proportional Reasoning Learning
Progression, we applied evidence-centered design principles (e.g.,
Mislevy, Almond, & Lukas, 2003) to identify the claims that we wanted to meas-
ure with respect to the learning progression, the nature of the evidence needed to
support those claims, and the types of items that would elicit evidence of student
understanding. We collected content validity evidence for those items from outside
experts. We drew on previous work by Graf and colleagues (van Rijn, Graf, &
Deane, 2014; Graf & van Rijn, 2016), who have described one component of the
larger validation process for progressions as the “recovery of the learning progres-
sions,” which entails examining “whether or not the data support the sequence as
well as the specification of the levels” (p. 9), along with an exploration of an “alter-
native account for a pattern of results” (p. 10).
We conducted a large online field test of 98 items mapped to the Proportional
Reasoning Learning Progression with middle school students (Wylie, Bauer, &
Arieli-Attali, 2015). We collected responses for each item from at least 600 students
from across 59 volunteer schools throughout the U.S. The items included multiple-
choice, numeric fill-in-the-blank, and short constructed-response items. The items
were designed to target different levels of the progression. As such, we treated them
as independent when we used them to estimate students’ demonstrated level in the
learning progression with item response theory.
Using a three-parameter-logistic item response model, items were placed on a
theta scale grouped by predicted learning progression level. Theta is an estimate of
(latent) ability that in this case ranges from about −4 to 4 (mean = 0, std = 1). We
developed a task progression map (van Rijn et al., 2014) that places each item at the
ability (theta) level where there is a probability of 65% for students at this level to
answer the item correctly. For example, an item that is placed at theta = 1.0 means
that students at ability level of one standard deviation above the mean have 65%
probability of answering this item correctly. Theta can be considered a proxy for
level of the progression—if items from each predicted progression level are shown
to best reflect a specific range of ability level, and these ranges, although somewhat
overlapping, are ordered according to the order of the hypothesized learning pro-
gression, then we have some evidence that we can use items to measure the levels.
Appendix 8B shows the item maps for the Proportional Reasoning Learning Progres-
sion, with ordering of items by level as we predicted.
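The placement rule just described can be sketched by inverting the 3PL model to find the theta at which an item has a 65% probability of a correct response. This is a minimal sketch with hypothetical item parameters, since the chapter does not report them:

```python
import math

def p_correct(theta: float, a: float, b: float, c: float) -> float:
    """Three-parameter-logistic (3PL) probability of a correct response."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

def theta_at(p: float, a: float, b: float, c: float) -> float:
    """Invert the 3PL: the ability level at which P(correct) = p (requires p > c)."""
    return b - math.log((1 - c) / (p - c) - 1) / a

# Hypothetical parameters: a = discrimination, b = difficulty, c = lower asymptote.
theta = theta_at(0.65, a=1.2, b=0.8, c=0.2)
print(round(p_correct(theta, 1.2, 0.8, 0.2), 2))  # 0.65
```

Because the 3PL curve is monotonic in theta, each item has exactly one such placement point on the theta scale, which is what allows items to be ordered on the task progression map.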
Feature Description
In the following section, we present a mathematical problem that calls for stu-
dents to explain their answer. We first apply the proportional reasoning learning
progression to a sample of student explanations. We then examine those explana-
tions through the lens of the language progression. In the original sample we did
not collect information about students' English proficiency. This stage of our
collaborative work is therefore a secondary analysis of the mathematics explanations,
designed as an initial exploration of the relationship between the mathematics and language
progressions.
reasoning learning progression. Since there are only two ratios involved in this
question, the prompt does not provide an opportunity for a student to demonstrate
evidence of understanding at level 5. The prompt asks students to explain their
answer, rather than just calculate the number of pieces of candy that Marisa can
buy, because the explanation provides greater insight into students' mathematical
understanding than the numerical answer alone. Consequently, this prompt places
a demand on students for the kinds of language clearly articulated in the CCSS
for mathematics. A three-point rubric was used to score student responses to this
item, as shown in Figure 8.2. Responses with nothing written at all
were classified as “omit.” Double-scored items had a rater agreement level of 95%.
While some students could have elected to draw a diagram to help them under-
stand the problem, due to the online nature of the data collection, we did not see
responses from students in the form of diagrams.
This item proved to be quite challenging, with 56% of students responding
incorrectly. A score of 1 or 2 indicates a correct response, with a score of 2 reserved
for an appropriate explanation or indication of a correct approach, not just a
numeric answer.
We selected between 6 and 12 responses at each score level, 30 responses in
total, for further exploration of alignment with the Proportional Reasoning Learning
Progression as well as for "best fit" on the DLLPs (described in a later section).
Although the original student responses were scored using
the 0–2 scoring rubric presented in Figure 8.2, two of the authors independently
reviewed the 30 selected student explanations to align them directly with the levels
Score 2: Student correctly states 25 pieces of candy and provides one of the following:
• sets up the ratio to solve for x as 12/20 = 15/x
• calculates pieces of candy per quarter as 1.667 (unit rate) and multiplies by 15 to get the correct answer
• sets up a grouping strategy, e.g., grouping 5 pieces of candy with 3 quarters, and continues reasoning from there
• other correct mathematical approach.

Score 1: Student states 25, but provides no explanation of how he or she arrived at the answer. OR Student states 25 and provides an incomplete but not incorrect explanation.

Score 0: Anything else, including when the student provides an incorrect explanation of the process.
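As a quick sketch (not part of the chapter), the three correct approaches listed in the rubric all yield the same answer for the candy item:

```python
from fractions import Fraction

# Brenda buys 20 pieces of candy with 12 quarters; Marisa has 15 quarters.

# Proportion approach: 12/20 = 15/x  ->  12x = 20 * 15  ->  x = 300/12
print(Fraction(20 * 15, 12))   # 25

# Unit-rate approach: (20/12) pieces per quarter, times 15 quarters.
print(Fraction(20, 12) * 15)   # 25

# Grouping approach: 5 pieces per 3 quarters; 15 quarters makes 5 groups.
print(5 * (15 // 3))           # 25
```

Exact rational arithmetic (`Fraction`) avoids the rounding in the rubric's 1.667 unit rate; 20/12 quarters is exactly 5/3 pieces per quarter.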
mathematical language of all six students. This response would provide evidence
of level 4 reasoning.
FIGURE 8.3 Possible Configurations for Student Performance on the Extremes of the Companion Progressions (a 2 × 2 matrix crossing High vs. Low placement on the MATH progression with High vs. Low placement on the DLLP, yielding the quadrants High-High, High-Low, Low-High, and Low-Low)
Marisa can buy 25 peices of the same candy with 15 quaters. I set up a propor-
tion to solve this. Brenda’s 20 peices per 12 quaters = Marisa’s x peices per 15
quaters. I cross multiplied to get 12x = 300. I divided both sides of the equal sign
by 12 to isolate the variable x. 12x divided by 12 = x. 300 divided by 12 = 25. x
(peices of candy Marisa can buy) = 25 peices.
[Student F—original spelling]
Student F’s explanation has an opening with a clear statement of the correct
answer. There is a range of verb forms and precise mathematics topic vocabulary
(italicized) and a cohesive tie with the deictic pronoun “this” (bolded). There
are examples of complex sentence structures (underlined) that include a very
sophisticated nominalization in parentheses (“peices of candy Marisa can buy”).
This is a nominalized phrase that functions as a noun to define the meaning of
x in this explanation. The explanation was amongst the most elaborated we analyzed
in this corpus, but note there is still room for improvement in the student's
use of language: "this" most likely refers, rather vaguely, to the equation in
the initial question rather than to any referent in the immediately preceding
sentence, and thus this cohesive tie could be ambiguous and misleading for a reader.
There was just one (3%) explanation showing the combination of Low DLLP–
High MATH, which was anticipated given that the prompt asked students to explain
their reasoning. The following example from Student G (Table 8.2) was also placed
at level 3 on the mathematics progression. This response is similar to the previous
one but shows a reliance on mathematical notation rather than prose, which may
ultimately prove unfortunate because explanatory skills are now necessary for dis-
playing mathematical knowledge on the new mathematics assessments aligned with
new standards.
Brenda = 20/12 Equation: 20/12 = x/15 Marisa = x/15 20x = 20(15) 20x/20 =
300/20 x = 25 Marisa can buy 25 pieces of candy.
[Student G]
However, our findings suggest that language alone is, of course, not sufficient for a
fully adequate explanation of proportional reasoning: we placed seven (23%) explanations
as Low MATH despite the "best fit" placing them as High DLLP. The
following example was placed at level 2 on the mathematics progression (having
provided evidence of the additive misconception), but at higher phases of the pro-
gressions for most language features:
marisa can buy 23 pieces of candy. i got my answer simply by comparing ratios.
brenda’s ratio of candy to quarters is 20:12, while maris’a ratio is __:15. after seeing
it brenda’s quarter ratio is 3 nukbers smaller than marisa’s, i added 20 3 to get to
get my final answer
[Student H—original spelling and punctuation]
Linguistically and discursively the example above is very similar to the first highly
placed example (Student F). The student starts with an opening statement, includes
precise math topic vocabulary and a range of verb forms and uses complex sentence
structures. Like Student F, Student H also uses a pronoun ("it") as a cohesive
device that is possibly tied somewhat ambiguously to the referent "brenda's
quarter."
A further 11 (37%) student explanations were placed as low on both progres-
sions. For example:
Best Practices
To do this skillfully and productively, one essential ingredient that the teacher
needs is to have in mind an underlying scheme of progression in the topic; such
a scheme will guide [. . .] the orientation which the teacher may provide by
further suggestions, summaries, questions and other activities.
(Black et al., 2011, p. 74)
In the absence of clear progressions, despite teachers' best intentions, evidence
gathering is unlikely to be systematic; instead, it remains disconnected from a picture
of the progressive development of understanding and language skills. Progressions
enable the best-case scenario: planned approaches to gathering evidence that can
be used to consistently move learning forward.
Davis (1997) referred to the process that teachers must engage in to support
student discussions as "interpretive listening," and teachers' own placement of student
explanations on the learning progressions can lead to informed attention to
student responses. The learning progressions allow a teacher to anticipate common
student responses and monitor student discussions either as a whole class or in
small groups (Smith & Stein, 2011). Once teachers have qualitative insights about
the status of student learning (from students’ explanations while they are engaged
in a mathematical task, for example), they are likely better able to decide on next
instructional steps to advance learning. However, progressions by themselves do
not suggest what specific pedagogical action a teacher might take in response to
students' location on the progression, either in terms of language or mathematics
or both. Nonetheless, by understanding students’ current learning within a broader
trajectory of learning, teachers are better informed about potential next steps in
learning than they would be in the absence of progressions (Heritage, Kim, Vend-
linski, & Herman, 2009). Consequently, there will likely be a closer match between
what the students need next in order to progress and the pedagogical action teach-
ers take. Furthermore, with an understanding of the intersection of language and
mathematics progressions, teachers are better positioned to focus on the reciprocity
of language and mathematical understanding; that is, as teachers gain deeper insights
into students’ thinking through their explanations, and by helping students improve
their mathematical explanations, they are able to advance language learning and
mathematics understanding simultaneously. Such a situation may well obviate the
“Cinderella” status (i.e., largely ignored by educators) of language (e.g., Wong Fill-
more & Snow, 2002) and place emphasis on the role of explanation in developing
thinking and communicating thinking for the purposes of formative assessment.
sequencing of information and a relative clause (“cubes that weren’t added . . .”) in
line three, allowing the student to refer to specific cubes.
Drawing on the insights gained from both the multiplication/division learning
progression and the language features progressions, a teacher may want to assist the
student in articulating his mathematical thinking more fully to clarify his approach
to grouping the cubes. Depending on the level of student understanding, the teacher may want
to provide some additional supports to deepen the student’s mathematical understand-
ing of group size, to provide practice opportunities using groups of different sizes, or to
help the student more clearly articulate his thinking, especially making salient to him
where ambiguity in cohesive ties makes replicating his procedural explanation difficult
for a naïve listener. More varied temporal connectives could also be modeled for him
in an attempt to further develop coherent explanations.
Where this time I just felt more at ease. . . . It wasn’t necessarily looking for
one or the other [math/language]. But simultaneously . . . I mean how was
one informing the other? How was language being used to help them articu-
late their reasoning?
Acknowledgments
This research was supported in part by the Institute of Education Sciences, U.S.
Department of Education, through Grant Reference R305A100518 to ETS and by
the ASSETS Enhanced Assessment Grant from the U.S. Department of Education/
WIDA Consortium at the Wisconsin Center for Educational Research to UCLA.
The contents do not necessarily represent the policy of the U.S. Department of
Education, and you should not assume endorsement by the federal government.
Alison Bailey also acknowledges serving as a consultant and advisory board mem-
ber for WIDA projects. Earlier versions of this chapter were presented at CCSSO
NCSA, 2015 in San Diego and AERA, 2016, in Washington, DC.
Notes
1 Recall the prompt did not provide an opportunity for students to provide responses at
level 5 on the Proportional Reasoning Learning Progression.
2 This work is exploratory in nature, and the dichotomy of low-high was an initial place
to begin the application of dual progressions. Future, larger-scale studies could attempt an
expanded number of points along the two progressions. However, we question the utility
of a greater number of points because this will increase complexity for teachers making
instructional decisions, and such fine-grained information from combined placement on
the two progressions may go beyond the repertoire of instructional responses that teachers
have available.
References
Bailey, A. L. (2017). Progressions of a new language: Characterizing explanation develop-
ment for assessment with young language learners. Annual Review of Applied Linguistics,
37, 241–263.
Bailey, A. L., Blackstock-Bernstein, A., & Heritage, M. H. (2015). At the intersection of
mathematics and language: Examining mathematical explanations of English proficient
and English language learner students. Journal of Mathematical Behavior, 40, 6–28.
Retrieved from http://dx.doi.org/10.1016/j.jmathb.2015.03.007
Bailey, A. L., Blackstock-Bernstein, A., Ryan, E., & Pitsoulakis, D. (2016). Data Mining with
natural language processing and corpus linguistics: Unlocking access to school-children’s
language in diverse contexts to improve instructional and assessment practices. In S. El
Atia, O. Zaiane, & D. Ipperciel (Eds.), Data mining and learning analytics in educational
research (pp. 255–275). Malden, MA: Wiley-Blackwell.
Bailey, A. L., Butler, F. A., Stevens, R., & Lord, C. (2007). Further specifying the language
demands of school. In A. L. Bailey (Ed.), The language demands of school: Putting academic
English to the test (pp. 103–156). New Haven, CT: Yale University Press.
Bailey, A. L., & Heritage, M. (2008). Formative assessment for literacy, Grades K-6: Building read-
ing and academic language skills across the curriculum. Thousand Oaks, CA: Corwin/Sage
Press.
Bailey, A. L., & Heritage, M. (2014). The role of language learning progressions in improved
instruction and assessment of English language learners. TESOL Quarterly, 48(3),
480–506.
Bailey, A. L., & Heritage, M. (2017). Imperatives for teacher education: Findings from studies
of effective teaching for English language learners. In M. Peters, B. Cowie, & I. Menter
(Eds.), A Companion to research in teacher education. Berlin: Springer.
Ball, D. L., Thames, M. H., & Phelps, G. (2008). Content knowledge for teaching: What makes
it special? Journal of Teacher Education, 59(5), 389–407.
Baxter, G. P., & Junker, B. W. (2001, April). Designing cognitive-developmental assessments: A case
study in proportional reasoning. Paper presented at the annual meeting of the National
Council on Measurement in Education, Seattle, WA.
Black, P., & Jones, J. (2006). Formative assessment and the learning and teaching of MFL:
Sharing the language learning road map with the learners. Language Learning Journal,
34(1), 4–9.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education:
Principles Policy and Practice, 5, 7–73.
Black, P., Wilson, M., & Yao, S. Y. (2011). Roadmaps for learning: A guide to the navigation
of learning progressions. Measurement: Interdisciplinary Research and Perspectives, 9(2–3),
72–123.
Cayton-Hodges, G. A. (2017). The development of learning progressions for elementary students.
Manuscript submitted for publication.
Confrey, J., & Maloney, A. (2010, June). The construction, refinement, and early validation of
the equipartitioning learning trajectory. In Proceedings of the 9th International Conference of
the Learning Sciences—Volume 1 (pp. 968–975). Chicago: International Society of the Learning
Sciences.
Corcoran, T. B., Mosher, F. A., & Rogat, A. (2009). Learning progressions in science: An
evidence-based approach to reform. CPRE Research Reports. Retrieved from http://repository.
upenn.edu/cpre_researchreports/53
Council of Chief State School Officers. (2008). Attributes of effective formative assessment.
A work product coordinated by Sarah McManus, NC Department of Public Instruction,
for the Formative Assessment for Students and Teachers (FAST) collaborative. Council of
Chief State School Officers, Washington, DC.
Council of Chief State School Officers. (2012). Framework for English language proficiency devel-
opment standards corresponding to the Common Core State Standards and the Next Generation
Science Standards. Washington, DC: CCSSO.
Crooks, T. J. (1988). The impact of classroom evaluation practices on students. Review of Educational Research, 58, 438–448.
Davis, B. (1997). Listening for differences: An evolving conception of mathematics teaching.
Journal for Research in Mathematics Education, 28(3), 355–376.
Dove, M., & Honigsfeld, A. (2010). ESL coteaching and collaboration: Opportunities
to develop teacher leadership and enhance student learning. TESOL Journal, 1(1),
3–22.
Every Student Succeeds Act. (2015). Public Law No: 114–195 § 114 Stat. 1177 (2015–2016).
Fennema, E., Carpenter, T. P., Franke, M. L., Levi, L., Jacobs, V. R., & Empson, S. B. (1996).
A longitudinal study of learning to use children’s thinking in mathematics instruction.
Journal for Research in Mathematics Education, 27, 403–434.
Graf, E. A., & van Rijn, P. W. (2016). Learning progressions as a guide for design: Recom-
mendations based on observations from a mathematics assessment. In S. Lane, M. R.
Raymond, & T. M. Haladyna (Eds.), Handbook of test development (2nd ed., pp. 165–189).
New York: Routledge.
Hakuta, K., Butler, Y. G., & Witt, D. (2000). How long does it take English learners to attain
proficiency? Santa Barbara, CA: University of California Linguistic Minority Research
Institute.
Herbel-Eisenmann, B., Johnson, K. R., Otten, S., Cirillo, M., & Steele, M. D. (2015). Map-
ping talk about the mathematics register in a secondary mathematics teacher study group.
The Journal of Mathematical Behavior, 40, 29–42.
Heritage, M. (2008). Learning progressions: Supporting instruction and formative assessment. Wash-
ington, DC: Council of Chief State School Officers. Retrieved from www.ccsso.org/
content/PDFs/FAST Learning Progressions.org
Heritage, M. (2012). Formative assessment: A process of inquiry and action. Cambridge, MA:
Harvard Education Press.
Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3),
24–31.
Hoff, E. (2013). Interpreting the early language trajectories of children from low-SES and
language minority homes: Implications for closing achievement gaps. Developmental Psy-
chology, 49(1), 4.
Mislevy, R. J., Almond, R. G., & Lukas, J. F. (2003). A brief introduction to evidence-centered
design. ETS Research Report Series, 2003(1), i–29.
Moschkovich, J. (2012). Mathematics, the Common Core, and language: Recommendations for
mathematics instruction for ELs aligned with the Common Core. Commissioned papers on
language and literacy issues in the Common Core State Standards and Next Gen-
eration Science Standards. Retrieved from http://ell.stanford.edu/sites/default/files/
pdf/academic-papers/02-JMoschkovich%20Math%20FINAL_bound%20with%20
appendix.pdf
National Governors Association Center for Best Practices & Council of Chief State School
Officers. (2010). Common Core State Standards for Mathematics. Washington, DC: Authors.
Natriello, G. (1987). The impact of evaluation processes on students. Educational Psychologist,
22, 155–175.
Nippold, M. A. (2009). School-age children talk about chess: Does knowledge drive syntactic
complexity? Journal of Speech, Language, and Hearing Research, 52(4), 856–871.
Noelting, G. (1980). The development of proportional reasoning and the ratio concept Part
I—Differentiation of stages. Educational Studies in Mathematics, 11(2), 217–253.
Sadler, D. (1989). Formative assessment and the design of instructional systems. Instructional
Science, 18, 119–144.
Shepard, L. A. (2009). Commentary: Evaluating the validity of formative and interim assess-
ment. Educational Measurement: Issues and Practice, 28(3), 32–37.
Slama, R. B. (2012). A longitudinal analysis of academic English proficiency outcomes for
adolescent English language learners in the United States. Journal of Educational Psychol-
ogy, 104(2), 265.
Smith, C., Wiser, M., Anderson, C. W., Krajcik, J., & Coppola, B. (2004). Implications of research
on children’s learning for assessment: Matter and atomic molecular theory. Paper commissioned
by the Committee on Test Design for K-12 Science Achievement. Center for Education,
National Research Council.
Smith, M. S., & Stein, M. K. (2011). Five practices for orchestrating productive mathematics discus-
sions. Reston, VA: National Council of Teachers of Mathematics.
Song, Y., Deane, P., Graf, E. A., & van Rijn, P. (2013). Using argumentation learning progres-
sions to support teaching and assessments of English language arts. R&D Connections, 22,
1–14.
Sztajn, P., Confrey, J., Wilson, P. H., & Edgington, C. (2012). Learning trajectory based
instruction: Toward a theory of teaching. Educational Researcher, 41, 147–156.
Turkan, S., De Oliveira, L. C., Lee, O., & Phelps, G. (2014). Proposing a knowledge base for
teaching academic content to English language learners: Disciplinary linguistic knowl-
edge. Teachers College Record, 116(3), 1–30.
166 Caroline Wylie et al.
van Rijn, P. W., Graf, E. A., & Deane, P. (2014). Empirical recovery of argumentation learning
progressions in scenario-based assessments of English language arts. Psicología Educativa,
20(2), 109–115.
Vygotsky, L. (1962). Thought and language (Trans. E. Hanfmann & G. Vakar). Cambridge, MA:
MIT Press.
Weaver, R., & Junker, B. W. (2004). Model specification for cognitive assessment of proportional
reasoning (Department of Statistics Technical Rep. No. 777). Pittsburgh.
Wong Fillmore, L., & Snow, C. (2002). What teachers need to know about language. In
C. A. Adger, C. E. Snow, & D. Christian (Eds.), What teachers need to know about language
(pp. 7–54). Washington, DC: Center for Applied Linguistics.
Wylie, E. C., Arieli-Attali, M., & Bauer, M. I. (2014, April). Channeling teacher noticing with
learning progression-based formative assessment. Paper presented at the annual meeting of the
American Educational Research Association, Philadelphia, PA.
Wylie, E. C., Bauer, M. I., & Arieli-Attali, M. (2015, April). Validating and using learning progres-
sions to support mathematics formative assessment. Paper presented at the annual meeting of
the National Council on Measurement in Education, Chicago, IL.
Zapata-Rivera, D., & Bauer, M. (2012). Exploring the role of games in educational assess-
ment. In J. Clarke-Midura, M. Mayrath, & D. Robinson (Eds.), Technology-based assess-
ments for twenty-first-century skills: Theoretical and practical implications from modern research (pp.
147–169). Charlotte, NC: Information Age.
APPENDIX
[Figure residue: this appendix presented level descriptions and a box plot of theta estimates (y-axis: Theta) for Level 2 through Level 5, marking minimum, median, and outlier values; the full figure is not recoverable from this copy.]
APPENDIX 8C Phases of the Dynamic Language Learning Progressions for Each Language
Feature

Not evident: Students’ competencies with the language feature are not yet detectable. They may use only language from the prompt or provide a non-English response.

Emergent: Students’ inclusion of the language feature occurs infrequently or intermittently. Their use of the feature is incomplete and may be inaccurate. Students do not include any repertoire of types for the feature (i.e., they rely on just one form of a feature, such as repetitive use of and for additive relations between ideas).

Developing: Students’ use of the feature occurs more often. Their use of the feature is more complete but may still be inaccurate. Students have a small repertoire of types for the feature (i.e., they have two or more forms for the same feature, such as use of and, as well as for additive relations, but also express other types of relations between ideas, such as causal relations using the forms because, so, etc.).

Controlled: Students’ use of the feature occurs where expected or in obligatory contexts. Their use of the feature is complete and most often accurate. Students have a broad repertoire of types (i.e., many forms for the same feature, such as use of and, as well as, in addition to, but also express many other types of relations between ideas, such as causal, adversative, etc., and their relevant forms).
9
FORMATIVE ASSESSMENT
Science and Language With English Learners
Focus Points
• The Next Generation Science Standards (NGSS) provide an opportunity for
teachers and teacher educators to reconsider the nature of science instruction.
Central to this reconsideration is to shift instruction away from emphasizing
the “knowing” of facts and ideas and move toward student sense making about
scientific phenomena. This shift requires teachers to use different techniques to
formatively assess students throughout units of instruction.
• Formative assessment, or assessment for learning, can be defined as a “part of
everyday practice by students, teachers, and peers that seeks, reflects upon, and
responds to information from dialog, demonstration, and observation in ways
that enhance ongoing learning” (Klenowski, 2009, p. 264). In this chapter, we
examine three main dimensions of formative assessment centered on questions
teachers and students can ask themselves: (1) Where are we going? (2) What
does the student understand now? (3) How can we get to the learning target?
• Scientific phenomena can be used to ground opportunities for science instruc-
tion and formative assessment practices. Scientific phenomena are “observable
events that occur in the universe and that we can use our science knowledge
to explain or predict.” Using an anchoring phenomenon can provide coherence in a unit and a reason for instruction. It also allows for deeper engagement with the science, which can give teachers richer formative assessment opportunities.
Chapter Purpose
This chapter examines the critical role of formative assessment in supporting all stu-
dents in the three-dimensional learning called for by the Next Generation Science
Standards (NGSS) (Achieve, 2013). The Framework for K-12 Science Education
170 Amelia Wenk Gotwals and Dawnmarie Ezzo
(NRC, 2012) and the associated NGSS move away from emphasizing just the
“knowing” of science ideas and toward a “three-dimensional” conception of learn-
ing science that integrates disciplinary core ideas, science practices, and crosscutting
concepts. Disciplinary core ideas are the big ideas and organizing concepts in science needed to explain phenomena (i.e., observable events that occur in the world
or universe that we can predict or explain using understandings about science;
Moulding, Bybee, & Paulson, 2015). Science and engineering practices are the
types of activities that scientists engage in as they investigate phenomena (e.g., asking questions, developing and using models). Crosscutting concepts provide a lens that gives students an organizing framework for how knowledge is interrelated and applicable across the different science disciplines (e.g., patterns, cause and effect).
The types of instruction that will be necessary to teach three-dimensional
science will look different than traditional science instruction, and the language
demands in these instructional environments may also be quite different. Students
will not simply listen to lectures, take notes, and do labs that only require them to
“follow the steps.” Rather, students will be investigating the world around them by
asking questions, developing and discussing models, and explaining how and why a
phenomenon happened. All of these activities involve language—including reading,
writing, listening, and speaking—and teachers will need to use formative assess-
ment to monitor and support students’ learning along the way.
Formative assessment, or assessment for learning, can be defined as a “part of eve-
ryday practice by students, teachers, and peers that seeks, reflects upon, and responds
to information from dialog, demonstration, and observation in ways that enhance
ongoing learning” (Klenowski, 2009, p. 264). Three points about this definition are worth noting: (1) formative assessment is an everyday practice, not a product or thing;
(2) both teachers and students are involved in this process; and (3) the information
gathered during this process must help to inform teaching and learning. In this
chapter, we focus on the role of language in the ways in which teachers can work
to elicit students’ science ideas and how teachers interpret these ideas in order to
inform future instruction.
It is especially important to consider language demands when using forma-
tive assessment practices in science. Students from non-U.S. cultural and linguistic
backgrounds bring a vast array of experiences and knowledge with them to the
classroom. It is important that their ideas are elicited and valued even if the students cannot yet communicate them fluently in English. While the NGSS shift
away from memorization of vocabulary and facts toward deeper meaning making,
sophisticated explanations of scientific phenomena require students to be able to
use scientific language and ways of thinking. For example, in science instruction,
students will need to be introduced to aspects of argumentation and reasoning as
ways of understanding events that are different in science than in other disciplines
or in their everyday life. Similarly, they will have to understand that some words
commonly used in conversation (e.g., force) have different meanings in science
than in everyday conversation.
Engaging in science learning that embraces deeper meaning making and using
scientific language and ways of thinking may place a heavy language burden on
all students, but there are particular language considerations for English learners
(ELs) because they are learning new types of sense making while also learning a new language and new ways of discussing and explaining phenomena (Goldenberg, 2013; Lee, Quinn, & Valdés, 2013; Lee & Buxton, 2013). However, as students move through a unit, they will need support in moving toward discipline-specific language [e.g., what does observation mean in the context of science (vocabulary); what types of questions do scientists ask (syntax); what is a science explanation compared to an everyday explanation (discourse)].
In this chapter, we examine three dimensions of formative assessment, each centered on a question teachers and students can ask themselves:
1. Use of learning targets and goal setting: Where are we (teachers and students) going?
2. Evidence of student understanding: What does the student understand now?
3. Closing the gap/responding to students: How do we (teacher and students) get to the learning target?
Appendix M of the NGSS makes specific connections to language and literacy
standards and lays out the multiple aspects of language and literacy that are specific
to science:
Science Language
The three-dimensional nature of NGSS performance expectations requires mov-
ing beyond learning targets where students must define or recognize factual infor-
mation. Rather, these learning targets necessitate that students engage in science
and engineering practices such as developing and using models or constructing
explanations. These more language-intensive goals for science learning may pose
particular challenges for EL students. Buxton and colleagues found that in highly
supportive science learning environments, EL students performed similarly to non-EL students on assessments of content knowledge and experimental practices.
However, EL students still fell behind when researchers assessed the complexity
of their scientific reasoning, which can require advanced language skills (Buxton,
Salinas, Mahotiere, Lee, & Secada, 2015). This likely points to the double challenge
of learning to reason scientifically while simultaneously learning a new language.
Therefore, it is particularly important for teachers to provide supports along with
learning targets to help EL students.
One way of supporting EL students (and all students) as they navigate new ways
of reasoning about scientific phenomena may be to provide rubrics and criteria that include examples of what reaching a learning goal looks like. For example, if
teachers have an explanation or argumentation-based learning target, they could
use the claims-evidence-reasoning (CER) framework to provide supports for stu-
dents. In this framework, a claim is an answer to an investigable question; evidence
includes scientific data (i.e., measurements and observations) that are both appro-
priate and sufficient in supporting the claim; and reasoning is an explanation of
how the evidence supports the claim, which often includes scientific principles
(McNeill, Lizotte, Krajcik, & Marx, 2006). A teacher in a sheltered English immer-
sion classroom who posted CER supports on the wall for her students found that
students used these supports as they worked in small groups to make sense of data
(Gonzalez-Howard & McNeill, 2016). As the types of discourse in classrooms shift,
learning targets that include scaffolds like the CER supports will be essential for
supporting students.
A key component, then, is providing opportunities for students to share their ideas and
understandings, because without knowing what students know, teachers cannot
provide feedback or make informed instructional decisions and students do not
have the information to regulate their learning (Heritage, 2007). These opportuni-
ties to elicit student understanding may be based on whole-class discussions, small
group work, one-on-one conversations, or written work, such as scientific models
or “tickets out the door.”
Significant insight into student understanding is also gained when teachers
blend instructional and eliciting strategies by working alongside their students in
“joint acts of meaning-making and knowledge construction” (Wolfe & Alexander, 2008, p. 1). As students and teachers consult and work together to develop the
students’ understanding and progress on a particular task or learning goal, teach-
ers can simultaneously scaffold students’ developing understandings as they work
to elicit student thinking and sense making (Ruiz-Primo & Furtak, 2006). Thus,
this dimension of formative assessment, “evidence of student understanding,” is
not neatly separated from the third dimension, which is what teachers do with
the information that they gather. Teachers who work with their students and ask
questions that are intended to both elicit and scaffold student thinking gain richer
insight into student thinking. This can enable them to “interpret student responses
in the moment and determine next pedagogical steps, including feedback, that will
move the students closer to successful performance” (Heritage & Heritage, 2013).
In characterizing these types of assessment opportunities, Ruiz-Primo and Furtak (2006) suggested the use of ESRU cycles, in which the teacher Elicits students’ ideas, the Student responds, the teacher Recognizes the student’s response, and the teacher Uses the information to promote future learning. This type of classroom discourse moves
beyond traditional discussion formats such as “initiate-respond-evaluate” (IRE;
Mehan, 1985; Cazden, 2001), which is especially important for EL students because teachers must accurately gauge students’ levels of understanding in order to support both their disciplinary learning and their English learning.
When teachers provide opportunities for students to demonstrate rich under-
standings (moving beyond just declarative knowledge), they are better able to
understand students’ ideas (Carpenter, Fennema, Peterson, Chiang, & Loef, 1989),
are better able to provide feedback targeted to students’ specific ideas, are more
likely to use student ideas in their planning ( Jones & Krouse, 1988), and are able to
make instructional decisions that promote student learning (Fuchs et al., 1991). In
addition, the ways in which teachers follow up on these initial questions have implications for the types of information that students may be willing to share (Davis,
1997) and for the larger classroom culture (Supovitz & Turner, 2000).
Science Language
Language demands are especially important to consider when eliciting students’
ideas. Learning science is a sociolinguistic endeavor, meaning that students’ under-
standing and sense making are co-created among individuals engaged in social
interaction and discussion (Leach & Scott, 2002; Carlsen, 1991). This sociolin-
guistic perspective of learning suggests that students share and construct meaning
and develop conceptual understanding in a social context of teacher-student and
student-student interactions that are scaffolded, shaped, and guided to a large degree
by teacher prompts and questions (Carlsen, 1991). Likewise, second language acqui-
sition can also be thought of as a sociolinguistic endeavor, in which users of one
language develop the ability to use another through interactions with peers and
teachers (Duff, 2002; Lantolf, 2006; Pica, 2008).
In order to support EL students’ learning, teachers need to scaffold and support
their students’ emerging understandings of the English language while simultane-
ously attending to their students’ emerging science ideas. When teachers are able
to attend to both students’ language development and science understandings, they
are more likely to realize the potential for language-based miscommunications and
the need to ask follow-up questions of students. These follow-up questions can
allow teachers to: (a) ensure accurate understanding of the student’s science ideas,
regardless of potential struggles with the English language; (b) probe for additional insight into students’ science thinking; (c) press students to deepen their explanations and include more sense-making processes regarding the science; and (d)
support students as they learn new academic language (van Zee, Iwasyk, Kurose,
Simpson, & Wild, 2001; Zhang et al., 2010).
Science Language
After teachers elicit students’ ideas, it is critically important for teachers to know
what to do with these ideas. Some researchers use a construct of “noticing” to
describe what teachers do with students’ ideas. Noticing consists of three parts: (1)
attending to and identifying useful student ideas, (2) reasoning about these ideas,
and (3) making instructional decisions on the basis of analyzing these student ideas
(Hiebert, Morris, Berk, & Jansen, 2007). Many teachers struggle with identifying
students’ ideas as useful and may view ideas in a dichotomous fashion as “right” or “wrong” rather than seeing potentially productive stepping stones in students’
ideas (Otero, 2006; Wiser, Smith, & Doubler, 2012). This may be especially true for
teachers working with students with limited proficiency in using the English lan-
guage to explain their understanding of unfamiliar science ideas. In these instances,
teachers may be more likely to mistake a struggle with communicating an idea in English for an inaccurate understanding of the science idea itself, particularly if the
teacher views student ideas dichotomously as either “right” or “wrong.” Thus, it is
important for teachers to support students in explaining their ideas in multiple ways
and formats (e.g., discussion, written explanations, models).
In addition to attending carefully to the ongoing development of their students’ science ideas and understandings, science teachers working with EL students
can use what they have “noticed” about their students’ ideas to support their stu-
dents’ developing language needs as well. In particular, teachers should attend to
supporting their students’ understanding of science discourse as well as supporting
their students’ ability to use common academic language structures in science to
describe the relationships between science ideas (Schleppegrell, 2002; Avenia-Tapper, Haas, & Hollimon, 2016). This may be especially important because developing discipline-specific language (such as the language of science) takes most students considerably longer than becoming proficient with everyday informal language (Fang & Schleppegrell, 2010).
The teacher may need to include additional learning targets not brought up by
students to ensure that all learning objectives are met. In addition, the teacher will
need to introduce both scientific vocabulary (e.g., molecular movement) and syn-
tax and discourse practices (e.g., using evidence to support their claims) at appro-
priate places throughout the unit in order to support students in reaching learning
goals. However, allowing students to take more responsibility and use their own
language (either non-science-specific language or home language) to decide what
they need to know provides increased instructional congruence (Lee & Fradd,
1998) and gives students a clear purpose for their learning.
As the unit progresses, activities that relate back to the phenomenon also provide context for coherent, high-quality formative assessment opportunities.
Units that are based on NGSS and engage students in making sense of phenomena
over time will include opportunities for students to engage in the science practices
and crosscutting concepts contained in the Framework for K-12 Science Educa-
tion (NRC, 2012).2 When students are working on activities that necessitate using
the science practices (and reasoning with the crosscutting concepts), there will be
rich opportunities for them to demonstrate their current levels of understand-
ing. Rather than only having worksheets or engaging in disconnected activities,
students and teachers can gauge their current levels of understanding based on
how close they are to being able to explain the phenomenon. This gives students
more opportunities to engage with self-assessment (Brown & Harris, 2013) and to
regulate their own learning (Allal, 2010). In addition, when students are engaging
with the science practices, the evidence of understanding that teachers can gather
will likely be very rich so that teachers will have better information with which to
make their next instructional decisions.
These types of organizers are helpful for all students. However, they are especially
helpful for EL students because they can connect ideas of vocabulary, syntax, and
discourse together—providing a structure for learning both science and English at
the same time.
the classrooms. Each of these aspects of formative assessment is enhanced by basing units in rich and engaging scientific phenomena. The anchoring phenomenon
for a unit provides the coherence between aspects of formative assessment that will
help students make connections between their learning goals, the learning activi-
ties, and feedback that happens in the classroom.
Too often, formative assessment has been considered from the teacher’s view-
point only (Coffey, Hammer, Levin, & Grant, 2011). However, students are the
main actors in their own learning, and so formative assessment decisions must
include specific attention to students as well as teachers. It is important to remem-
ber that the “teacher’s role in formative assessment is not simply to use feedback
to promote content learning, but also to help students understand the goal being
aimed for, assist them to develop the skills to make judgments about their learning
in relation to the standard, and establish a repertoire of operational strategies to
regulate their own learning” (Heritage, 2010, p. 6). Thus, every formative assessment
decision teachers make—from setting learning targets, to providing opportunities
for students to demonstrate their understanding, to helping students make progress
in their learning—must consider students’ prior experiences, the students’ abilities
with language, and the most effective ways to support student learning.
Notes
1 Sadler (1989) delineated three necessary components of feedback: (1) the standard that is to be achieved, (2) the actual level of performance, and (3) how to go about closing the gap. Building on this, Hattie and Timperley (2007) suggested that “effective feedback must answer three major questions asked by a teacher and/or a student: Where am I going? (What are the goals?), How am I going? (What progress is being made toward the goal?), and Where to next? (What activities need to be undertaken to make better progress?)” (p. 86).
2 The eight practices in the NRC Framework (2012) are: (1) asking questions (for science)
and defining problems (for engineering); (2) developing and using models; (3) planning
and carrying out investigations; (4) analyzing and interpreting data; (5) using mathemat-
ics and computational thinking; (6) constructing explanations (for science) and designing
solutions (for engineering); (7) engaging in argument from evidence; and (8) obtaining,
evaluating, and communicating information.
The seven crosscutting concepts are: (1) patterns; (2) cause and effect; (3) scale, propor-
tion, and quantity; (4) systems and system models; (5) energy and matter; (6) structure and
function; and (7) stability and change.
3 Appendix D Case Studies: www.nextgenscience.org/appendix-d-case-studies
References
Achieve, Inc. (2013). The Next Generation Science Standards. Retrieved April 18, 2013, from
www.nextgenscience.org/
Allal, L. (2010). Assessment and the regulation of learning. In P. Peterson, E. Baker, & B.
McGaw (Eds.), International encyclopedia of education (Vol. 3, pp. 348–352). Oxford: Elsevier.
Avenia-Tapper, B., Haas, A., & Hollimon, S. (2016). Explicitly speaking. Science and Children,
53(8), 42.
Bell, B., & Cowie, B. (2001). The characteristics of formative assessment in science education.
Science Education, 85, 536–553.
Black, P., & Wiliam, D. (1998). Inside the black box. Phi Delta Kappan, 80(2), 139–148.
Black, P. J., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational
Assessment, Evaluation and Accountability, 21(1), 5–31.
Brown, G. T. L., & Harris, L. R. (2013). Student self-assessment. In J. H. McMillan (Ed.), SAGE handbook of research on classroom assessment. Los Angeles, CA: Sage.
Buxton, C. A., Salinas, A., Mahotiere, M., Lee, O., & Secada, W. (2015). Fourth-grade emergent bilingual learners’ scientific reasoning complexity, controlled experiment practice,
and content knowledge when discussing school, home, and play contexts. Teachers College
Record, 117, 1–36.
Carlsen, W. S. (1991). Questioning in classrooms: A sociolinguistic perspective. Review of Edu-
cational Research, 61, 157–178.
Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C., & Loef, M. (1989). Using knowledge
of children’s mathematical thinking in classroom teaching: An empirical study. American
Educational Research Journal, 26(4), 499–531.
Cazden, C. B. (2001). Classroom discourse:The language of teaching and learning (2nd ed.). Ports-
mouth, NH: Heinemann.
Coffey, J. E., Hammer, D., Levin, D. M., & Grant, T. (2011). The missing disciplinary substance
of formative assessment. Journal of Research in Science Teaching, 48(10), 1109–1136.
Crossouard, B., & Pryor, J. (2012). How theory matters: Formative assessment theory and
practices and their different relations to education. Studies in Philosophy and Education,
31(3), 251–263.
Davis, B. (1997). Listening for differences: An evolving conception of mathematics teaching.
Journal for Research in Mathematics Education, 28(3), 355–376.
Davis, E. A., Petish, D., & Smithey, J. (2006). Challenges new science teachers face. Review of
Educational Research, 76(4), 607–651.
Duff, P. A. (2002). The discursive co-construction of knowledge, identity, and difference: An eth-
nography of communication in the high school mainstream. Applied Linguistics, 23, 289–322.
Echevarria, J., Short, D., & Powers, K. (2003). School reform and standards-based education: How
do teachers help English language learners? (Technical report). Santa Cruz, CA: Center for
Research on Education, Diversity & Excellence.
Echevarria, J. J., Vogt, M. J., & Short, D. J. (2013). Making content comprehensible for elementary
English learners:The SIOP model. Upper Saddle River, NJ: Pearson Higher Ed.
Fang, Z., & Schleppegrell, M. J. (2010). Disciplinary literacies across content areas: Support-
ing secondary reading through functional language analysis. Journal of Adolescent & Adult
Literacy, 53(7), 587–597.
Fuchs, L. S., Fuchs, D., Hamlett, C. L., & Stecker, P. M. (1991). Effects of curriculum-based
measurement and consultation on teacher planning and student achievement in math-
ematics operations. American Educational Research Journal, 28(3), 617–641.
Fuchs, L. S., Fuchs, D., Karns, K., Hamlett, C. L., Dutka, S., & Katzaroff, M. (2000). The
importance of providing background information on the structure and scoring of per-
formance assessments. Applied Measurement in Education, 13(1), 1–34.
Furtak, E. M., Thompson, J., Braaten, M., & Windschitl, M. (2012). Learning progressions to
support ambitious teaching practices. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning
progressions in science. Rotterdam, The Netherlands: Sense Publishers.
Goldenberg, C. (2013, Summer). Unlocking the research on English learners: What we
know—and don’t know—about effective instruction. American Educator, 4–38.
González-Howard, M., & McNeill, K. L. (2016). Learning in a community of practice: Fac-
tors impacting English-learning students’ engagement in scientific argumentation. Journal
of Research in Science Teaching, 53(4), 527–553.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research,
77(1), 81–112.
Heritage, M. (2007). Formative assessment: What do teachers need to know and do? Phi
Delta Kappan, 89(2), 140–146.
Heritage, M. (2010). Formative assessment and next generation assessment systems: Are we losing
an opportunity? Paper prepared for the Council of Chief State School Officers. Los Ange-
les: UCLA National Center for Research on Evaluation, Standards, and Student Testing
(CRESST).
Heritage, M., & Heritage, J. (2013). Teacher questioning: The epicenter of instruction and
assessment. Applied Measurement in Education, 26(3), 176–190.
Hershberger, K., & Zembal-Saul, C. (2015, February). KLEWS to explanation building in
science: An update to the KLEW chart adds a tool for explanation building. Science and
Children, 66–71.
Hiebert, J., Morris, A. K., Berk, D., & Jansen, A. (2007). Preparing teachers to learn from
teaching. Journal of Teacher Education, 58(1), 47–61.
Hudicourt-Barnes, J. (2003). The use of argumentation in Haitian Creole science classrooms.
Harvard Educational Review, 73(1), 73–93.
Jones, E. D., & Krouse, J. P. (1988). The effectiveness of data-based instruction by student
teachers in classrooms for pupils with mild learning handicaps. Teacher Education and
Special Education, 1(1), 9–19.
Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assess-
ment in Education: Principles, Policy and Practice, 16(3), 263–268.
Lantolf, J. P. (2006). Sociocultural theory and L2. Studies in Second Language Acquisition, 28,
67–109.
Leach, J., & Scott, P. (2002). Designing and evaluating science teaching sequences: An
approach to drawing upon the concept of learning demands and a social constructivist
perspective on learning. Studies in Science Education, 38, 115–142.
Lee, O., & Buxton, C. A. (2013). Integrating science and English proficiency for English
language learners. Theory Into Practice, 52(1), 36–42.
Lee, O., & Fradd, S. H. (1998). Science for all, including students from non-English-Language
backgrounds. Educational Researcher, 27(4), 12–21.
Lee, O., Quinn, H., & Valdés, G. (2013). Science and language for English language learners
in relation to Next Generation Science Standards and with implications for Common
Core State Standards for English language arts and mathematics. Educational Researcher,
42(4), 223–233.
McKenna, J. J. (2016). NGSS phenomena: A heuristic for coming up with academically productive phenomena. Retrieved November 18, 2016, from https://static1.squarespace.com/
static/56e316c61bbee06d13210ed6/t/582fbda2e58c625a736aedea/1479523749199/
AHeuristicforPhenomena.pdf
McNeill, K., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students’ construc-
tion of scientific explanations by fading scaffolds in instructional materials. Journal of the
Learning Sciences, 15(2), 153–191.
Mehan, H. (1985). The structure of classroom discourse. In T. van Dijk (Ed.), Handbook of
discourse analysis (Vol. 3, pp. 119–131). London: Academic Press.
Moje, E. B., Ciechanowski, K. M., Kramer, K., Ellis, L., Carrillo, R., & Collazo, T. (2004).
Working toward third space in content area literacy: An examination of everyday funds
of knowledge and discourse. Reading Research Quarterly, 39(1), 38–70.
Moulding, B. D., Bybee, R. W., & Paulson, N. (2015). A vision and plan for science teaching and
learning: An educator’s guide to A Framework for K—12 Science Education, Next Generation
Science Standards, and state science standards. Salt Lake City: Essential Teaching and Learning
Publications.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting
concepts, and core ideas. Washington, DC: The National Academies Press.
Otero, V. (2006). Moving beyond the “Get it or don’t” conception of formative assessment.
Journal of Teacher Education, 57(3), 247–255.
Pica,T. (2008).Task-based teaching and learning. In B. Spolsky & F. K. Hult (Eds.), The hand-
book of educational linguistics (pp. 525–538). Malden, MA: Wiley-Blackwell.
Popham, W. J. (2008). Transformative assessment. Alexandria, VA: Association for Supervision
and Curriculum Development.
Quinn, H., Lee, O., & Valdés, G. (2012). Language demands and opportunities in relation to
Next Generation Science Standards for English language learners: What teachers need to
know. Commissioned Papers on Language and Literacy Issues in the Common Core State Stand-
ards and Next Generation Science Standards, 94, 32.
Randel, B., & Clark, T. (2013). Measuring classroom assessment practices. In J. H. McMillan
(Ed.), SAGE handbook of research on classroom assessment. Los Angeles, CA: Sage.
Ruiz-Primo, M. A., & Furtak, E. M. (2006). Informal formative assessment and scientific
inquiry: Exploring teachers’ practices and student learning. Educational Assessment, 11(3–
4), 205–235.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instruc-
tional Science, 18(119–144).
Schleppegrell, M. J. (2002). Challenges of the science register for ESL students: Errors and
meaning-making. In M.J. Schleppegrell & M.C. Colombi (Eds.), Developing advanced lit-
eracy in first and second languages: Meaning with power (pp. 119–142). Mahwah, NJ: Laurence
Erlbaum.
Schleppegrell, M. J. (2004). The language of schooling: A functional linguistics approach. Mahwah,
NJ: Erlbaum.
Supowitz, J. A., & Turner, H. M. (2000). The effects of professional development on sci-
ence teaching practices and classroom culture. Journal of Research in Science Teaching, 37(9),
963–980.
Valdes, G., Capitelli, S., & Alvarez, L. (2010). Latino children learning English: Steps in the journey.
New York: Teachers College Press.
van Es, E. A. (2011). A framework for learning to notice. In M. Sherin,V. Jacobs, & R. Philipp
(Eds.), Mathematics teacher noticing: Seeing through teachers’ eyes. New York: Taylor & Francis.
van Zee, E. H., Iwasyk, M., Kurose, A., Simpson, D., & Wild, J. (2001). Student and teacher
questioning during conversations about science. Journal of Research in Science Teaching,
38(2), 159–190.
Warren, B., & Rosebury, A. S. (2008). Using everyday experiences to teach science. In A. S.
Rosebury & B. Warren (Eds.), Teaching science to English language learners: Building on stu-
dents’ strengths (pp. 39–50). Washington, DC: NSTA Press.
Wiggins, G., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for
Supervision and Curriculum Development.
186 Amelia Wenk Gotwals and Dawnmarie Ezzo
Wiliam, D. (2013). Assessment: The bridge between teaching and learning. Voices from the
Middle, 21(2), 15–20.
Windschitl, M., & Thompson, J. (2013, September). The modeling toolkit: Making student
thinking visible with public representations. The Science Teacher, 63–69.
Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of
instructional practices and tools for teachers of science. Science Education, 96(5), 878–903.
Wiser, M., Smith, C., & Doubler, S. (2012). Learning progressions as tools for curriculum
development. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Cur-
rent challenges and future directions. Rotterdam: Sense Publishers.
Wolfe, S., & Alexander, R. J. (2008, December). Argumentation and dialogic teaching: Alternative
pedagogies for a changing world. London: FutureLab (Beyond Current Horizons).
Yoon, S. (2008). Using memes and memetic processes to explain social and conceptual influ-
ences on students understanding about complex socio-scientific issues. Journal of Research
in Science Teaching, 45(8), 900–921.
Zhang, M., Passalacqua, S., Lundeberg, M., Koehler, M.J., Eberhardt, J., Parker, J., Urban-
Lurain, M., Zhang, T., & Paik, S. (2010). “Science talks” in kindergarten classrooms:
Improving classroom practice through collaborative action research. Journal of Science
Teacher Education, 21, 161–179.
10
THE LANGUAGE OF MATHEMATICS
AND SUMMATIVE ASSESSMENT
Interactions That Matter
for English Learners
Focus Points
• Measurement of students’ mathematics knowledge is confounded by students’
facility in the language of the assessment, threatening the validity of the assess-
ment for students who are learning English.
• The language of mathematics includes the language practices by which stu-
dents engage in mathematics learning, as well as the syntax and discourse
features across a range of genres in mathematics, including the genre of sum-
mative mathematics assessment, which is evolving.
• Mathematics test items assess more than just “the math”; they include both
construct-relevant and construct-irrelevant language demands that interact
with students’ mathematical knowledge and expertise. Assessment developers
need to do a better job of minimizing construct-irrelevant difficulty in items.
• Teachers of mathematics need to prepare their students for the discipline-
specific language demands of mathematics, and policy makers need to under-
stand the limitations on the validity of inferences based on summative tests for
English learners (EL students).
• Assessment developers, teachers, and policy makers need robust understandings
of how sociocultural factors influence the way students make sense of math-
ematics and respond to mathematics test items.
Chapter Purpose
Students interact with mathematics through the use of language. In order for stu-
dents to make sense of a problem situation in a test item, they need to be able to
read, understand, and, in many cases, produce the language of mathematics. This
presents both challenges and opportunities for students learning English, because
mathematics is multisemiotic (O’Halloran, 2000, 2005, 2015; Schleppegrell, 2007;
188 Tina Cheuk et al.
Wong Fillmore & Snow, 2000): It is construed through symbolic expressions, visual
displays such as diagrams and graphs, correlational representations such as tables and
charts, and written text, as well as verbal communication and gesture. The purpose
of this chapter is to highlight the significance of the language of mathematics for
assessing student competencies in the context of summative testing, with particular
attention to EL students. We aim to provide guidance for teachers
who support mathematics learning for EL students, for assessment developers who
face the challenge of minimizing construct-irrelevant difficulty in test items, and
for policy makers who make inferences and decisions based on the results of sum-
mative tests.
First, we will provide an overview of current research on increasing EL students’
access to mathematics items. Then, we offer a framework for understanding the
language demands of summative mathematics assessment items. Lastly, we discuss
implications for interpreting EL student outcomes on high-stakes tests and offer
suggestions for future research and practice that might guide more valid inferences
about students’ competencies in mathematics and the language of mathematics.
patterns, teaching and learning styles, and epistemologies inherent in students’ cul-
tural backgrounds, as well as the socioeconomic conditions prevailing in their cul-
tural groups” (p. 553). How we assess across cultures often relies on assumptions we
have about student populations. If the assessments are designed for the dominant
culture of English-only students, then the inferences we are making about EL stu-
dents may be inaccurate or biased.
The high stakes of testing also affect the lives of teachers, principals, and district
and state leaders. Within each nested level of the system, assessments exert political,
legal, and personnel pressures on individuals that reach into the core of teaching
and learning (Black & Wiliam, 1998; Ho, 2014). When outcomes of summative
assessment are used to draw inferences about the performance and quality of teachers,
principals, and schools, these inferences should meet high standards of validity and
fairness. When summative assessments are tightly aligned to decisions related to
resource allocation in schools (e.g., course offerings, teacher assignments, student
placements), test developers need to ensure that an assessment system’s outcomes
accurately reflect and represent the competencies being measured.
Current approaches to interpreting summative assessment outcomes do not
accurately represent the complex nature of the language and linguistic features that
are inherent in items (Solano-Flores, 2014). Math items work together to measure
students’ competencies: items addressing a particular mathematics concept are
grouped to form a unidimensional scale for measurement. However, linguistic
and sociocultural features are also embedded in these same math items. As a result,
when correlations across item sets are analyzed, item difficulty arises from multiple
sources: the targeted mathematics competency, a student’s interaction with the
language of the targeted mathematics, and the student’s interaction with the item
itself. Students’ interactions with each item produce score variations that are
dependent on the linguistic demands inherent in the item (Solano-Flores & Li,
2009, 2013). While some linguistic demands directly interfere with students’
comprehension of mathematics, others do not. The challenge for item developers who
view language as an integral part of mathematics learning is that there are no
well-developed, explicit strategies or heuristics for working around the language of
mathematics without compromising the validity of the mathematics construct to be
tested.
The items found on high-stakes tests focus on targeted mathematics, and for the
most part, concepts are treated as unidimensional constructs that do not account
for the complexities of language demands. The language demands of mathematics
go beyond specific vocabulary or specialized terms, as mathematics is multisemiotic
in nature (O’Halloran, 2000, 2005, 2015; Schleppegrell, 2007; Wong Fillmore &
Snow, 2000). Assessment items may be represented by text, visuals, diagrams, tables,
graphs, animations of models, and symbols. Students must navigate among these
various forms, make meaning across these representations with their own linguistic
resources, and produce answers that match the expected mathematical and linguis-
tic outputs of the item.
English-dominant students and EL students may have different linguistic resources
that can help them comprehend and respond to math items. For EL students, perfor-
mance on mathematical word problems may be biased downwards by the language
demands that are inherent in both the task and in their productive responses when
compared to native English speakers (Abedi & Lord, 2001; Cuevas, 1984; Martiniello,
2008, 2009; Pimm, 1987; Roth, Ercikan, Simon, & Fola, 2015; Solano-Flores, 2006).
To the extent that language demands unrelated to mathematics increase the difficulty
of test items, estimates of student mathematics achievement are depressed. Variance
in scores due to construct-irrelevant difficulties can lead to errors in inferences about
mathematics achievement. While all students may face comprehension difficulties
with math items, EL students are impacted differentially.
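This differential impact can be made concrete with a toy model. The sketch below is our own illustration, not from the cited studies: it treats an item's language load as extra difficulty in a Rasch-style success probability, scaled by a hypothetical "language sensitivity" parameter. All names and numbers are illustrative assumptions.

```python
import math

def p_correct(ability, math_difficulty, language_load, language_sensitivity):
    # Toy Rasch-style model: the item's language load acts as additional
    # difficulty, scaled by how strongly the student is affected by the
    # item's language demands.
    logit = ability - (math_difficulty + language_sensitivity * language_load)
    return 1 / (1 + math.exp(-logit))

# Two students with identical mathematical ability face the same item;
# the second student is more sensitive to the item's language demands.
item = {"math_difficulty": 0.0, "language_load": 1.0}
p_low_sensitivity = p_correct(ability=0.5, language_sensitivity=0.2, **item)
p_high_sensitivity = p_correct(ability=0.5, language_sensitivity=1.0, **item)
print(round(p_low_sensitivity, 2), round(p_high_sensitivity, 2))
```

Under these assumptions the more language-sensitive student's expected score is lower even though the two students' mathematical ability is identical: construct-irrelevant language demands depress the estimate of mathematics achievement.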
3x + 2y = 5z
The first response describes a calculation being carried out, and in this descrip-
tion, “5z’s” has a different status than three x’s and two y’s; it is the answer.
The second response also describes a set of operations, but this response and
the third response are algebraic statements: The verb “is” or “equals” states that
the subject and the predicate refer to the same number; 3x + 2y is a different
name (expression) for “5z.” These different responses likely reflect students’ dif-
ferent experiences in moving among semiotic systems of mathematics, their
everyday language, and the language of mathematics. The processes of moving
Language of Math and Summative Assessment 191
among these spheres provide important opportunities for content and language
development for students. Teachers may point out that the “=” sign can be inter-
changeable with the word “is,” serving as an important marker for meaning
making in problem-solving contexts.
Researchers who hold the view that language is a structural component of
mathematics attend to the way linguistic features play out in the organization
of text (Crystal, 1997; Zwiers, 2014). This often includes technical and every-
day vocabulary, syntax, and grammar of the math tasks as units of analysis. It is
here that item developers have made the most significant progress in provid-
ing guidance related to supporting access for EL students (Abedi, 2008; Kief-
fer, Rivera, & Francis, 2012; Sato, 2008). These accommodations range from
modifications that include using simplified English in item design and providing
vocabulary assistance through the use of glossaries, to providing extended time
and matching the language of the test to the language of instruction or to stu-
dents’ primary language (e.g., Spanish) (Abedi, Courtney, & Leon, 2003; Kieffer,
Rivera, & Francis, 2012). These supports are intended to increase students’ com-
prehension of items and provide greater access for students to demonstrate their
competencies in mathematics.
While these accommodations aim at reducing the language load for EL students
when employed at the construct and item development level, they have produced
mixed student outcomes (Abedi, 2008; Abedi & Lord, 2001; Duran, 2008; Sato, 2008).
A promising language-related accommodation has been the strategy of linguistic
modification (Abedi, 2008; Sato, 2008). Without altering the math construct to be
tested, the aim of linguistic modification for items is to minimize construct-irrelevant
language so that EL students have greater access and interactions with the intended
mathematics of the item. A major challenge to this strategy is to ensure that the
integrity of the math construct is not altered.
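As a hypothetical illustration of linguistic modification (our own example, not an item drawn from the cited studies), the mathematics construct stays fixed while construct-irrelevant language is stripped away:

```python
# Hypothetical before/after pair illustrating linguistic modification.
# The mathematics construct (multiplying a unit rate by a quantity) is
# unchanged; only construct-irrelevant language is simplified.
original = (
    "A certain printing company, which was established in 1995, charges its "
    "customers, all of whom order in bulk, $3 per poster. If 40 posters are "
    "ordered by a customer, what is the total amount that the customer will "
    "be charged?"
)
modified = (
    "A printing company charges $3 per poster. A customer orders 40 posters. "
    "How much does the customer pay in all?"
)

# The construct-relevant quantities survive the modification ...
for quantity in ("$3", "40"):
    assert quantity in original and quantity in modified
# ... while construct-irrelevant detail (the founding year, the embedded
# clauses, the passive voice) is removed.
assert "1995" in original and "1995" not in modified
```

The modified version also converts the passive "are ordered by a customer" into an active statement, the kind of change the accommodation literature describes.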
In our current conditions of large-scale testing, we are constrained by time,
costs, scoring reliability, and limited feedback mechanisms to students and teachers.
These constraints limit testing to a meager sample of the range of work students can
produce under the current standards. Understanding this dilemma requires
examining the validity of these assessments, especially as they pertain to EL
students. The validity of test scores used to make inferences that impact EL students
depends on the validity of the test items. Construct-irrelevant language difficulties
are a major threat to this validity.
We agree with researchers who argue that decisions about individuals (e.g., stu-
dents, teachers, principals) based on summative assessment outcomes pose serious
validity questions related to EL student performance (Abedi, Lord, & Hofstetter,
1998; Abedi, Courtney, & Leon, 2003; Abedi & Gándara, 2006; Kiplinger, Haug, &
Abedi, 2000; Lane & Leventhal, 2015). After all, large-scale, standardized summative
tests are not designed to be tools to guide individual decision making, as they serve
as blunt instruments in measuring mathematics achievement (Wu, 2012). Rather,
large-scale assessments rely on statistical models that support inferences from sam-
ples to populations.
If we are interested in using large-scale assessment results to learn some-
thing about how well populations of EL students are learning mathematics, then
In the rest of this chapter, we will focus primarily on the first of these ques-
tions, with illustrations of how to analyze the role that construct-relevant and
construct-irrelevant language demands play in assessment items. Our analysis is
intended to support further discussion of questions about item design and item
quality.
Best Practices
We cannot untangle assessment of students’ mathematical knowledge from assessment
of students’ facility with the language of mathematics. However, to understand
how the interaction of language and mathematics generates challenges in the
context of summative assessment, we must distinguish between construct-relevant
and construct-irrelevant language.
It is unclear whether item-writing practices for summative assessments in
mathematics have sufficiently minimized construct-irrelevant comprehension
challenges. Between 25% and 37% of test items on U.S. state and national sum-
mative assessments in use during 2003–2006 were “inconsiderately written” (Daro,
Stancavage, Ortega, DeStefano, & Linn, 2007; DeStefano & Johnson, 2013). We
argue that these inconsiderately written items have contributed significantly to the
construct-irrelevant variance that manifests itself in both the true and observed
score of the item (see Figure 10.1). As we have stated, the goal for item writers is
FIGURE 10.1 Relationships among the construct, true score, and observed score:
valid measurement links the construct to the true score; construct underrepresentation
is the portion of the construct left unmeasured; construct-irrelevant variance and
error separate the observed score from the true score.
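The decomposition in Figure 10.1 can be written in classical test theory terms. This is our formalization, with symbols we introduce for illustration, and it assumes the standard condition that the error term is uncorrelated with the true score:

```latex
% X: observed score, T: true score, E: random error
% C_rel: construct-relevant component, C_irr: construct-irrelevant component
X = T + E, \qquad T = C_{\mathrm{rel}} + C_{\mathrm{irr}}
\mathrm{Var}(X) = \mathrm{Var}(C_{\mathrm{rel}}) + \mathrm{Var}(C_{\mathrm{irr}})
               + 2\,\mathrm{Cov}(C_{\mathrm{rel}}, C_{\mathrm{irr}}) + \mathrm{Var}(E)
```

In this notation, construct-irrelevant language demands inflate the variance of the irrelevant component, while construct underrepresentation corresponds to portions of the intended construct that contribute nothing to the true score because no item samples them.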
between “everyday” language and a language that is used expressly for mathemati-
cal purposes.
As stated previously, the specialized disciplinary representational schemes
can contribute to the construct-relevant difficulty of an assessment item.
Table 10.1 summarizes some common linguistic demands that may hinder
student comprehension and is divided into two parts: language features that
contribute to construct-irrelevant variance and language features that are
construct-relevant. The construct-relevant language features can contribute to
item difficulty but reflect appropriate means for conveying and measuring a
disciplinary construct.
TABLE 10.1 Construct-relevant and construct-irrelevant language features of
mathematics assessment items

Construct-relevant: Story features that make the mathematical question engaging.
Construct-irrelevant: Story features that engage interest away from the mathematical question.

Construct-relevant: Compound sentences used to make logical relations explicit (e.g., If . . ., then . . .).
Construct-irrelevant: Compound sentences that could be broken up without losing logical connections.

Construct-relevant: Sentence complexity used to express logical relations among quantities: clauses and phrases joined by “and, or, if, let . . .”
Construct-irrelevant: Sentence complexity used to express the context of a problem; unnecessary embedded clauses.

Construct-relevant: Phrases used to express quantitative relations: phrases using “of, per, each, every, some, all, for all, one, a, the, more than, less than, taller, as tall as, the same as, equal, greater than . . .”
Construct-irrelevant: Use of quantifier words referring to something other than quantities of interest.

Construct-relevant: Use of passive voice to make abstract statements.
Construct-irrelevant: Unnecessary use of passive voice to describe action in a word problem.

Construct-relevant: Phrases used to refer and co-refer to quantities known, unknown, or variable.
Construct-irrelevant: Pronouns whose reference is ambiguous; different words used to refer to the same identity (character, quantity, situation); in a multiple-choice problem, reference to the answer set embedded in the context language.

Construct-relevant: Symbolic expressions.
Construct-irrelevant: Unfamiliar non-mathematical notations or symbols.

Construct-relevant: Cartesian graphs and standard tables.
Construct-irrelevant: Esoteric graphs, charts, or tables.

Construct-relevant: Mathematical representations specified in standards, such as number lines, double number lines, tables, tape or bar diagrams, histograms, arrays.
Construct-irrelevant: Unfamiliar charts and diagrams, odd representations of data, violations of common conventions in diagrams or charts.

Construct-relevant: Diagrams that show relationships between/among quantities.
Construct-irrelevant: Illustrations that provide context but convey no information relevant to solving the problem.

Construct-relevant: Diagrams of geometric figures that conform to conventions.
Construct-irrelevant: Violations of conventions in diagrams of geometric figures.
Item Analysis
To solve a word problem, a student must make sense of the problem situation.
Making sense of the problem situation means building a “situation model,” or mental
model, while comprehending the text of the word problem (Davis, 1984; Kintsch, 1974;
Kintsch & Van Dijk, 1978). In other words, reading comprehension of the word
problem and making sense of the mathematics of the problem overlap to a great
extent. If reading comprehension is made more difficult by construct-irrelevant
demands, greater error is introduced into the measurement of the construct. If such irrelevant
comprehension demand is common across items, then systematic errors are intro-
duced, impacting all learners, especially EL students.
In what follows, we analyze two test items (Figures 10.2 and 10.3) to dis-
tinguish between construct-relevant and construct-irrelevant features of word
problems. Our analysis attends to two dimensions of difficulty—language
and mathematics—and includes three parts: (1) likely sources of difficulty;
(2) quantities and inferences in the problem; and (3) construct-relevant and
construct-irrelevant language demands.
6 2 8
The three digits above can be used to make 6 different 3-digit numbers.
If one of the 3-digit numbers is picked at random, what are the chances that it will be an
odd number?
A. Impossible
B. Possible but not very likely
C. Very likely but not certain
D. Certain
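The mathematics of this item can be checked directly. The short script below is our own sketch, not part of the item or the chapter: it enumerates the six orderings of the digits and counts the odd ones.

```python
from itertools import permutations

digits = (6, 2, 8)
numbers = sorted(100 * a + 10 * b + c for a, b, c in permutations(digits))
odd = [n for n in numbers if n % 2 == 1]

print(numbers)   # the 6 different 3-digit numbers
print(len(odd))  # 0: every ordering ends in an even digit
```

Because 6, 2, and 8 are all even, every one of the six numbers is even, so the correct response is A (Impossible); as the chapter notes, the demanding work in this item is semiotic and linguistic rather than numerical.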
Construct-relevant: All of the inferences about the quantities given require students
to engage in the CCSS Standards for Mathematical Practice, specifically MP1, MP2,
and MP4. Although there are many points within the sensemaking process where
students can go off track, this kind of mathematical sensemaking is construct-relevant.
Imagining the four volumes and their relationships is construct-relevant.

Construct-irrelevant: “When an additional 6.5 liters of water are poured into the
container, some water overflows.” The language in this sentence may get in the way
of mathematical sensemaking. “When an additional 6.5 liters of water are poured
into the container” is in passive voice. Presumably, someone (subject) pours an
additional 6.5 liters of water (object) into the container, and some water overflows.
The subject is not explicitly mentioned, obscuring the intended meaning.

Construct-relevant: The measurement terms such as “rectangular container,”
“length,” “width,” “height,” “depth,” and “liters” cannot be replaced or simplified
and contribute to the mathematical construction of the problem.

Construct-irrelevant: The directions are in the genre of procedural narrative: “Use
words, pictures, and numbers to explain your answer” leads with the objects. These
could be worded better as a directive: “Explain your answer using words, pictures,
and numbers.”
layered on top of this mathematical content dials up the complexity of the item into
CCSS Grade 7 territory, conceptually if not technically; this contextual layer carries
much greater potential for interference in mathematical sensemaking for EL students.
The semiotic mathematical work of this item is more complex than the numeri-
cal work involved. The first sentence, which contains the references “three digits,”
The inferences that must be made to coordinate among these quantities, par-
ticularly given their complex representations in words, present multiple opportu-
nities for construct-irrelevant difficulty to introduce bias into the assessment.
The current model of item development as shown in Figure 10.4 begins with
the notions we have regarding the latent construct, centered predominantly on
students’ knowledge of mathematics. What we know is that the language and lit-
eracy processes that support students’ comprehension of the mathematics are also
inherent at the construct level. The theories that item developers use in guid-
ing their construct map development and the subsequent downstream activities
need to include both our notions about mathematics learning and the construct-
relevant language demands that are required to comprehend the item. More research,
including in the instructional context of the classroom, is needed to develop strong
theories around our notions of the interactions between mathematics learning and
language so that the inferences that are made are valid and fair for all learners.
In this chapter, we have attended specifically to the construct-relevant and
construct-irrelevant demands found in summative assessment items. These language
issues create threats to validity in the inferences we make for all students, espe-
cially EL students. While reducing or eliminating unnecessary construct-irrelevant
language demands at the item level may increase the precision and validity of the
items in regard to the mathematics tested, there needs to be greater systematic vet-
ting and review of draft items before field testing with diverse populations. Also, our
theories about how students learn mathematics through language use need to be
augmented. Kopriva, Thurlow, Perie, Lazarus, and Clark (2016) argue that, “incor-
porating the person dimension into validity arguments requires that the quality
of the evidence supporting or questioning validity be considered in terms of the
degree to which the interpretations are valid for all test takers” (p. 109). In other
words, an individual student’s interactions with the item serve as an important
dimension in how we think about individual differences across the construct, the
design of the items, and validity of the score interpretations. Theory building and
empirical testing need to work in concert with one another so that we have a more
FIGURE 10.4 The item development continuum: notions regarding the trait →
construct → items → responses → scores → inferences, with a validity argument
supporting the inferences.
comprehensive view about the interactions between mathematics learning and lan-
guage use by students, both in classroom settings and assessment settings.
Using summative tests to make decisions about individuals is fraught with risks.
The problem of the wide error band around an individual’s score, as well as the way
language demands contribute to construct-irrelevant variance, is well known to meas-
urement experts. The risks associated with measurement errors compounded with
the construct-irrelevant variance limit the scope and depth of the inferences we
can make about EL students. The score reliability, score precision, comparability, and
equating across various subgroups pose major psychometric challenges for the field
(Abedi, 2002; Lane & Leventhal, 2015; Sireci, Han, & Wells, 2008). These risks are
compounded by bias and depression of performance for all the reasons stated earlier.
Research on item modification that tests accommodations intended to increase
students’ access to mathematics tasks is needed (Abedi, 2008; DeStefano & Johnson,
2013; Lane & Leventhal, 2015; Sato, 2008); however, these types of accommoda-
tions only address students’ interactions with the items themselves. If any other
parts of the continuum are flawed (see Figure 10.4), the investments we make at
this juncture can only mitigate a portion of the errors and variance that are pro-
duced when EL students are assessed. More methodological research is needed on
multidimensional models, as well as measurable constructs that integrate mathemat-
ics and language (Briggs & Domingue, 2014). Research that takes into account
both instructional and assessment strategies that influence students’ access to—and
production of—the language of mathematics will deepen our knowledge of the
interactions that matter for EL students.
References
Abedi, J. (2002). Standardized achievement tests and English language learners: Psychometric
issues. Educational Assessment, 8(3), 231–257.
Abedi, J. (2008). Linguistic modification. Part I—Language factors in the assessment of English
language learners: The theory and principles underlying the linguistic modification approach.
Washington, DC: LEP Partnership.
Abedi, J., Courtney, M., & Leon, S. (2003). Effectiveness and validity of accommodations for English
language learners in large-scale assessment. (CSE Tech. Rep. No. 608). Los Angeles, CA: Uni-
versity of California: Center for the Study of Evaluation/National Center for Research
on Evaluation, Standards, and Student Testing.
Abedi, J., & Gándara, P. (2006). Performance of English language learners as a subgroup in
large-scale assessment: Interaction of research and policy. Educational Measurement: Issues
and Practice, 26(5), 36–46.
Abedi, J., & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement
in Education, 14(3), 219–234.
Abedi, J., Lord, C., & Hofstetter, C. (1998). Impact of selected background variables on students’
NAEP math performance. Los Angeles, CA: UCLA Center for the Study of Evaluation/
National Center for Research on Evaluation, Standards, and Student Testing.
Amrein, A. L., & Berliner, D. C. (2002). High-stakes testing, uncertainty, and student learning.
Education Policy Analysis Archives, 10(18), 1–74.
Au, W. (2007). High-stakes testing and curricular control: A qualitative metasynthesis. Educa-
tional Researcher, 36(5), 258–267.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education,
5(1), 7–73.
Focus Points
• Scientific explanation, argument, and prediction genres differ from anal-
ogous genres in nonscientific contexts (e.g., vernacular or polemic); Bakhtin
suggests that people rely on different social languages in these different contexts.
• Developing capacity for scientific genre production poses a double challenge
for English learners (ELs) because they need to simultaneously learn the new
national language of their adopted country and the new social language of
scientific discourse.
• Producing scientific explanations, arguments, and predictions requires drawing
on a related suite of scientific knowledge and practice.
• Learning progression frameworks, which describe trajectories from less sophis-
ticated, informal discourse to more sophisticated, scientific discourse, and asso-
ciated assessments can be used to examine students’ developing capacity to
produce scientific explanations, arguments, and predictions.
Chapter Purpose
Explanation, argument, and prediction are basic functions of all languages. We
engage in these practices every day in many contexts. Scientific communities have
developed explanation, argument, and prediction as specialized genres that differ
from vernacular versions in obvious ways (e.g., using technical terminology and
mathematical formulas) and in less obvious but deeper ways involving the goals
and uses of language. Deeper characteristics that separate scientific genres from ver-
nacular ones are reflective of disciplinary discourse conventions that students must
learn in order to participate in the community of science. Bakhtin (1981) describes
the patterns of difference in language use among communities as different social
languages.
Assessing Scientific Genres 207
vernacular and scientific uses as the occasion demands. In this respect, we follow
Bakhtin (1981; Wertsch, 1991) in distinguishing between national languages (English,
Spanish, Japanese, etc.) and social languages, which Bakhtin (1981) associates with
“social dialects, characteristic group behavior, professional jargons . . . languages that
serve the specific sociopolitical purposes of the day” (p. 262).
Gee’s (1991) definition of a discourse resembles Bakhtin’s definition of a social
language:
[A] socially accepted association among ways of using language, of thinking, and
of acting that can be used to identify oneself as a member of a socially meaning-
ful group or “social network.” . . . Think of discourse as an “identity kit” which
comes complete with the appropriate costume and instructions on how to act
and talk so as to take on a particular role that others will recognize.
(p. 3)
Gee (1991) describes the challenges that we face in science education by dis-
tinguishing between primary discourses that we acquire in our homes and secondary
discourses that we learn in other social settings:
All humans . . . get one form of discourse free, so to speak. . . . This is our
socio-culturally determined way of using our native language in face-to-face
communication with intimates. . . . Beyond the primary discourse, however,
there are other discourses which crucially involve institutions beyond the
family. . . . Let us refer to these institutions as secondary institutions (such as
schools, workplaces, stores, government offices, businesses, or churches) . . . .
Thus we will refer to them as “secondary discourses.”
(pp. 7–8)
Within the broad patterns of language and practice defined by discourses or social
languages, a genre is a type of communication that is characterized by a set of expectations,
vocabularies and styles embedded in the discourse community that uses that genre (Fang,
Schleppegrell, & Moore, 2014; Wertsch, 1991, 1994). The genres we focus on (explanation,
argument, and prediction) are common in many different discourse communities
and are associated with different expectations, vocabularies, and styles depending on
context and community. When a participant in the scientific community encounters an
example of one of these genres in a scientific context, there are expectations for what
vocabularies and styles are used and which rules or principles need to be followed.
Explanations, arguments, and predictions are all genres that students are familiar
with before they start studying science in school. Scholars have investigated how
people engage in these genres in nonscientific contexts. In our research, we have
seen that learners carry their nonscientific ways of thinking, talking, and writing
into school science contexts. Primary (i.e., home community) discourse is appro-
priate in many situations and makes it possible for people to act efficiently. How-
ever, there are contexts in which primary discourse genres lack the power and
precision of related scientific genres. Below we review research that has helped us
characterize the nonscientific genre performances that students bring to school
science contexts.
Explanations
Three research-based ideas that we have found useful for interpreting students’
intuitive explanations include force-dynamic reasoning (Pinker, 2007; Talmy, 1988),
embodied experience (Warren & Rosebery, 2008), and covering laws (Braaten &
Windschitl, 2011).
210 Beth Covitt and Charles W. Anderson
Force-Dynamic Reasoning
Force-dynamic reasoning describes how people commonly think about, talk about, and
make sense of the world in their everyday lives and home communities (Pinker, 2007;
Talmy, 1988).
When people use force-dynamic reasoning to explain phenomena, they are view-
ing the world as a place where actors seek to achieve innate purposes and confront
antagonists that can hinder their ability to achieve those purposes. Force-dynamic
reasoning is shaped and constrained by the grammatical structure of language used in
everyday contexts. While force-dynamic discourse works well in everyday contexts
(e.g., to explain people’s actions and motivations), it is inconsistent with explaining
scientific phenomena such as trees growing or relationships among trophic levels in
ecosystems.
Embodied Experience
Warren and Rosebery (2008, p. 44) suggest that, “[w]hen faced with a physical or
biological phenomenon that needs to be explained, children regularly compare
aspects of it to their own lived experiences, experiences they most likely have
not analyzed before for academic purposes.” From a genre assessment perspective,
explanations based on embodied experience provide a window into how students
are making sense of a phenomenon.
investigating which type of fertilizer causes plants to grow best, but not why). Simi-
larly, in an examination of argumentation practices of high school and college stu-
dents, Jin, Mehl, and Lan (2015) found that as students constructed arguments, they
struggled with using scientific mechanisms. Their arguments often consisted only
of “a simple cause-and-effect chain” (Jin et al., 2015, p. 1156). In these approaches
to inquiry and argument, evidence and claims are used to inform what happens
without needing to address why.
Engineering Mode
Rath and Brown (1996) found that elementary students engaging in science inquiry
often adopt an “engineering mode,” in which the objective is to make something
happen. In engineering mode, actions, manipulation of variables, and evidence from
activities serve the purpose of accomplishing some physical outcome (e.g., using
a magnifying glass to burn paper) rather than developing model-based reasoning.
Schauble, Klopfer, and Raghavan (1991) describe a similar approach, which they
call an engineering model of experimentation. In contrast, learning experiences aimed
at developing model-based reasoning must serve the purpose of building students’
capacity to use scientific models to explain and predict how and why phenomena
in the material world occur (Braaten & Windschitl, 2011).
Predictions
Scholarship addressing heuristics has also informed understanding of how nonscientists
make predictions. Humans have two separate systems for perceiving, thinking,
and deciding: one fast and intuitive, the other slow and analytical (Kahneman,
2011). People rely mostly on the fast system, but science requires the slower,
more analytical one. For example,
when they make predictions, scientists have careful and systematic ways, including
statistical methods, to deal with uncertainty and stochasticity, validate patterns in
evidence, and use scientific models to make qualified projections about the future.
In contrast, people often use intuitive nonscientific ways of making predictions
such as coming to immediate conclusions that feel certain, fitting evidence into
pattern stories that make sense but may not account for all data, and relying on
information and sources that agree with their personal narratives (e.g., Haidt, 2001;
Kahneman, 2011; Silver, 2012).
Developing Frameworks
Each of our learning progressions includes a learning progression framework, which
describes characteristic performances of students at different levels, and assessments
(written questions and clinical interviews) that provide data both for developing
the frameworks and for assessing the progress of individual students.
Drawing on both the scientific literature and data from middle school
through university level students, K-12 teachers, and scientists, we have articu-
lated frameworks that identify characteristics evident in genre performances at
different levels of sophistication in knowledge and practice. Because our research
is grounded in sociocultural approaches and perspectives, our frameworks for
analysis, while they focus on students’ written and spoken language, are differ-
ent from frameworks adopted in functional linguistic approaches. For example,
our approach aligns with a functional linguistic analysis in aspects such as focus
on meaning (semantics) and voice (Fang, 2004; Fang, Schleppegrell, & Moore,
2014). However, our approach diverges in that we do not explicitly analyze
1. Connections among genres: How are scientific explanations, arguments, and pre-
dictions connected to one another in terms of language use, knowledge, and
practice?
2. Describing and measuring students’ progress: How can we describe and measure
learners’ progress as they develop proficiency in creating scientific explana-
tions, arguments, and predictions?
TABLE 11.1 Characteristics of Scientific Genre Performances Involving Carbon-
Transforming Processes (columns: Characteristics of Upper Level “Scientific”
Performances; Macro Scale Explanations; Macro Scale Arguments; Large Scale
Explanations & Predictions)
of conservation of matter and energy to explain what happens when wood burns.
Further, he connects the macroscopic scale phenomenon of burning to the atomic-
molecular scale when describing what happens to molecules and molecular bonds
during the chemical reaction of combustion.
basis. So I did for the 2015, just a little bit higher than 2014. And then
you kind of take that data out a little bit further and for another five
years, and I think we go up quite a bit from the current time.
Interviewer: Are you making any assumptions when you make those predictions?
Hans: Yeah, that it continues at the same rate, or that we don’t do anything
to curb our carbon dioxide use.
Interviewer: Right. Do you think that that’s exactly the amount that it’s going to
have, or that it’s going to fall in a range or how are you thinking about
that?
Hans: I think it probably is going to fall into some sort of range. I think the
2015 number can be more accurately predicted than the 2020 number
because you’re going to be doing more extrapolation.
While Hans’s performance has a few problems, his response demonstrates several
characteristics of a scientific prediction. For example, Hans connects his predic-
tion to a scientific model when he states he is assuming “we don’t do anything to
curb our carbon dioxide use.” It is problematic that Hans suggests we “use” carbon
dioxide rather than fuel, which is converted to carbon dioxide and water during
combustion. However, Hans is clearly connecting the data and his prediction to
events, processes, and actions in the world rather than viewing the graph in a purely
numerical light. Hans also recognizes that the data form an annual pattern, so he
can make his prediction specifically for the month of May. Further, Hans’s refer-
ences to extrapolation suggest he is familiar with identifying trends and recognizing
uncertainty in data (i.e., recognizing that there is more uncertainty the further one
extends into the future, in part because it is increasingly uncertain what emissions
levels will be the further we move into the future).
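Hans's intuition that "the 2015 number can be more accurately predicted than the 2020 number because you're going to be doing more extrapolation" can be sketched numerically. The example below is not from the chapter: the CO2-like data, the linear trend fit, and the widening rule are all invented for illustration.

```python
# Illustrative sketch only: invented CO2-like data, not the chapter's data.
# It mimics Hans's approach of fitting a trend and extrapolating, with an
# uncertainty band that widens as the prediction moves beyond the data.
import numpy as np

years = np.arange(2000, 2015)  # observed years (synthetic)
rng = np.random.default_rng(0)
ppm = 370 + 2.0 * (years - 2000) + rng.normal(0, 0.8, years.size)  # fake ppm values

# Fit a linear trend to the observations.
slope, intercept = np.polyfit(years, ppm, 1)

def predict(year):
    """Return a point prediction and a crude half-width for its range.

    The widening rule below is a simple heuristic chosen for illustration,
    not a statistically proper prediction interval.
    """
    point = slope * year + intercept
    resid_sd = np.std(ppm - (slope * years + intercept))  # scatter around trend
    half_width = resid_sd * (1 + 0.25 * max(0, year - years.max()))
    return point, half_width

near_point, near_width = predict(2015)  # one year past the data
far_point, far_width = predict(2020)    # six years past the data
# The 2020 range is wider than the 2015 range: more extrapolation,
# more uncertainty, which is Hans's point.
```

A real analysis would derive a proper prediction interval from the regression; the heuristic here only makes the qualitative point that uncertainty grows with extrapolation distance.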
Summary
By applying an integrated assessment approach, we find that scientific genre perfor-
mances may be characterized as connected, canonical, and context-specific. They are
connected in the sense that they require overlapping sets of knowledge and practice.
For example, all of the performances rely on conservation of matter and energy as
fundamental principles that constrain what is possible. They are canonical in that the scientific genre
performances require access to model-based scientific explanations of carbon trans-
forming processes and a commitment to tracing matter and energy through systems.
Finally, they are context-specific in that successful performances require specific knowl-
edge about the systems and processes being explained or investigated (e.g., growing
plants, burning wood, CO2 concentrations in the atmosphere).
Thus, Eric’s explanation of combustion required that he develop a canonically
aligned explanation connecting the tracing of matter and energy across macro-
scopic and atomic-molecular scales. Olivia’s argument critique required that she
could analyze and interpret data and align those data with her scientific model in
a principled critique of how the presented claims were or were not supported by
evidence. Finally, Hans’s prediction performance required the capacity to identify
trends in data, deal with uncertainty in data, and coordinate data and evidence with
a relevant scientific model and explanation.
• If students are not tracing and conserving matter and energy, what are they
doing with matter and energy in their explanations, arguments, and predictions?
• When students have limited access to scientific models or limited capacity to
make sense of data and evidence, how do they critique arguments they encounter?
• What are alternative ways that students use data and models to make predictions?
Explanation
Table 11.2 summarizes contrasting characteristics of macroscopic scale explanations
reflecting lower (informal), intermediate, and higher (scientific) levels of discourse.
Felicia: The weight [of the plant] comes mostly from H2O it receives, which it
uses in its light reactions to eventually produce glucose to provide itself
with energy.
Felicia’s performance shows she is trying to trace matter and energy but that she
has not yet mastered this practice. For example, when she says the plant uses water
in light reactions to produce glucose to provide itself with energy, is Felicia tracing
energy from “light reactions” or from the plant itself? Her language is not clear on
this point. Felicia also accesses a less sophisticated model of plant growth, suggesting
that the mass of the plant comes mostly from water.
In the following example, high school student Richard explains what happens
to energy when fuel is burned.
Richard: The gasoline is burned while it’s in the engine, and all the bonds in it
are broken and rearranged. And then it goes out the exhaust into the
atmosphere as carbon dioxide. . . .
Interviewer: So where does the energy initially in the gasoline go?
Richard: It runs through the engine and then is converted to carbon dioxide.
Like Felicia, Richard is also working on tracing matter and energy but has not
yet achieved expertise in this practice. Richard recognizes that changes are occur-
ring in molecular bonds during burning, but he converts energy to matter in com-
bustion rather than tracing chemical energy from molecular bonds in gasoline to
other forms of energy.
Reaganne: I think their [the plants’] weight comes from the soil and fertilizer
because as it grows it increases in weight and fertilizer and soil are the
things that make a plant grow.
that the plant is enabled or helped to grow by fertilizer and soil. Tracing matter is
not particularly relevant to Reaganne’s way of thinking and talking about plant
growth.
Next, Jenna provides a lower level explanation of wood burning.
Jenna: The wood burns into ash and it loses weight because it is losing mass.
Argument
Table 11.3 summarizes characteristics of macroscopic scale argumentation reflect-
ing increasingly sophisticated levels of discourse.
Note that there is a hierarchical component to the genres of explanation, argu-
ment, and prediction. Thus, the characteristics of levels of explanation in Table 11.2
are also relevant for argument (Table 11.3). Similarly, characteristics of levels of
explanations and arguments are also relevant for understanding approaches to pre-
diction (summarized in Table 11.4).
TABLE 11.3
Learning Progression for Critiquing Arguments About Macroscopic Scale
Phenomena
Spencer has learned (likely in school science) that carbon dioxide from the air
is used by plants to make glucose, but he still retains the conflicting idea that most
of plant mass must come from soil. Spencer’s conflicting statements (e.g., that he
doesn’t “know that the stuff from the air would give it much mass”) show how dif-
ficult it can be for students to accommodate scientific ideas that conflict with their
own embodied experience concerning how the world works.
With regard to interpreting and analyzing data, Spencer does not attend to the
key quantitative comparison between the mass that the plant gained and the mass
that the soil lost. He focuses instead on a qualitative causal connection; adding
nutrients to the soil helps the plant gain weight. So rather than quantitatively trac-
ing matter he qualitatively traces cause and effect. In critiquing Karen and Mike’s
arguments, Spencer retains dual competing explanations about the system and does
not commit to one of the claims in his argument critique. He has pieces of the
scientific model in mind but is not committed to the scientific model. He traces
matter qualitatively (e.g., understands the weight needs to come from somewhere)
but does not see the need to trace matter using the quantitative data.
Prediction
Table 11.4 summarizes characteristics of discourse levels for large-scale predictions
beyond those shown in Tables 11.2 and 11.3.
already asked the student to identify and explain the trend in the graph. Note that
while there is some variability (noise) in these data, the trend (signal) of decreasing
ice extent is relatively clear.
Interviewer: So go ahead and use a ruler or whatever and decide where you think
those would be.
Alice: [Marks Figure 11.2.] I would say this is roughly 10.
Interviewer: And 10.25 in 2014. In 2019 it’s 10? And how did you decide on those
two points?
Alice: I just followed the trend. I’m not too sure since it fluctuates a lot,
so. . . .
Interviewer: So you mentioned it fluctuating; so are these values, like, set in stone,
or is there a range?
Alice: No.
Interviewer: What do you think the range would be for these guys [predictions of
10.25 and 10]?
Alice: It could be anywhere from like 9.5 to 11. Probably not to 11, probably
to like 10.75. And then for 2019 it could go like anywhere from like
9 to 10.5. No, there’s a range.
Interviewer: How did you decide on those ranges?
Alice: I just guessed. Like looking like off this data too. Those are hypothesis,
I guess.
Interviewer: So how did you use these data to—the previous data to make your
hypothesis?
Alice: It’s like looked at the trend, like that’s where I saw the range like it
could either go up or down, like, so I figured that in. And like how far
it goes up or how far it goes down.
Alice’s performance (see also graph in Figure 11.2) reveals a mixture of less and
more sophisticated knowledge and practice. Several practices that Alice performs
quite well include identifying a data trend line for Arctic Sea ice and extend-
ing that trend line to make a prediction for the future. Alice also recognizes vari-
ability in the graph and how that variability lessens the ability to make a precise
prediction—leading Alice to acknowledge that there is a range within which the
extent of ice is likely to fall. Somewhat less sophisticated practices evident in Alice’s
performance include that she only refers to the graph itself to inform her predic-
tion and does not consider factors in the real-world system that the graph reflects
(e.g., global temperature trends and variabilities). Also, Alice does not have a sophis-
ticated way to talk about dealing with uncertainty. Her approaches include “guess-
ing” and looking at “how far it goes up or how far it goes down.”
Interviewer: And tell me your thinking about that. You went up a little bit for 2015
and then down and back up again.
Analise: Yes. The reason I think I did that is because I think there was a peak.
And then it decreases and then it goes up like I think I just looked at
this part. Here it goes up and then down.
Interviewer: And do you think that that’s exactly where it will be, or can you pic-
ture a range of possible answers?
Analise: I think there could be a range of possible answers because like here
with this time frame these years it kind of all, like, was the same and
then it just dropped. So for this part it could just drop or it could just
go up.
Like Alice, Analise does not perceive this to be a question that requires a scien-
tific explanation. She focuses on the shape of the graph to the exclusion of the real-
world system that the graph represents. Analise also has trouble dealing with noise
in the graph and may be conflating variation or noise in the graph with a pattern,
making the incorrect inference that there is a pattern in which the ice extent goes
up and then down each successive year. Making this inappropriate inference leads
Analise to make a prediction that is more specific than the data and model suggest
is reasonable, following an “up then down” pattern of variation.
Summary
Examining students’ intermediate and lower level genre performances illuminates
the knowledge and practices that these students bring with them when engaging in
science. Examples of lower level genre performances highlight that many students
apply force-dynamic reasoning, folk inquiry, and heuristic approaches to explain-
ing, arguing, and predicting. Similarly, examples of intermediate level genre perfor-
mances show that, as students learn the language of science, they may struggle with
practices such as invoking scientific models to develop explanations and aligning
those scientific models with quantitative data and evidence to critique arguments
or make predictions.
which words students use and more with how they use them. For example, an
assessment response that refers to “sugar” in cellular respiration without making
a matter-to-energy conversion would be coded at a higher discourse level than
a response that uses the scientific term “glucose” but then indicates that glucose
can be converted into energy.
Classification activities provide one example of how to connect meaning in
vocabulary work with EL students (DeLuca, 2010). For example, we have found
categorizing examples of organic versus inorganic materials to be very challenging
for students. Organic in science class means something quite different from organic
at the grocery store. Sorting and discussing classifications in both vernacular and
science language discourses can help students master important distinctions and
build multilingual discourse proficiency.
In general, vocabulary teaching and assessment with EL students should focus
on making meaningful connections. School science has the potential to serve as a
bridge rather than a dead end between home and science discourses if it helps stu-
dents connect as well as differentiate meanings across vocabulary from their home
languages and Tier 1, 2, and 3 words in English (DeLuca, 2010). Strategies includ-
ing using English to home language vocabulary cognates and connecting meanings
across vocabulary Tiers 1, 2, and 3 can support EL students in learning to participate
in the social language of science (DeLuca, 2010; Goldenberg, 2013).
arguments, and predictions and they continue to use those genres in appropriate
situations. But they also master scientific genres and learn when and how these
genres can be powerful and effective. EL students face the added challenge of learn-
ing two new languages in science class (a new national language and the new social
language or discourse of science). Like English-speaking students, EL students also
need to learn to decide when the social language of science, as opposed to a ver-
nacular language, is called for.
Learning progression-based curricula with associated formative assessment
resources can support teachers in leveraging student learning. Learning progression
frameworks distinguish discourses across levels of achievement. Formative
assessment resources associated with these frameworks can help teachers identify
how their students are making sense of the world and what next instructional steps
are appropriate to help students develop more sophisticated knowledge and prac-
tice. Learning progression-based formative assessments are designed to be accessible
to students and to facilitate categorization of student responses within the levels
of a learning progression framework. These assessments can help teachers identify
where their students are along learning progression levels of discourse, and associ-
ated curricular materials can help teachers respond to the levels of discourse that
their students bring to science class.
Our own learning progression research concerning carbon transforming
processes has informed a curricular program called Carbon: Transformations in
Matter and Energy (Carbon TIME). The Carbon TIME program includes teach-
ing units, an online assessment system, teacher professional development, and
local teacher networks. The teaching units and assessment system are publicly
available at carbontime.bscs.org. The units, designed for middle and high school
science classes, focus on processes that transform matter and energy in organ-
isms, ecosystems, and global systems including combustion, photosynthesis, cel-
lular respiration, digestion, and biosynthesis. Several teachers of EL students have
adapted the Carbon TIME curriculum to integrate strategies such as use of realia
and firsthand experiences to help students connect language with real-world
phenomena; use of individual student white boards for diagramming, sharing,
and discussing; and provision of textual and graphical supports for writing tasks
and assessments.
Carbon TIME and other programs like it illustrate the power of attending to
language and discourse in science education assessment and instruction. Science
is a subculture with an associated language and a defined notion of literacy. Thus,
science teachers of both English-speaking and EL students may be thought of as
language teachers tasked with helping students develop the capacity to create and
critically evaluate scientific genres such as explanation, argument, and prediction
both in the classroom and, importantly, beyond.
Note
1 While we consider implications for EL students, the research was not conducted with
an a priori focus on EL students, and thus first language status was not collected with the
reported data.
References
Bakhtin, M. M. (1981). The dialogic imagination: Four essays (M. Holquist, Ed.; C. Emerson
& M. Holquist, Trans.). Austin, TX: University of Texas Press.
Bloome, D., Puro, P., & Theodorou, E. (1989). Procedural display and classroom lessons. Cur-
riculum Inquiry, 19(3), 265–291.
Braaten, M., & Windschitl, M. (2011). Working toward a stronger conceptualization of scien-
tific explanation for science education. Science Education, 95(4), 639–669.
Bruna, K. R., Vann, R., & Escudero, M. P. (2007). What’s language got to do with it? A case
study of academic language instruction in a high school “English Learner Science” class.
Journal of English for Academic Purposes, 6(1), 36–54.
Covitt, B., Dauer, J., & Anderson, C. W. (2017). The role of practices in scientific literacy. In C.
Schwarz, C. Passmore, & B. Reiser (Eds.), Helping students make sense of the world using next
generation science and engineering practices (pp. 59–83). Arlington, VA: NSTA Press.
Dauer, J. M., Doherty, J. H., Freed, A. L., & Anderson, C. W. (2014). Connections between
student explanations and arguments from evidence about plant growth. CBE-Life Sciences
Education, 13(3), 397–409.
DeLuca, E. (2010). Unlocking academic vocabulary lessons from an ESOL teacher. The Sci-
ence Teacher, 77(3), 27–32.
Fang, Z. (2004). Scientific literacy: A systemic functional linguistics perspective. Science Educa-
tion, 89(2), 335–347.
Fang, Z., Schleppegrell, M. J., & Moore, J. (2014). The linguistic challenge of learning across
academic disciplines. In C. A. Stone, E. R. Silliman, B. J. Ehren, & G. P. Wallach (Eds.),
Handbook of language and literacy: Development and disorders (2nd ed., pp. 302–322). New
York: Guilford Press.
Gee, J. (1991). What is literacy? In C. Mitchell & K. Weiler (Eds.), Rewriting literacy: Culture
and the discourse of the other (pp. 3–11). Westport, CT: Bergin & Garvey.
Goldenberg, C. (2013). Unlocking the research on English learners: What we know—and
don’t yet know—about effective instruction. American Educator, 37(2), 4–11, 38.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to
moral judgment. Psychological Review, 108(4), 814–834.
Hakuta, K., Santos, M., & Fang, Z. (2013). Challenges and opportunities for language learn-
ing in the context of the CCSS and the NGSS. Journal of Adolescent & Adult Literacy, 56(6),
451–454.
Jiménez-Aleixandre, M. P., Rodríguez, A. B., & Duschl, R. A. (2000). “Doing the lesson”
or “doing science”: Argument in high school genetics. Science Education, 84(6), 757–792.
Jin, H., & Anderson, C. W. (2012). A learning progression for energy in socio-ecological
systems. Journal of Research in Science Teaching, 49(9), 1149–1180.
Jin, H., Mehl, C. E., & Lan, D. H. (2015). Developing an analytical framework for argu-
mentation on energy consumption issues. Journal of Research in Science Teaching, 52(8),
1132–1162.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Lee, O., & Buxton, C. A. (2013). Integrating science and English proficiency for English
language learners. Theory into Practice, 52(1), 36–42.
Mohan, L., Chen, J., & Anderson, C. W. (2009). Developing a multi-year learning progres-
sion for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching,
46(6), 675–698.
National Research Council. (1996). National science education standards. Washington, DC: The
National Academies Press.
National Research Council. (2007). Taking science to school: Learning and teaching science in
grades K-8. Washington, DC: The National Academies Press.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting
concepts, and core ideas. Washington, DC: The National Academies Press.
NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. Washington,
DC: The National Academies Press.
Parker, J., Covitt, B., Lee, M., & Anderson, C. (2016). A learning progression for students’
interpretation of earth systems data. Conference of the National Association for Research in
Science Teaching, Baltimore, MD.
Pinker, S. (2007). The stuff of thought: Language as a window into human nature. New York:
Viking.
Quinn, H., Lee, O., & Valdés, G. (2012). Language demands and opportunities in relation to
Next Generation Science Standards for English language learners: What teachers need
to know. Commissioned Papers on Language and Literacy Issues in the Common Core State
Standards and Next Generation Science Standards, 94, 32–43.
Rath, A., & Brown, D. E. (1996). Modes of engagement in science inquiry: A microanalysis of
elementary students’ orientations toward phenomena at a summer science camp. Journal
of Research in Science Teaching, 33(10), 1083–1097.
Schauble, L., Klopfer, L. E., & Raghavan, K. (1991). Students’ transition from an engineer-
ing model to a science model of experimentation. Journal of Research in Science Teaching,
28(9), 859–882.
Silver, N. (2012). The signal and the noise: Why so many predictions fail, but some don’t. New York:
Penguin Books.
Talmy, L. (1988). Force dynamics in language and cognition. Cognitive Science, 12(1), 49–100.
Toulmin, S. (2003). The uses of argument. Cambridge: Cambridge University Press.
Warren, B., & Rosebery, A. (2008). Using everyday experience to teach science. In A. Rosebery
& B. Warren (Eds.), Teaching science to English language learners (pp. 39–50). Arlington,
VA: NSTA Press.
Wertsch, J. V. (1991). Voices of the mind: A sociocultural approach to mediated action. Cambridge,
MA: Harvard University Press.
Wertsch, J.V. (1994). The primacy of mediated action in sociocultural studies. Mind, Culture,
and Activity, 1(4), 202–208.
Windschitl, M. (2004). Folk theories of “inquiry”: How preservice teachers reproduce the
discourse and practices of an atheoretical scientific method. Journal of Research in Science
Teaching, 41(5), 481–512.
Yore, L. D., & Treagust, D. F. (2006). Current realities and future possibilities: Language and
science literacy—empowering research and informing instruction. International Journal of
Science Education, 28(2–3), 291–314.
APPENDIX
Focus Points
• A strong developmental theory of learning in a particular domain or a learning
progression is largely missing in existing programs that integrate science and
language/literacy.
• The lack of a learning progression leads to incoherence among curriculum, instruction, and assessment, so that assessments fall short of their potential to generate feedback to teachers, students, and decision makers about the state of student learning and about ways to improve teaching and learning.
• We describe one particular approach to the construction and empirical validation of a learning progression. This approach is built on four core principles: (a) a developmental perspective on student learning, (b) a match between instruction and assessment, (c) management by teachers, and (d) assessments that uphold high-quality standards of reliability and validity.
• The learning system developed using our approach will achieve tight align-
ment among curriculum, instruction, and assessment. Assessment results will
inform teachers and students about English learner (EL) students’ progress
against multiple curricular goals in both science and language/literacy domains.
• Additionally, the results will reveal necessary changes in the curriculum,
instruction, and underlying developmental theories, which will better support
EL students’ engagement and learning in science while simultaneously facili-
tating their language development. Further, our approach can support investi-
gation of which complex relationships among science and language learning
targets may be unique to and among EL students.
Chapter Purpose
In this chapter, we argue that the critical feature of a learning progression—a strong
developmental theory of learning in a particular domain—is largely missing in
232 Mark Wilson and Yukie Toyama
use classroom assessments for which little evidence about validity or reliability exists
yet which are potentially more informative for making instructional decisions.
Even though classroom and large-scale assessments need not be mutually exclu-
sive tools, it is extremely rare to find valid and reliable tests that can be linked
directly to classroom practices and instructional activities. We will discuss our
approach to develop a coordinated assessment system with robust validity and relia-
bility evidence, which enables the linkage between formative and summative assess-
ments. First, we turn to major programs that integrate science and literacy/language
instruction to facilitate learning in both domains as fertile contexts to reflect on
relationships among curriculum, instruction, and assessment.
the types of assessments found in these integration programs² (see Appendix 12A
for the full list of the assessment issues found in the review). Our purpose here
is to show that while these integration programs provide engaging instructional
approaches, curriculum materials, and/or teacher professional development oppor-
tunities, their assessments, for either formative or summative purposes, are rather
weak, in terms of (a) their alignment with the curriculum and instruction and (b)
their capacity to serve as the basis for valid inferences about students' current levels
of understanding and about appropriate next steps.
The most common assessment tools we found in the studies of the six programs
are state standardized tests or commercially available norm-referenced tests
(e.g., Greenleaf, Hanson, & Schneider, 2011; Llosa et al., 2016; Romance & Vitale,
1992). These assessments are considered to be "distal" from what students have
been learning with the enacted curriculum (Ruiz-Primo, Shavelson, Hamilton, &
Klein, 2002; Sussman, 2016). While they may be useful to get a general sense of student
achievement, they seldom provide meaningful formative feedback to students,
teachers, and program designers. For example, an earlier efficacy study of Reading
Apprenticeship Improving Secondary Education (Greenleaf et al., 2011), a pro-
gram integrating disciplinary reading into content area instruction, used three state
standardized tests from the California Standardized Tests program. The researchers
found statistically significant treatment effects in the respective three areas tested
(i.e., English language arts, reading comprehension, and biology), although they
admitted that these tests were not well aligned with their program: The reading
comprehension items were largely about literature rather than science, and biology
items focused mostly on factual recall, which involved very little reading. Clearly,
these assessments generate little diagnostic information that guides teachers and
students toward productive pathways for specific learning goals that the program
emphasizes, although it is possible that they provided evidence of a program’s effi-
cacy in increasing student achievement in a distal learning area.
Another popular type of assessment is the researcher-developed measure. Some of
these assessments are put together using publicly available items. For example, in
the efficacy studies of Promoting Science Among English Learners (Llosa et al.,
2016; Lee, Deaktor, Hart, Cuevas, & Enders, 2005), science tests were developed
using public release items from the National Assessment of Educational Progress
(NAEP) and Trends in International Mathematics and Science Study (TIMSS).
Similarly, science unit tests were put together using released items from the state
tests as well as the textbook publisher’s item bank in Quality English and Science
Teaching (August et al., 2014; August, Branum-Martin, Cardenas-Hagan, & Francis,
2009), a program that provides supplementary materials and professional develop-
ment for science curriculum to specifically address EL students’ language develop-
ment needs.
The majority of these publicly released items are multiple-choice items, which
are unlikely to capture complex cognitive processes and language functions involved
in the science practices that the integration programs typically target. Nor were
these items developed with specific consideration for EL students’ developmental
learning trajectories. Further, very few details are available about the development
process or the technical qualities of these researcher-developed assessments.
We conjecture that the alignment between these assessments, on the one hand, and
the curriculum and instruction, on the other, is achieved only at a broad topic level,
which is not sufficient to generate informative feedback to move learning forward.
Some researchers developed their own assessments from scratch, aiming to
achieve tighter alignment among the assessments, curriculum, and instruction. For
example, in Concept Oriented Reading Instruction (CORI; www.cori.umd.edu/;
Guthrie, Anderson, Alao, & Rinehart, 1999; Guthrie et al., 2004), a program that
promotes reading engagement through hands-on science activities for elementary
and middle school students, Guthrie and his colleagues developed assessments that
are closely aligned with CORI’s instructional framework as well as with its under-
lying learning and motivational theories. CORI’s most prominent assessment is
a weeklong performance assessment, which involves reading of multiple texts on
a science topic, such as how crabs and turtles live and grow. For this assessment,
students activate their prior knowledge through writing and drawing, write down
questions they may have for the topic, take notes on a reading log as they read, and,
finally, represent what they have learned from the texts by drawing and writing.
These assessment tasks are so tightly aligned with CORI’s instructional activities
that students would not distinguish the former from the latter. Additionally, because
student motivation is seen as a key element for improving student reading achieve-
ment, a student survey tool was developed to measure this construct (Wigfield &
Guthrie, 1997).
While CORI's performance assessment tasks appear seamless with instruction
and curriculum, and its student survey captures student motivation (the key
construct in the program's theory of change), CORI's performance assessment and
the student questionnaire are offered only as summative tools for teachers. No
matching tools seem to be available for formative assessment, apart from a casual
suggestion that students create portfolios of their work and that teachers use informal
assessment practices such as oral and written reports, student observations,
and discussions (Swan, 2003). However, the description of these practices is short
and generic, with no specific information about what students' emerging ideas or
misconceptions might look like in a particular unit and what the teacher should do
to help student learning in both reading and science. Equally absent is a common
framework that provides developmental coherence between the summative assess-
ments and suggested formative practices.
Seeds of Science/Roots of Reading (Seeds/Roots, Cervetti et al., 2012) is
another science/literacy integrated program, which offers a comprehensive set
of researcher-developed assessment tools for formative and summative purposes.
Designed for grades 2–5, Seeds/Roots provides 12 science units in which students
learn about important science concepts as well as science practices through first-
hand inquiry, content-rich informational texts, and scientific discourses with peers
and through writing.
Seeds/Roots's formative assessments take various forms, including the teacher's
informal actions during learning activities, such as "quick checks for understanding"
(e.g., attending to students' use of target academic words during group work).
Additionally, the program offers more formal formative assessments called “embed-
ded assessments” that require students to produce oral or written work as part of
learning activities, and their work products are then evaluated with rubrics provided
in the curriculum. As for summative purposes, Seeds/Roots offers pre-post
Assessment in Science and Literacy Curricula 237
measures in four broad areas: (a) science (concepts, inquiry, and nature of practices);
(b) integrated science/literacy (science vocabulary and writing); (c) literacy (reading
comprehension); and (d) attitudes toward science. Teachers can select a subset of
these assessments from a booklet to meet their curricular goals and needs.
While the Seeds/Roots assessment system’s coverage is impressive, we neverthe-
less see the need for learning progressions to be elaborated for the curriculum’s
multiple learning goals and to serve as anchors connecting various assessments. We
do see an emerging articulation of learning progressions in the form of the rubrics
offered in the Seeds/Roots assessment system. However, these rubrics are currently
used rather peripherally, only with the open-ended items in the summative
assessments and the embedded assessments. Other forms of student responses, such
as those given to multiple-choice questions or those informally elicited through
teachers' questioning during a lesson, are not analyzed with the rubrics.
Therefore, these rubrics have not assumed the central role that learning progressions
would play in driving instruction and assessment: enabling the identification of a
student's current level of proficiency in relation to what the student has achieved
as well as what he or she should strive for next. This is seen as critical, especially in
supporting a teacher's formative assessment practices.
Indeed, based on a qualitative study of the Seeds/Roots curriculum implementation
involving six Norwegian teachers, Haug and Ødegaard (2015) argue that "the
teaching material, including the embedded assessment, is necessary but not sufficient"
(p. 650) for teachers to make meaningful use of student responses from the
variety of Seeds/Roots assessments. A fully developed learning progression, which
we will discuss in the next section, would be the core tool to design the professional
development and to guide teachers’ daily practices to fill this gap.
The second potential area for improvement is the item format. The vast major-
ity of Seeds/Roots summative assessment items and those of the other integration
programs that use publicly available items (e.g., Greenleaf et al., 2011; Lee et al.,
2005; Llosa et al., 2016) follow the multiple-choice format with a single correct
answer choice. This item format has long been criticized for its narrow focus on
factual recall, limiting student responses, and potential mismatch with and nega-
tive impacts on instruction (e.g., Shavelson, Baxter, Pine,Yure, Goldman, & Smith,
1991; Cizek, 2005). In reading comprehension assessments, research has shown that
multiple-choice items tap into students’ test-taking strategies rather than compre-
hension processes (Farr, Pritchard, & Smitten, 1990; Rupp, Ferne, & Choi, 2006).
This design choice casts doubt on whether the assessments can capture the sense-making
processes or scientific discourse practices that the curriculum emphasizes.
This is not to say that the summative assessment items were developed without
attending to any design principles—quite the contrary. Seeds/Roots's multiple-choice
items were written to particular item types, such as interrelatedness vocabulary
items or literal-recall comprehension questions. However, the scoring procedures
(i.e., simply summing item-level scores into a test-level total) do not allow one to
make inferences about students' ability at the item-group level or at the item level.
We will discuss how our alternative approach provides a better framework for
interpreting scores in the next section.
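The contrast the preceding paragraphs draw, between a single total score and inferences at the item-group level, can be sketched in code. This is a hypothetical illustration: the item-to-construct tagging and the response data are invented, not drawn from Seeds/Roots.

```python
# Hypothetical illustration: the same responses reported as a single total
# score versus as item-group (construct-linked) subscores.

# Each item is tagged with the construct it was written to measure.
item_constructs = {
    "q1": "vocabulary", "q2": "vocabulary",
    "q3": "comprehension", "q4": "comprehension",
    "q5": "inquiry", "q6": "inquiry",
}

# One student's scored responses (1 = correct, 0 = incorrect).
responses = {"q1": 1, "q2": 1, "q3": 0, "q4": 0, "q5": 1, "q6": 0}

# A single total score collapses everything into one number ...
total = sum(responses.values())

# ... whereas grouping by construct preserves a profile a teacher can act on.
profile = {}
for item, score in responses.items():
    construct = item_constructs[item]
    profile[construct] = profile.get(construct, 0) + score

print(total)    # 3
print(profile)  # {'vocabulary': 2, 'comprehension': 0, 'inquiry': 1}
```

A bare total of 3 out of 6 conceals that this student's difficulty is concentrated in one area; grouping scores by the construct each item was written to measure is what makes the results interpretable against construct maps.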
To summarize, we overlay our observations of the six integration programs’
assessments on the traditional curriculum-instruction-assessment triangle (see
Figure 12.1A). Typically, these programs provide rich descriptions of synergies that
exist between science inquiry and language/literacy development for attaining
learning goals in both areas and of specific instructional activities that facilitate the
synergistic learning toward the goals. In this sense, there is a strong tie between the
curriculum (i.e., what needs to be learned) and the instruction (i.e., how to achieve
the goals). This is represented as the arrow between curriculum and instruction in
Figure 12.1A. Additionally, assessments in these programs play their traditional role:
they are used to assess how well students have learned from the instruction (the
arrow from instruction to assessment) and hence how much of the curriculum they
have learned (the arrow from assessment to curriculum).
However, these two linkages, one between curriculum and assessment and the
other between assessment and instruction, tend to be weak in the integration programs
we reviewed. As we noted, many of the programs use readily available standardized
tests in their published studies. These tests are not very well aligned with the
particular curriculum goals and the related instructional activities. Some programs
developed their own assessments, often with the publicly released items, the major-
ity of which are multiple-choice items. While these researcher-developed assess-
ments may achieve alignment with the curriculum at a broad topical level, they
are unlikely to capture higher order thinking skills, complex knowledge, and/or
language functions unique to science practices, which the curricula emphasize. In
fact, responses for multiple-choice items typically indicate a dichotomy between got
it right and got it wrong, with little capacity to generate information about what may
lie in between the two extremes and how an intermediate understanding can be
turned into more complete understanding. Further, the total scores or composite
scores typically generated from either form of the assessments lack an interpretive
framework that provides useful evidence of, and feedback for, student learning.
Lastly, the existing integration programs lack a strong developmental theory of
learning, as indicated by Figure 12.1A.
Figure 12.1B (Black et al., 2011) represents an alternative approach to the learn-
ing triangle in which a developmental theory of learning drives the design of cur-
riculum, instruction, and assessment. The first step in this approach involves the
FIGURES 12.1A & 12.1B The Learning Triangle Common Among the Integration Programs (A) and An Alternative Approach to the Learning Triangle (B).
Adapted from “Road Maps for Learning: A Guide to the Navigation of Learning Progressions,” by
Black, Wilson, & Yao, 2011, Measurement: Interdisciplinary Research & Perspective, 9, pp. 80–81.
focus on EL students and their challenges in learning science while learning the
language. In what follows, we will discuss, to the extent possible, how our approach
can be extended to address EL students’ needs.
In the BEAR Assessment System, these four principles are concretely expressed
as the four building blocks shown in Figure 12.2. These blocks serve as tools to
construct robust assessments, which can generate both formative and summative
information in varied contexts (for details, see Wilson, 2005). The instrument
development process is iterative, meaning that one is likely to move through
all four building blocks several times in the process of designing an assessment.
Additionally, this process is neither linear nor strictly sequential. The developer
typically moves back and forth among the building blocks, especially among
the first three, as indicated by a dotted inner loop (Berson, Yao, Cho, & Wilson,
2010) in Figure 12.2.
FIGURE 12.2 The Principles and Building Blocks of the BEAR Assessment System. [The figure shows the four building blocks (construct map, items design, outcome space, measurement model) surrounded by the four principles: a developmental perspective on learning, a match between instruction and assessment, management by teachers, and evidence of high quality.]
could tailor the last two constructs to specifically address necessary language and
literacy skills (e.g., science registers, scientific argumentation), specifically for EL
students to succeed in the curriculum. Further, alternate or multiple pathways may
need to be established to represent EL students’ development of proficiency toward
curricular goals as well as their ways of expressing understanding, as their develop-
ment may be different from that of non-EL students (Sato et al., 2012).
Figure 12.3 shows a construct map for the Using Evidence construct. The map
shows anchor points of increasing sophistication in using evidence as students
progress through the curriculum. Importantly, the construct map defines a single
continuum, as indicated by an arrow in Figure 12.3, along which students may be
arrayed from a low point of sophistication through successive, qualitatively distinct,
ordered points to the most sophisticated point at the top (Wilson, 2005). The construct map also helps to elaborate
observable behaviors associated with each anchor point as well as to design items
that elicit such behaviors.
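The idea of a construct map as a single ordered continuum of qualitatively distinct levels can be sketched as a simple data structure. The level labels below are invented for illustration (loosely echoing a using-evidence construct) and are not taken from the IEY materials.

```python
# A construct map as an ordered set of qualitatively distinct levels.
# Level labels here are hypothetical; a real map would come from the
# curriculum's underlying theory of learning.
CONSTRUCT_MAP = [
    (0, "No use of evidence"),
    (1, "Evidence cited but not linked to a claim"),
    (2, "Evidence linked to a claim without reasoning"),
    (3, "Evidence linked to a claim with explicit reasoning"),
    (4, "Weighs competing evidence across sources"),
]

def describe_level(level: int) -> str:
    """Return the observable behavior associated with an anchor point."""
    for value, label in CONSTRUCT_MAP:
        if value == level:
            return label
    raise ValueError(f"Level {level} is not on this construct map")

print(describe_level(3))  # Evidence linked to a claim with explicit reasoning
```

The single ordered list is the point: every student response, however elicited, is interpreted as evidence of location on the same continuum, which is what lets items, rubrics, and reports cohere.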
Typically, a set of construct maps is needed to represent the curriculum. This
would be particularly true for science and language/literacy integrated programs,
which seek to achieve multiple learning goals in the two or more domains. In
FIGURE 12.3 A Sketch of the Construct Map for the Using Evidence Construct of the Issues, Evidence, and You (IEY) Curriculum for Middle School Students
such a context, some considerations should be taken into account. First, a set of
construct maps is not equivalent to a curriculum; it is always less, in the sense that
it serves as a summary of the curriculum. Moreover, it is not a summary of the
content of the curriculum so much as a summary of the intended effects of the
curriculum. These effects must have certain characteristics: They must be impor-
tant enough to warrant assessment on a regular basis (essentially a curriculum or
instructional decision); they must be few enough for a teacher to keep track of
easily; and they must be organized into a particular sort of summary—a devel-
opmental perspective (which can be based on research as well as on professional
judgments; Wilson, 2009).
instructional activities in terms of content and cognitive processes. Thus the assess-
ment does not fall into a trap of targeting what is easy to assess through multiple-
choice questions. Indeed, all IEY assessment tasks for the embedded assessments are
open ended, requiring students to fully explain their responses.
Figure 12.4 shows an example of an IEY assessment prompt, which is related to
the Evidence and Trade-offs construct. The prompt is taken from Activity 12: “The
Peru Story.” It is a typical assessment task in that it requires students to integrate
information from readings they did in previous lessons and provide reasoning. In
addition to these embedded assessments, the IEY assessment system offers "Link
Tests,” which are given at major transition points to provide snapshots of student
learning. They can be used both summatively and formatively. Each item is linked to
at least one IEY construct, but, unlike the Peru story assessment task, a link-test item
is not embedded in the curriculum. To increase efficiency in scoring, some multiple-choice
items are used for the Link Tests, as long as they map to one of the
five IEY constructs, which ensures coherence in the assessment system. This mapping
helps generate useful information from students’ responses about where students
may be on the construct map.
You are a public health official who works in the Water Department. Your supervisor
has asked you to respond to the public’s concern about water chlorination at the next
City Council meeting. Prepare a written response explaining the issues raised in the
newspaper articles. Be sure to discuss the advantages and disadvantages of chlorinating
drinking water in your response, and then explain your recommendation about
whether the water should be chlorinated.
FIGURE 12.4 An Assessment Task Embedded in the Issues, Evidence, and You (IEY) Curriculum for Middle School Students
FIGURE 12.5 Scoring Guide for the Evidence and Trade-Offs Construct and Exemplary Response for Score 3 by a Middle School Student.
Adapted from "From Principles to Practice: An Embedded Assessment System," by Wilson and Sloane, 2000, Applied Measurement in Education, 13, pp. 193–194.
The principle underlying this third building block is that the assessment system
must be managed by teachers. Two broad issues are involved in this principle. First,
it is teachers who will use the assessment information to inform and guide the
teaching and learning process. To do this fully, teachers must be involved in collect-
ing student work and be able to score and use the results immediately, rather than
waiting for scores generated by a third party. Teachers also must be able to interpret
the results in instructional terms and offer necessary accommodations for EL stu-
dents during the assessments.
Second, issues of teacher professionalism and teacher accountability demand that
teachers play a more central and active role in collecting and interpreting evidence of
student progress and performance (Tucker, 1991). If they are to be held accountable for
their students’ performance, teachers need a good understanding of what students are
expected to learn and of what counts as adequate evidence of student learning. They
are then in a more central position for presenting, explaining, and defending their
students’ performances and the “outcomes” of their instruction.
In the BEAR Assessment System, we organize moderation sessions in which teachers
discuss with one another how they score individual student work. These sessions
often improve consistency across teacher-scorers and help each teacher become
more consistent in his or her own scoring. The process also helps teachers internalize
the developmental goals of the curriculum as they examine student performance
in light of the construct map and discuss with other teachers the implications of
the assessment for their instruction. Indeed, many teachers we have worked with
told us that the moderation sessions were the best professional development they
had participated in: the sessions offered opportunities to closely examine their own
students' responses in relation to the construct levels and to compare them with the
responses of other students, deepening their understanding of how to tailor
instruction to each student's needs. Undoubtedly, the moderation sessions were one
of the critical components of the IEY program that contributed to the creation of
a teacher professional learning community, which in turn brought desired changes
in classroom practices in line with IEY's learning goals (Roberts & Wilson, 1998).
The idea of scoring guides is not new. “Rubrics” are often used to score essays
in both large-scale and teacher-generated assessments. However, these “rubrics” are
often written as item-specific, without any grounding in a more general underly-
ing structure that is tied to a theory of student learning. We believe that the BEAR
scoring guides not only increased coherence between the theorized ideas of the
construct and the interpretation of actual student responses but also increased effi-
ciency in scoring, as the scorers do not have to learn a completely new set of cri-
teria for each task.
Scoring guides can be useful even for multiple-choice items, both when developing
the item itself and when developing the answer choices. For example, one can write
answer choices that represent various anchoring points on the construct map and
appeal to students whose understanding is at a particular point. Indeed, ordered
multiple-choice items (Briggs, Alonzo, Schwab, & Wilson, 2006) capitalize on
this idea, generating useful information about student understanding even from
"wrong" answer choices, when properly designed.
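The key move in an ordered multiple-choice item, keying each answer option to an anchoring point on the construct map rather than scoring it simply right or wrong, can be sketched as follows. The item stem, options, and level assignments are invented for illustration.

```python
# Ordered multiple-choice item: each answer option is keyed not just as
# right/wrong but to an anchoring point on the construct map, so even a
# "wrong" choice locates the student's current understanding.
# Item content and level assignments are hypothetical.
item = {
    "stem": "Why does a sealed bottle collapse when the air inside cools?",
    "options": {
        "A": {"text": "Cold 'uses up' the air inside.", "level": 1},
        "B": {"text": "The air molecules themselves shrink.", "level": 2},
        "C": {"text": "Molecules slow down, so inside pressure drops.", "level": 3},
        "D": {"text": "Bottles always collapse over time.", "level": 0},
    },
}

def score_response(item: dict, choice: str) -> int:
    """Map a chosen option to its construct-map level (not just 0/1)."""
    return item["options"][choice]["level"]

# Two different wrong answers yield different diagnostic information.
print(score_response(item, "B"))  # 2
print(score_response(item, "D"))  # 0
```

Under this scoring, two students who both miss the item can land at different points on the construct map, which is precisely the diagnostic information a dichotomous answer key discards.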
FIGURE 12.6 A Progress Map for an Individual Student's Performance on the IEY Designing and Conducting Investigation Construct at Five Time Points. [The map's vertical axis shows five ordered score levels, from bottom to top: Off Target, Incorrect, Incomplete, Correct, and Beyond.]
Reprinted from “The BEAR Assessment System: A Brief Summary for the Classroom Context” by C. A.
Kennedy, 2005, p. 8. Copyright 2005 by Berkeley Evaluation & Assessment Research Center. Reprinted
with permission.
validity evidence are accumulated through (a) the careful development and revi-
sion of the first three building blocks, (b) cognitive interviews and think-alouds,
(c) the match between the construct map and the empirically derived map, and (d)
coordination of the empirical student estimates with other variables with which
they have expected relations. For the IEY assessment system, these sources of valid-
ity evidence have been used to develop a validity argument (Wilson & Sloane,
2000) as well as traditional indexes of reliabilities (for details, see Wilson & Sloane,
2000). Further, field-testing data showed that the IEY assessment system, which
was designed to provide prompt feedback to curriculum and instruction through
tightly aligned assessments, was effective in promoting statistically significant learn-
ing gains (Wilson & Sloane, 2000).
Last but not least, the output from a measurement model can be used to
obtain student and school locations on the construct map that can be interpreted
substantively. This is necessary to ensure the assessment is useful for instruction
validated empirically, in turn, it can guide the structure and sequence of curriculum
and instruction to best facilitate student learning toward the mastery of the targeted
learning goals.
To illustrate the relationship between a learning progression and construct
maps, we will use a visual metaphor that superimposes images of one or more
construct maps on an image of a learning progression. Figure 12.7 illustrates one
of the most straightforward relationships, in which the ideal anchoring points
of a learning progression are those of a single construct map. In the figure, the
anchoring points of a learning progression are portrayed as the successive lay-
ers of the “thought clouds,” representing the successive levels of sophistication
of the student’s thinking or reasoning. The increase in the cloud’s size as it goes
from lower left to upper right is intended to represent increasing sophistica-
tion in student thinking later in the sequence. The vertical rectangle within the
cloud layers is a much-reduced illustration of a generic construct map. As can be
seen, the lines separating the ideal points on the construct map are aligned with
the boundaries of the learning progression clouds. The person in the picture is
someone (e.g., a science or literacy educator, or a researcher interested in science
learning issues for EL students) who is thinking about student progression.
It is important to point out that the learning progression resides in that person's
thoughts, and that it is a hypothesis about the progression of students' thinking
that should be examined empirically.
A learning progression can more appropriately be "mapped out" with more
than one construct map, especially in a context such as science-literacy integrated
instruction, where multiple learning outcomes and their interrelations are of interest.
There are multiple possibilities, and some may be quite complex. Here we
briefly describe two of the possibilities to illustrate how construct maps could serve
as the backbone of a learning progression and how the ideal points of a construct
map can be related to those of a learning progression (see Wilson, 2009, 2012 for
more detailed descriptions, as well as examples of other relationships between
construct maps and a learning progression).
Figure 12.8A illustrates one possible relationship, where the ideal points of mul-
tiple construct maps all align, at least conceptually, and they, in turn, constitute
the anchoring points of a learning progression. Figure 12.8B shows a more com-
plex situation where the attainment of a certain level of a requirement construct,
say proficiency in explanation discourse, is a precursor to the attainment of a correct
and complete level of the target construct, say proportional reasoning (note the two
construct examples are drawn from Chapter 8 of this volume, which examined
the relationship between two companion learning progressions in the context of
formative assessment of mathematics and language). We have been exploring ways
to model such linked constructs within a learning progression with a family of
statistical models called “structured construct models” (SCMs; see Wilson, 2009,
2016 for details).
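The kind of dependency a structured construct model formalizes, in which reaching a higher point on a target construct presupposes a given point on a requirement construct, can be sketched as a simple consistency check. This is only the constraint logic, not the statistical model itself, and the construct names and thresholds are hypothetical (echoing the explanation-discourse and proportional-reasoning example above).

```python
# A minimal sketch of a linked-construct constraint: a student's estimated
# level on a target construct is admissible only if the prerequisite level
# on a requirement construct has been reached. Names/levels are hypothetical.
PREREQUISITES = {
    # ("target construct", target level) -> ("requirement construct", min level)
    ("proportional_reasoning", 3): ("explanation_discourse", 2),
}

def admissible(estimates: dict, target: str, level: int) -> bool:
    """Check whether claiming `level` on `target` is consistent with the
    hypothesized dependency structure."""
    rule = PREREQUISITES.get((target, level))
    if rule is None:
        return True  # no dependency hypothesized at this anchoring point
    req_construct, min_level = rule
    return estimates.get(req_construct, 0) >= min_level

# This profile claims level 3 on the target while the requirement is unmet.
student = {"proportional_reasoning": 3, "explanation_discourse": 1}
print(admissible(student, "proportional_reasoning", 3))  # False
```

In a full structured construct model such dependencies would be estimated statistically; the sketch merely encodes the hypothesized constraint so that inconsistent level profiles can be flagged for examination.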
An example of linked constructs within a learning progression can be found
in the molecular theory of matter for the middle school level (Black et al., 2011;
Morell, Collier, Black, & Wilson, 2017). Figure 12.9 shows a hypothesized learning
progression for the structure of matter. In this example, each of the boxes can be
thought of as a construct map, but anchoring points within each construct map
are left unspecified in this diagram. Consistent with all other previous examples,
the progression is assumed to move upwards, with simpler understandings at the
bottom and more sophisticated understandings at the top. While this representation
suggests that dependencies exist among constructs, more subtle relationships could
be specified among certain anchoring points of different construct maps, initially as
hypotheses based on the literature and professional judgments among the multiple
stakeholders.
Further, it is possible that some of these constructs may have cyclical relationships,
as many educational ideas, such as "crosscutting themes" in the Next Generation
Science Standards (NGSS Lead States, 2013), are expressed as cycles. Such a
conceptualization opens up a very different and interesting range of possible ways of
thinking about a learning progression. This topic is worth investigating and will be
the subject of further work (Wilson, 2014).
FIGURES 12.8A & 12.8B
More Complex Possible Relationships between a Learning Progression and Construct Maps.
(A) is one possible relationship where the anchor points of the learning progression are the ideal
points of several construct maps. (B) is another possibility, where a link (or dependency) exists at
certain points across different construct maps. Specifically, in the latter, reaching a higher point on
the target construct T requires reaching a preceding point on both T and a requirement construct
R (as indicated by the arrows).
Assessment in Science and Literacy Curricula 253
FIGURE 12.9 A Hypothesized Learning Progression for the Structure of Matter (the visible construct boxes, from top to bottom: 6 Atomic-Molecular Theory of Macro Properties; 5 Macro Evidence for Particulate Structure; 4 Changes of State; 3 Density and Mass & Volume).
when planned for non-EL students (Mislevy & Durán, 2014). Efforts to sort out
this complexity have just begun (e.g., Bailey & Heritage, 2014; Sato et al., 2012),
and much remains to be done, drawing on different expertise from the multiple
stakeholders. In this chapter, we have offered a framework that we hope will guide
such collaborative efforts.
When our suggested approach is taken to develop and validate a learning progres-
sion for all learners, but with special focus on EL students, teachers and researchers
may encounter three different “degrees of complication” in the way that the learning
progression for EL students relates to the “standard” progression. The possibilities are:
1. EL students are essentially on the same learning progression, but they tend to
be at lower anchoring points on the progression(s) than their non-EL peers;
2. EL students are essentially on the same learning progression; however, some
assessment items or a class of items, which are developed based on the learning
progression, behave differently for EL students compared to non-EL students
who have the matching level of ability being measured by the items (this
phenomenon has the generic label “differential item functioning” or DIF; see
Chapter 10 of this volume); and
3. EL students are on a different learning progression from that of non-EL students.
Clearly, (1) is the least complex but might be naïve: it suggests that EL stu-
dents’ progress along the common learning pathway is simply a matter of time and
more intensive support. With regard to (2), researchers have investigated differen-
tial item functioning with respect to EL students, in particular whether construct-
irrelevant language factors in items may prevent EL students from demonstrating
what they know and can do, most notably in large-scale summative assessments
in mathematics (e.g., Mahoney, 2008; see also Chapter 10, this volume). Lastly, (3)
is the most complex, but little research exists investigating this possibility. Despite
the lack of clearly established models of how EL students learn, represent their
knowledge, and develop competence, Sato et al. (2012) have noted that emerging
research suggests that EL students may not follow the hypothesized “more com-
mon” pathways toward learning goals, which are based primarily on research and
models for non-EL students. Given individual differences in educational history,
sociocultural background, and literacy and fluency in their native or home lan-
guage, EL students access, interpret, and engage with academic content differently
(e.g., Solano-Flores & Trumbull, 2003). Adding yet another layer of complexity is
that substantial differences exist among EL students in terms of cognitive, cultural,
and linguistic resources they bring to assessment and learning tasks (Abedi, 2004).
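Possibility (2), differential item functioning, is commonly screened with the Mantel-Haenszel procedure, which compares the item performance of EL and non-EL students who are matched on total test score. A minimal sketch of that computation, using invented response counts:

```python
# Mantel-Haenszel screen for differential item functioning (DIF) on one item.
# Students are matched (stratified) on total test score; counts are invented.

# Each stratum: (EL correct, EL incorrect, non-EL correct, non-EL incorrect)
strata = {
    "low score":  (8, 22, 15, 25),
    "mid score":  (18, 12, 30, 10),
    "high score": (25, 5, 38, 2),
}

num = den = 0.0
for a, b, c, d in strata.values():  # a/b: focal (EL); c/d: reference (non-EL)
    n = a + b + c + d
    num += a * d / n  # weight of evidence the item favors EL students
    den += b * c / n  # weight of evidence the item favors non-EL students

mh_odds_ratio = num / den
# A common odds ratio well below 1 suggests the item is harder for EL students
# than for non-EL students of the same overall ability (a possible DIF flag).
print(round(mh_odds_ratio, 2))  # 0.49
```

With these invented counts the ratio falls well below 1, which is the pattern a construct-irrelevant language factor in the item would produce.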
The implication of these complexities for practice is that teachers need to take
an active role in adapting assessment tasks (and learning activities) to meet the unique
local needs of their students, especially for formative purposes. In particular, teachers
should attend to local and cultural variations in how EL students access, engage with,
and respond to assessment tasks, which generic learning progressions
may miss (Mislevy & Durán, 2014). Teachers should also consider EL students’ edu-
cational histories, including their experiences in science as well as language domains
(both in their native/home language as well as English). The related implication
for research is that teachers’ practice-based knowledge about variations among
EL students needs to be directly tapped and combined with research specifically aimed
at developing alternative learning progressions for and among EL students.
In closing, it is important to note that the alternative approach we have proposed
in this chapter is both time-consuming and resource-intensive, requiring a more
determined effort than has typically been taken in the development of assessments
for curricula. However, we believe it is an invaluable investment because class-
room and summative assessments developed using our approach address a wealth
of important issues, as has been detailed in this chapter. In addition, we believe our
approach could reverse the usual direction of influence, in which summative standardized
assessment drives formative assessment (and hence instruction and learning): instead,
instruction and learning, hand in hand with formative assessment, would give direction
to summative assessments and hence to accountability.
Notes
1 These six programs are: Seeds/Roots, CORI, Reading Apprenticeship, Quality English
and Science Teaching, Promoting Science Among ELL, and Guided Inquiry Supporting
Multiple Literacies.
2 We note that descriptions of the assessments used in the six integration programs were
most readily available in published efficacy studies. The program websites were also
reviewed, although little information on assessment was available.
References
Abedi, J. (2004). The No Child Left Behind Act and English language learners: Assessment
and accountability issues. Educational Researcher, 33(1), 4–14.
American Educational Research Association (AERA), American Psychological Association
(APA), & National Council on Measurement in Education (NCME). (2014). Standards
for educational and psychological testing. Washington, DC: AERA.
August, D., Branum-Martin, L., Cárdenas-Hagan, E., & Francis, D. J. (2009). The impact of an
instructional intervention on the science and language learning of middle grade English
language learners. Journal of Research on Educational Effectiveness, 2, 345–376.
August, D., Branum-Martin, L., Cárdenas-Hagan, E., Francis, D. J., Powell, J., Moore, S., &
Haynes, E. F. (2014). Helping ELLs meet the Common Core State Standards for literacy
in science: The impact of an instructional intervention focused on academic language.
Journal of Research on Educational Effectiveness, 7(1), 54–82.
Bailey, A. L., & Heritage, M. (2014). The role of language learning progressions in improved
instruction and assessment of English language learners. TESOL Quarterly, 48(3),
480–506.
Berson, E., Yao, S., Cho, S., & Wilson, M. (2010, April). The qualitative inner-loop of the BEAR
assessment system. Paper presented at the International Objective Measurement Workshop
(IOMW) in Boulder, CO.
Black, P., Wilson, M., & Yao, S. Y. (2011). Road maps for learning: A guide to the navigation of
learning progressions. Measurement: Interdisciplinary Research & Perspective, 9(2–3), 71–123.
Briggs, D. C., Alonzo, A. C., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with
ordered multiple-choice items. Educational Assessment, 11(1), 33–63.
Brookhart, S. M. (2003). Developing measurement theory for classroom assessment purposes
and uses. Educational Measurement, Issues and Practices, 22, 5–12.
Brown, N. J., Furtak, E. M., Timms, M., Nagashima, S. O., & Wilson, M. (2010). The evi-
dence-based reasoning framework: Assessing scientific reasoning. Educational Assessment,
15(3–4), 123–141.
256 Mark Wilson and Yukie Toyama
Cervetti, G. N., Barber, J., Dorph, R., Pearson, P. D., & Goldschmidt, P. G. (2012). The impact
of an integrated approach to science and literacy in elementary school classrooms. Journal
of Research in Science Teaching, 49(5), 631–658.
Cizek, G. J. (2005). High-stakes testing: Contexts, characteristics, critiques, and consequences.
In R. P. Phelps (Ed.), Defending standardized testing (pp. 23–54). Mahwah, NJ: Lawrence
Erlbaum Associates.
Corcoran, T. B., Mosher, F. A., & Rogat, A. D. (2009). Learning progressions in science: An evi-
dence-based approach to reform (CPRE Report). Philadelphia, PA: Consortium for Policy
Research in Education.
Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). (2007). Taking science to school:
Learning and teaching science in grades K–8. Washington, DC: National Academies Press.
Farr, R., Pritchard, R., & Smitten, B. (1990). A description of what happens when an exami-
nee takes a multiple-choice reading comprehension test. Journal of Educational Measurement,
27(3), 209–226.
Fisher, W. P., & Wilson, M. (2015). Building a productive trading zone in educational assess-
ment research and practice. Pensamiento Educativo: Revista de Investigacion Educacional Lati-
noamericana, 52(2), 55–78.
Greenleaf, C. L., Hanson, T. L., & Schneider, S. A. (2011). Integrating literacy and science in
biology: Teaching and learning impacts of Reading Apprenticeship professional devel-
opment. American Educational Research Journal, 48(3), 647–717.
Guthrie, J. T., Anderson, E., Alao, S., & Rinehart, J. (1999). Influences of concept-oriented
reading instruction on strategy use and conceptual learning from text. The Elementary
School Journal, 99(4), 343–366.
Guthrie, J. T., Wigfield, A., Barbosa, P., Perencevich, K. C., Taboada, A., Davis, M. H., Scafiddi,
N. T., & Tonks, S. (2004). Increasing reading comprehension and engagement through con-
cept-oriented reading instruction. Journal of Educational Psychology, 96, 403–423.
Haug, B. S., & Ødegaard, M. (2015). Formative assessment and teachers’ sensitivity to student
responses. International Journal of Science Education, 37(4), 629–654.
Irribarra, D. T., Freund, R., Fisher, W., & Wilson, M. (2015). Metrological traceability in edu-
cation: A practical online system for measuring and managing middle school mathematics
instruction. Journal of Physics: Conference Series, 588(1), 012042.
Kennedy, A. C. (2005). The BEAR assessment system: A brief summary for the classroom context.
Berkeley Evaluation & Assessment Research (BEAR) Center Report Series. Retrieved
on August 2, 2017, from http://bearcenter.berkeley.edu
Lee, O. (2005). Science education with English language learners: Synthesis and research
agenda. Review of Educational Research, 75(4), 491–530.
Lee, O., Deaktor, R. A., Hart, J. E., Cuevas, P., & Enders, C. (2005). An instructional inter-
vention’s impact on the science and literacy achievement of culturally and linguistically
diverse elementary students. Journal of Research in Science Teaching, 42(8), 857–887.
Lee, O., & Fradd, S. H. (1998). Science for all, including students from non-English language
backgrounds. Educational Researcher, 27(3), 12–21.
Lee, O., Quinn, H., & Valdes, G. (2013). Science and language for English language learners
in relation to Next Generation Science Standards and with implications for Common
Core State Standards for English language arts and mathematics. Educational Researcher,
42(4), 223–233.
Llosa, L., Lee, O., Jiang, F., Haas, A., O’Connor, C., Van Booven, C. D., & Kieffer, M. J. (2016).
Impact of a large-scale science intervention focused on English language learners. Ameri-
can Educational Research Journal, 53(2), 395–424.
Mahoney, K. (2008). Linguistic influences on differential item functioning for second lan-
guage learners on the National Assessment of Educational Progress. International Journal
of Testing, 8(1), 14–33.
Mislevy, R. J., & Durán, R. P. (2014). A sociocognitive perspective on assessing EL students
in the age of Common Core and Next Generation Science Standards. TESOL Quarterly,
48(3), 560–585.
Morell, L., Collier, T., Black, P., & Wilson, M. (2017). A construct-modeling approach to
develop a learning progression of how students understand the structure of matter. Journal
of Research in Science Teaching, 54(8), 1024–1048. doi:10.1002/tea.21397
National Research Council. (2001). Knowing what students know: The science and design of edu-
cational assessment. Committee on the Foundations of Assessment. (Eds. J. Pellegrino, N.
Chudowsky, & R. Glaser). Washington, DC: National Academy Press.
Next Generation Science Standards Lead States. (2013). Next Generation Science Standards: For
states, by states. Washington, DC: National Academies Press.
Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests [Reprinted by
Chicago: University of Chicago Press, 1980].
Roberts, L., & Wilson, M. (1998). Evaluating the effects of an integrated assessment system: Chang-
ing teachers’ practices and improving student achievement in science (BEAR Report Series, SA-98–
92). Berkeley, CA: University of California.
Romance, N. R., & Vitale, M. R. (1992). A curriculum strategy that expands time for in-
depth elementary science instruction using science-based reading strategies: Effects of a
year-long study in grade four. Journal of Research in Science Teaching, 29(6), 545–554.
Ruiz-Primo, M. A., Shavelson, R. J., Hamilton, L., & Klein, S. (2002). On the evaluation
of systemic science education reform: Searching for instructional sensitivity. Journal of
Research in Science Teaching, 39(5), 369–393.
Rupp, A. A., Ferne, T., & Choi, H. (2006). How assessing reading comprehension with mul-
tiple-choice questions shapes the construct: A cognitive processing perspective. Language
Testing, 23(4), 441–474.
Sato, E., Nagle, K., Cameto, R., Sheinker, A., Lehr, D., Harayama, N., Cook, H. G., & Whet-
stone, P. (2012). Understanding learning progressions and learning maps to inform the develop-
ment of assessment for students in special populations. Symposium 2011 Topic 2 White Paper.
Menlo Park, CA/Lawrence, KS: SRI International and Center for Educational Testing
and Evaluation.
Shavelson, R. J., Baxter, G. P., Pine, J., Yuré, J., Goldman, S. R., & Smith, B. (1991). Alternative
technologies for large scale science assessment: Instrument of education reform 1. School
Effectiveness and School Improvement, 2(2), 97–114.
Solano-Flores, G., & Trumbull, E. (2003). Examining language in context: The need for new
research and practice paradigms in the testing of English language learners. Educational
Researcher, 32(2), 3–13.
Stoddart, T., Pinal, A., Latzke, M., & Canaday, D. (2002). Integrating inquiry science and
language development for English language learners. Journal of Research in Science Teaching,
39(8), 664–687.
Sussman, J. (2016). Standardized tests as outcome measures for evaluating instructional interventions
in mathematics and science (Unpublished doctoral dissertation). University of California,
Berkeley, Berkeley, CA.
Swan, E. A. (2003). Concept-oriented reading instruction: Engaging classrooms, lifelong learners. New
York: Guilford Press.
Tucker, M. (1991). Why assessment is now issue number one. In G. Kulm & S. Malcom (Eds.),
Science assessment in the service of reform (pp. 3–16). Washington, DC: American Association
for the Advancement of Science.
Wellman, R. T. (1978). Science: A basic for language and reading development. In M. B.
Rowe (Ed.), What research says to the science teacher (Vol. 1). Washington, DC: National
Science Teachers Association.
Wigfield, A., & Guthrie, J. T. (1997). Relations of children’s motivation for reading to the
amount and breadth of their reading. Journal of Educational Psychology, 89, 420–432.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ:
Erlbaum.
Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning pro-
gression. Journal of Research in Science Teaching, 46(6), 716–730.
APPENDIX 12.1
Assessment Issues Identified in Science-Literacy Integration Programs
(Organized by the Four Building Blocks)
Issue: Constructs are rarely defined as having qualitatively distinct successive levels, constituting a developmental continuum (instead, typically only the content of learning is described, without the pathways that may unfold over the course of instruction).
Instances:
• Evident in almost all science and literacy integrated curricula.
• CORI’s passage comprehension measure draws on the novice-expert paradigm rather than the developmental paradigm: It compares a student’s coherence score, based on ratings of relatedness of pairs of keywords from a passage, against that of an expert via the PathFinder algorithm.
• Seeds/Roots provides a unique perspective for how definitional vocabulary knowledge develops into a semantic and schematic conceptual knowledge. However, this idea did not get explicated enough to guide the assessment development.

Building Block 2: Items design/match between assessment and instruction

Issue: Assessments target easy-to-assess knowledge and skills with multiple-choice or fill-in-the-blank format, which do not seem to be well aligned with the curriculum goals and instruction.
Instances:
• Seeds/Roots’ summative assessments largely rely on traditional formats (multiple-choice, fill-in-the-blank).
• In contrast, CORI uses an elaborate performance assessment in which students read multiple passages on a related topic, generate questions, take notes, and explain their understanding by drawing and writing, which are all well aligned with the curriculum.

Issue: Impact/efficacy studies rely on distal measures, such as state standardized tests or researcher-developed measures with publicly released large-scale assessment items (e.g., TIMSS, NAEP), without establishing a clear alignment among the assessments, curriculum, and instruction.
Instances:
• QuEST2 (August et al., 2014) used a norm-referenced test (GRADE) and two researcher-developed unit tests with very traditional item formats, all of which do not seem to be well aligned with the goals and core characteristics of the curriculum.
• An earlier study of Reading Apprenticeship (Greenleaf et al., 2011) exclusively used standardized tests (we note that Greenleaf et al. admit the limitation of their distal measures).

Issue: While the curriculum addresses a broad range of learning goals (e.g., science content and science practices/inquiry, or various literacy outcomes such as speaking, reading, writing, academic vocabulary), assessments in impact studies narrowly focus on a handful of outcomes without justification.
Instances:
• P-SELL (Llosa et al., 2016) aimed at promoting both science inquiry and language development for EL students, but only science outcomes were assessed, using a researcher-developed measure with public release items from large-scale assessment programs (e.g., NAEP, TIMSS, state standardized tests).
• An earlier version of P-SELL (Lee et al., 2005) did include a literacy assessment. However, among the many literacy and language outcome areas that the program targeted, the assessment narrowly focused on writing, modeling after the state’s writing assessment (which does not seem to be well aligned with the curriculum).

Issue: Assessments and curriculum do not seem to be developed based on the same theory of learning. Compared to the rich description of curriculum, little information is offered on accompanying assessments.
Instances:
• Evident in the majority of science-literacy integrated curricula.

Building Block 3: Outcome space/management by teachers

Issue: Item scores are aggregated to a total score, which loses interpretability in terms of what students know or can do.
Instances:
• Evident in the majority of science and literacy integrated curricula.
• Simple sum is the typical aggregation method, but CORI used a multiplication of drawing scores and writing scores from its performance assessment to construct a composite conceptual understanding score, as its developers claim it better reflects the interactive nature of the two (although this is not well explained).

Building Block 4: Measurement model/evidence of high quality

Issue: A measurement model provides student ability estimates that are misaligned with how the student outcome is defined in the curriculum.
Instances:
• Reading Apprenticeship aims to increase high school students’ discipline-specific reading ability in biology, language arts, and U.S. history. While ETS developed discipline-specific reading tests, general rather than discipline-specific reading abilities were estimated using a unidimensional 2PL model, using student responses to all three discipline-specific tests.
AFTERWORD
Alison L. Bailey, Carolyn A. Maher, and
Louise C. Wilkinson
We conclude this volume with a brief commentary that organizes and highlights
the key ideas to emerge from these chapters regarding language and literacy chal-
lenges in learning STEM disciplines. First, we synthesize what chapter authors
identified as essential about the integration of the STEM disciplines and oral lan-
guage, reading and writing, particularly as these integration efforts may be unique
to English learner (EL) students’ STEM instruction and learning. Second, we char-
acterize the hallmarks of effective mathematics and science teachers working with
EL students suggested by the chapter authors and consider, briefly, the potential for
modifications to teacher education and credentialing that may improve the prepa-
ration of teachers in the STEM disciplines working with EL students. Third, we
critically examine the implications for the assessment of mathematics and science
that take account of students’ development of English language and literacy.
For both mathematics and science learning, we found the authors of this volume
concur that there are some best practices for language and literacy that teachers
should consider implementing in their classrooms, so that all students develop their
competence in the language and literacy demands of the STEM disciplines.
For mathematics, the authors have argued persuasively that constructing engag-
ing tasks is essential. Tasks should (a) offer students multiple opportunities to think
through and build conceptual understanding of the mathematical problem; this
includes reasoning and justification; (b) allow students to represent those under-
standings using the single or multiple semiotic resources that they prefer (to
include oral language and literacy); and (c) encourage students to share those
understandings with their fellow students and teachers in the classroom. Language
and literacy are central to all these endeavors. However, given that mathematics is a
multisemiotic system, elements of this learning process can include visual represen-
tations, objects, and gestures. Authors have provided commentary and examples of
tasks that meet these criteria, for example: “Mathematically Speaking” (Moschko-
vich) and “Language in Mathematics” (Avalos, Medina & Secada). With regard to
EL students, authors referenced specific techniques and ideas gleaned from the lit-
erature and their own research that render the tasks truly accessible for this student
population, which means that learners “feel safe” and have access to talk (Moschko-
vich). For example, Barwell describes the efforts of Curtis and Alex, who are bilin-
gual in Cree and English, in their mathematical dialogues. They drew on multiple
discourses, including mathematical discourses (vocabulary, forms of explanation,
use of symbols); general everyday discourses of explanation (e.g. using the word
“like”); and from multiple voices, including making use of both English and Cree
at different times. Several authors remind us that mixing “everyday English,” L1,
and some elements of the mathematical register is appropriate as students work
through their mathematical discovery.
For the sciences, the authors concurred that science learning in the classroom
must be rooted in discovery, with tasks that support students’ sense making of
scientific phenomena. Authors argued that the science classroom should present
an environment rich with potential scientific discovery and which serves as an
opportunity for language use and language learning. Tasks that require language and
multisemiotic representations of scientific understandings are at the core of success-
ful learning in classrooms. This is true for all students. However, for EL students,
the challenge is how to engage students who may have “less than perfect English”
(Lee, Grapin & Haas) in the process of scientific discovery and with the necessity of
representing the results of their inquiries in ways that communicate effectively with
other students and the teacher. Authors concluded that there may well be differ-
ences in learning for EL students; however, in many cases, there is a lack of research
directly addressing these questions in their areas of expertise.
Afterword 263
CONTRIBUTORS
INDEX
Page numbers in italic indicate a figure and page numbers in bold indicate a table on the corresponding page.
ideational metafunction 103–104, 104
IEY (Issues, Evidence and You) 239–240, 241
immersion classrooms xii, 173
imperatives 104
implications for research and practice: of instructional shifts 49–51; of learning progressions 253–255; of mathematical writing 112–113; of phenomenon-based instruction 181–182
implicit relationships in science textbooks 82–83
importance of conceptual understanding 21
improving: assessment of the STEM disciplines 264–265; item format of assessments 237
indicators of knowledge 214
indigenous languages 113n1
indigenous peoples, marginalization of 113n1
inferences 196, 197, 198; about student competencies 201
informal language 18; in mathematics 24
informational density 127–128
insight, gaining into student understanding 174
Institute of Educational Sciences 73
instruction: best practices for mathematics lesson design 25–29; best practices for writing in the science register 129–134; consistent pedagogical themes for improving comprehension 90; curriculum-instruction-assessment triangle 237–239, 238; CVI 87; eliciting student reasoning in phenomenon-based instruction 178–179; IVI 87; NGSS instructional shifts 35; phenomenon-based 176–180; “revoicing” 17; of STEM with English learners 5–6; supporting sense making in phenomenon-based instruction 179–180; talk moves that support productive whole-class mathematical discussions 25–26; three-dimensional 170–171; transmission model of instruction 72
instructional shifts: classroom vignette 43–48; implications for research and practice 49–51; in language 40–43, 43, 47–48; in science 38–40
integration programs: assessment in science language/literacy-integrated programs 234–240, 238; curriculum-instruction-assessment triangle 237–239, 238
interactions 42–43, 43, 49; one-to-many interactions 49
intermediate level critique of arguments 221–222
intermediate level explanations of carbon transforming processes 218–219
intermediate level prediction about arctic sea ice extent 222–224
interpersonal metafunction 104
interpreting summative assessment 189
interpretive framework, using learning progressions as 157–158
interpretive listening 143
interpretive reasoning 117
interrelatedness vocabulary items 237
investigations, science in the classroom 45–46
IRE (initiation—response—evaluation) pattern 26
IRT (item response theory) 247
Issues, Evidence and You see IEY (Issues, Evidence and You)
Item Response Model 148
item-writing practices for summative assessment 192–193
IVI (intensified vocabulary instruction) 87
journal writing 103
Keeling Curve 216
keywords 89
KLEWS chart 179–180
knowledge: background knowledge 118; cognate knowledge 118; conceptual understanding 13; externalized cognition 119; “horizon knowledge” 146; indicators of 214; “knowing” of science ideas 170; mathematical 190; mathematics knowledge, measuring 199; procedural fluency 13
language xii, 128–129; academic language 14, 89; background on research and theory informing Dynamic Language Learning Progressions project 148–150; Bakhtin, Mikhail 108–109; “building blocks” 40; characteristics of narrow views of academic language 19; classroom language 26; as complex adaptive system 41; cross-language development 96; “curricularizing” 40; derivation 118; discursive model 103; dual-language development 96; everyday language 17, 48, 190; exploiting home
Index 275
science texts for English learners, improving comprehension of 87; trade books as alternative to 93
text level of language 116–117, 126
textual complexity, teaching 132–133
textual metafunction 104
thematic condensation 61
thematic introduction 126
theory of learning 107
theory of the megaflood 44
theta 148, 168
think-aloud interviews, Language in Math project 65–66, 65, 66
thought clouds 250
three-dimensional learning 35, 39, 47, 170; evidence of student understanding 173–175; learning targets 172–173; NGSS instructional shifts vignette 43–48; responding to students’ ideas 175–176
three dimensions of heteroglossia 109
TIMSS (Trends in International Mathematics and Science Study) 235
TOMA-2 (Test of Mathematical Abilities 2nd Edition) 64
topic shift 126
Toyama, Yukie xvi
trade books as alternative to textbooks 93
traditional approaches to science 38, 39
transcription 120; in cognitive writing model example 123–127, 124–126
transitions 147
translation 120; in cognitive writing model example 121–123, 122, 123–127, 124–126; of word problems 59
transmission model of instruction 72
“true score” model of classical test theory 247
the tulip problem 102
uncertainty of educational reform 135
understanding: conceptual understanding 20; eliciting strategies 174; evidence of 171; expressive understanding 17; gaining insight into 174; naïve understandings in formative assessments 145–146; receptive 17; scaffolding 175, 180; scoring rubrics 152–153, 153; talking to learn math with 13–14, 28
unique solutions for improving ELL reading comprehension 96
unitary language 108–109, 109
units: anchoring 177; NGSS 171–172; science units, grounding in phenomena 178–179
unpacking 89, 118, 132–133
validity arguments 247–248
variables of reading 80–81
variance in test scores, testing construct-irrelevant language contributing to 200
verb phrases 82; nominalized verbs 89
verification of solved problems 107
vernacular arguments 212–213
vernacular language 207–208
vignettes, NGSS instructional shifts 43–48
visual display 117
visual images 105
visualizations 226
vocabulary 171–172, 212–213; connective vocabulary building 225–226; CVI 87; influence of culture in acquiring 131–132; interrelatedness vocabulary items 237; IVI 87; mathematics as 17; morphology 89; repeating experiences with key science vocabulary 132; science vocabulary, teaching 130; sophistication 150; technical vocabulary 58
voices 108–109
volume 195–196
VRs (visual representations) 61–63, 62, 63, 71
Vygotsky, Lev 41, 107
well-elaborated texts, choosing for improved reading comprehension 90–92
whole-class discussions 48
Wilkinson, Louise C. xvi, xvii
Williams, Dylan 145–146
Wilson, Mark xvi
Word and Phrase Tool 131
word formation 123
Word Generation 132
word-internal resources 89
word level of language 116–117, 127–128
word/lexical-semantic level of language 124–126
word problems 56–57; complex strings 58; context of problems 59–60; declarative sentence 56; difficulties encountered by ELLs in solving 56–57; improving accessibility to ELLs 59; lexical- and sentence-level features of 58; substitution 59; syntax 59