Learning to Teach Writing through Dialogic Assessment
Author(s): Sarah W. Beck, Diler Cavdar, and Allison Wahrman
Source: English Education, Vol. 50, No. 4 (July 2018), pp. 305–336
Published by: National Council of Teachers of English
Stable URL: https://www.jstor.org/stable/10.2307/26797007


Learning to Teach Writing through Dialogic Assessment

Sarah W. Beck, Diler Cavdar, and Allison Wahrman

Learning to teach writing is a complex process influenced by many factors. Formative assessment
holds promise as a place for preservice teachers to gain a better understanding of students’ unique
struggles as writers and of writing as a complex, challenging skill. The authors of this article de-
scribe how working with a dialogic method of formative assessment gave two preservice teachers
unique insights about their students as writers and transformed their understanding of writing
development. We argue for the benefits of incorporating more experience with formative writing
assessment into the preservice education of English teachers.

Skill in written communication is crucial to readiness for college and
success while there, as well as to active participation in a democratic
society. Supporting preservice teachers in becoming better teachers of
writing should therefore be a top priority for English teacher educators.
One way to provide English teachers with better preparation for teaching
writing is to give them rich experiences with formative writing assessment
during their preservice teaching experience. Typical methods of formative
writing assessment include commenting on students’ papers, either with
or without a rubric as a guide; conferring with students or organizing peer
conferences; teaching self-assessment; and progress monitoring with refer-
ence to curricular standards. In formative—as opposed to summative—writ-
ing assessment, teachers use the information they gain from assessments
to inform subsequent instruction, and they also typically provide feedback.
In fact, according to one recent meta-analysis of empirical research on
formative writing assessment, the provision of feedback was the aspect of
formative assessment associated with the greatest effect sizes (Graham,
Harris, & Hebert, 2011). Feedback is perhaps the characteristic of formative
assessment that distinguishes it most clearly from summative and high-stakes
assessment, the latter often decried for its negative impact on the quality of
writing instruction (Hillocks, 2002).
In this article we describe how experience with a particular type of
formative assessment that we call “dialogic writing assessment” changed two
student teachers’ understanding of writing as a skill and of their students’
challenges in developing that skill. In a dialogic writing assessment session,
the teacher asks a student to think aloud—that is, to verbalize all the thoughts
that occur to the student as he or she is composing a piece of writing—while
the teacher listens, observes, asks questions, and offers prompts. In the stan-
dard think-aloud method of research and assessment the teacher or examiner
is not supposed to intervene or ask questions; however, in dialogic writing
assessment, questions and prompts are essential. To prepare for a dialogic
assessment session, the teacher analyzes the writing task that the student
will be working on during the session and prepares a set of questions and
prompts. The aim in designing the questions and prompts is to illuminate
the extent to which students have the knowledge and skills necessary to
complete the task, and whether and to what extent prompts or questions
can enable students to overcome those challenges.
Dialogic writing assessment shares many goals and characteristics with
effective writing conferences. For example, it should aim to uncover what
students know about and are able to do with writing (Atwell, 1998; Graves,
1994); serve as a site for productive reflection and deliberation (McIver &
Wolf, 1999); allow for individualized support for student needs (Sperling,
1991); and provide opportunities for students to ask questions or express
confusion about the writing task or content (Ewert, 2009). However, in keep-
ing with its origin in the think-aloud method of research, dialogic writing
assessment emphasizes process over product by including prompts to sup-
port students’ thinking-aloud as they work through stages of the composing
process, and it offers teachers the possibility of intervening in that process
by both providing support and discerning what students can do with that
support. The latter aspect of the assessment—discerning what students can
do with that support—should give teachers a sense of where to target their
instruction to strengthen emerging skills. Another way of thinking about the
difference between dialogic writing assessments and writing conferences is
that conferences are typically concerned with the development of a particular
piece of writing, whereas in dialogic writing assessment the ultimate aim
should be the development of the writer.


Literature Review and Conceptual Framework


Preparing Preservice Teachers to Teach Writing
In the field of English teacher education, writing instruction has tended to
take a backseat to the teaching of literature (Tremmel, 2001). A national
survey found that the vast majority (over 70 percent) of high school teach-
ers across subjects did not feel that their teacher preparation coursework
adequately prepared them to teach writing, including over 40 percent of
English and language arts teachers (Kiuhara, Graham, & Hawken, 2009). One
consequence of this scant preparation is that new teachers tend to choose
writing tasks and evaluation methods that are aligned with the large-scale
assessment systems within their teaching contexts, for example, the five-
paragraph theme (Johnson, Smagorinsky, Thompson, & Fry, 2003). Specific
factors that have been found to influence teachers’ sense of competence in
teaching and assessing writing include comfort and experience with the
writing process (Bass & Chambless, 1994), the nature of their prior experi-
ences with writing instruction (Street, 2003), support from teacher education
faculty in methods courses (Street, 2003), and experience with practical as
well as conceptual tools for writing. Teachers need both kinds of tools to de-
velop expertise in teaching, because insufficient practical tools lead teachers
to default to standardized writing curricula or packaged resources in use in
their teaching sites, while a lack of conceptual tools prevents teachers from
productively critiquing their teaching practice and learning from mistakes
(Grossman et al., 2000).
A few studies have explored how working individually with student
writers can shape teachers’ development of pedagogical content knowledge
related to writing at various levels of schooling. At the elementary level,
Roser et al. (2014) found that preservice teachers gained understanding
not only of the unique abilities of the students they were matched with but
also “a more general understanding of how literacy develops” (p. 160). In
another study (Shin, 2006), preservice teachers assigned to tutor individual
university English as a Second Language (ESL) students found that they
needed to vary their approach from collaborative to more directive, depend-
ing on the stage of the writing process in which the student was working,
and on how the student responded to the tutor’s suggestions. Finally, studies
of preservice teachers working with student writing via email showed that
this approach can help teachers learn to view writing as a developmental
process (Davenport, 2006) and to appreciate the students’ perspective on
what makes writing challenging (Sipe, 2000). In the latter case, opportunities
to ask supportive questions were considered by participants to be essential
in gaining this perspective.
Writing assessment is one context in which preservice teachers have
the opportunity to acquire deeper knowledge of students as writers, along
with the responsibility to learn how to make use of that knowledge. As we
discuss in the next section, best practices for educating preservice teachers
about assessment are an important and, according to some, understudied
aspect of teacher education.

Educating Preservice Teachers about Assessment


Recommendations for improving adolescent literacy have identified con-
tinuous and systematic assessment as a characteristic of effective literacy
programs for adolescents (Biancarosa & Snow, 2006) and have called for
systematic assessment of literacy skills in the content areas as well as teach-
ers’ involvement in the implementation and review of such assessment
programs in schools (Carnegie Council on Advancing Adolescent Literacy,
2010). At the same time, many assessment experts have stressed the need
to develop teachers’ knowledge of and proficiency in assessment practices
across subject areas and skill domains (Mertler, 2009; Popham, 2009; Volante
& Fazio, 2007). Preservice teacher education is a logical starting point for such
efforts. The National Writing Project, well known for its networked program
of professional development in writing instruction, has shown that teacher
participation in the development and use of a product-focused systematic
assessment tool (the Analytic Writing Continuum) led to new insights about
the characteristics of effective writing as well as ways of teaching writing
(Swain & LeMahieu, 2012). It is reasonable to suspect that experiences with
a process-oriented approach to classroom assessment might yield similar
benefits for preservice teachers.
Instructional support can improve preservice teachers’ assessment
skills as well as their sense of competence in assessing student performance
in writing (Graham, 2005; Keen, 2005), especially when the focus of the in-
struction is on a specific assessment skill or tool, such as a rubric (Dempsey,
PytlikZillig, & Bruning, 2009). However, a recent review of research on pre-
paring preservice teachers to teach writing suggests that it is still unknown
whether practice in giving feedback helps teachers learn to make writing
instruction more meaningful (Morgan & Pytash, 2014). In other disciplines,
namely science and mathematics, research does suggest that the most chal-
lenging aspect of formative assessment for teachers to learn and implement
is the planning of appropriate instruction based on information obtained
from formative assessments (Sabel, Forbes, & Zangori, 2015). This can be
challenging even for more experienced teachers: according to Heritage,
Jinok, Vendlinski, and Herman (2009), insufficient integration of content
knowledge and pedagogical knowledge is what prevents teachers from
translating formative assessment data into instructional action.
Given the potential for formative assessment to influence instruction and learning, it makes sense to place formative assessment at the center of preservice teacher education (Shepard, 2000). Extensive experience with formative assessment in teacher training can foster a deeper understanding of various formative assessment tools and an inclination to use them (Morrison, 2005), and it can develop preservice teachers’ appreciation for the link between assessment and learning (Jones, 2014). We also know that using formative assessments over an extended period of time can shift preservice teachers’ conceptions of assessment from one focused on tests and outcomes to one focused on learning (Wallace & White, 2014).
The research cited above suggests that direct interaction with stu-
dents and their written work can yield meaningful changes in preservice
teachers’ knowledge about and perspectives on assessment. This research
also highlights the link between assessment and instruction as particularly
challenging for preservice teachers to understand.

Conceptual Framework: Dialogue and Writing Development


Dialogic writing assessment is a method of writing assessment based on the
first author’s (Sarah’s) inquiry with colleagues into the use of the think-aloud
protocol (Ericsson & Simon, 1993) as a tool for writing assessment (Beck,
Llosa, Black, & Trzeszkowski-Giese, 2015). The think-aloud protocol has a
long history as a research tool for the study of writing and writing processes,
illuminating the nature of writing as a problem-solving activity (Flower &
Hayes, 1981) as well as expert and novice differences in performing writing
tasks (Bereiter & Scardamalia, 1987). More recently it has shown promise
as an assessment tool that can offer insights into students’ struggles with
the writing process that are not discernible from their writing alone. This
research also revealed the teachers’ desire to use this approach in a more
individualized and dynamic way. From this wish grew the idea of dialogic
writing assessment (Beck et al., 2015).
Practitioners of formative assessment often ground their work in as-
sumptions that (1) formative assessment has tremendous power to enhance
student learning (Leung, 2007; Morrison 2005; Shepard, 2000); (2) forma-
tive assessment involves coordinating knowledge of content, knowledge
of what constitutes a learning progression, and knowledge of pedagogical
tools (Heritage et al., 2009); and (3) ability is not fixed but malleable and
responsive to instructional support (Shepard, 2000). This third assumption
may be especially important for developing skill in formative assessment of
writing, insofar as preservice teachers are inclined to view writing as a fixed
rather than a malleable skill (Norman & Spencer, 2005).
Dialogic writing assessment is a way for a teacher to witness the fluid and dynamic nature of writing ability at close range, while simultaneously supporting development of that ability. In this way it is akin to what is known as dynamic assessment, an approach to assessment in which the examiner deliberately attempts to teach the subject a skill that will improve the subject’s performance (Haywood & Lidz, 2007), thereby integrating assessment and instruction in the same activity. Central to the practice of dynamic assessment is Vygotsky’s (1978) notion of the zone of proximal development, defined as “the distance between the actual developmental level as determined by independent problem solving and the level of potential development as determined through problem-solving under adult guidance or in collaboration with more capable peers” (p. 86). Dialogic writing assessment has a similar conceptual foundation. Our use of the term dialogic
also invokes Bakhtin’s (1981) claim that all speech is dialogic, in that an
utterance reflects not just the person speaking but all of those to whom the
utterance is addressed; in this way written language is considered a social and
cultural construction (Bakhtin, 1981). As Nystrand (1989) argued, “writing,
like all language, is . . . inherently interactive and social” (p. 70). Dialogic
writing assessment, then, is a way of making space for the ordinary practices
of talking and listening in the teacher’s toolkit for assessing writing. Unlike
more typical formative English and language arts assessment tools such as
exit slips or rubrics, it also allows teachers to help students learn to situate
themselves authentically as writers.
To assess learning in a dynamic way requires teachers to imagine
targets for growth. Leung (2007) suggests as much when he asserts that in
order for the formative potential of dynamic assessment to be sufficiently
understood, attention must be paid to how teachers who employ it conceptual-
ize learning in their subject and what they take as evidence of such learning.
Studying preservice teachers’ work with formative writing assessment tools,
then, can shed light on their evolving understandings of constructs relevant
to writing instruction, such as genre, writing skill, and writing development.
In what follows we describe a collaborative inquiry that took place within
Diler’s and Allie’s student teaching placements. The overarching question
for this inquiry was: What do preservice teachers learn about student writing
and students’ writing processes from using a dialogic, interactive method of
formative writing assessment?
We conducted this inquiry from the vantage point of multiple inter-
secting roles: researchers (Sarah, Allie, Diler), preservice teachers (Allie
and Diler), and a teacher educator (Sarah). Our different roles offered us
an opportunity to reflect on the significance of our findings from multiple
vantage points with regard to what they contribute to pedagogical content
knowledge about writing: for example, knowledge of writing development,
knowledge about genres of writing, and knowledge of what makes writing
challenging for students.

Method of Inquiry

Research Participants and Sites
The idea for this project began when Allie and Diler were selected for par-
ticipation in an honors seminar for undergraduate students in a teacher
education program at a private university in a major metropolitan area of the
eastern United States. A requirement for enrollment in this seminar was that
students work with a faculty member on a mentored research project during
their senior year. Sarah was a faculty mentor for this research seminar, and
she selected Allie and Diler to work with because of their interest in writing
instruction and writing assessment. Sarah had developed the concept of dia-
logic writing assessment based on work with experienced certified teachers,
and she was interested in exploring how working with this method of writ-
ing assessment could shape preservice teachers’ knowledge of writing and
writing instruction. The mentored honors project took place over the whole
of Allie’s and Diler’s senior year: during the fall semester, they worked with
Sarah on coding data collected in other teachers’ classrooms and reviewing
published literature on writing instruction and formative assessment. In
the spring semester, they conducted their own dialogic writing assessment
sessions with students in their student teaching placements.

Allie’s and Diler’s student teaching placements, in which each was responsible for teaching one of her cooperating teacher’s (CT) classes, were both in small urban schools designed to serve specific populations of students.
There were important differences between the two sites, however. Diler’s
school, Navigator Academy, was created specifically to serve current and
recently reclassified English Learners and has both an English as a Second
Language (ESL) program and a bilingual program. The students at Navigator
represented a wide range of proficiency in English: some had passed the state
ESL proficiency exam in middle school, while others were recent arrivals to
the United States with limited knowledge of English. Eighty-three percent
of the students were eligible for free or reduced-price lunch. All students at
Navigator were required to take an ESL class as well as an English language
arts (ELA) class until they were deemed proficient according to the statewide
standardized assessment for determining English language achievement, at
which point they no longer had to take ESL classes and could use the extra
time for English or other electives. College readiness was a priority at Navi-
gator: the school offered a college and career readiness club, had partnered
with a local city college to allow students to take college-level classes for
credit, and planned to expand AP offerings in the near future.
The ELA class that Diler worked with for her student teaching place-
ment was designed for English Learners at the advanced intermediate level
of proficiency; most students in the class were 10th or 11th graders. In the
fall semester of that year, the class had mostly focused on preparation for
the high-stakes ELA high school exit exam, with the bulk of the assigned
reading in literary genres. Instruction focused on reading for comprehen-
sion first, followed by literary analysis, with particular attention to locating
literary devices and analyzing their use, a typical focus of the questions on
the exit exam. After reading a literary text, students would typically write
a literary analysis essay to prepare for the writing tasks on the exit exam.
During the period in which Diler assumed responsibility for instruction
during her spring-semester student teaching requirement, the students read
literature ranging from short stories by Gary Soto to a simplified version of
Romeo and Juliet and, finally, to the young adult historical fiction text Fever
1793 (Anderson, 2000).
Allie’s school, Pioneer Academy, by contrast, had only 8 percent English Learners but
was designed to facilitate individualized instruction for a diverse range of
learners. Pioneer offered a project-based curriculum built around portfolio
assessments to determine students’ eligibility for promotion from one grade
to the next. The ELA class in which Allie student taught was focused on de-

312

This content downloaded from


154.59.124.153 on Thu, 29 Feb 2024 21:14:54 +00:00
All use subject to https://about.jstor.org/terms
e305-336-July18-EE.indd 312 7/2/18 12:30 PM
B e c k , C a v d a r , a n d W a h r m a n > L e a r n i n g t o Te a c h W r i t i n g

veloping reading skills to support students’ analytic writing about texts, and
the unit in which Allie assumed major responsibility for teaching lessons was
focused on making inferences. One of the main strategies that Allie and her
CT used in this unit was something called an “inference equation,” using the
format of a mathematical addition formula: [quote] + [prior knowledge] =
[inference]. Using this formula, students selected a quote from the text, then
recorded the prior knowledge based on their real-life experience that helped
them interpret the quote, and finally composed an inference about what
the piece of evidence is implying. The completed equations were intended
to serve as skeletons of body paragraphs in an argumentative essay. During
this unit the whole class read David Levithan’s (2013) Every Day and, after
finishing the book, students were assigned to write an essay in response to
the story. This was the first point in the year in which the whole class had
read a novel together; prior to this they had read novels independently and
short texts (stories and poems) as a class.
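
To illustrate the format (this filled-in equation is our own hypothetical example, not one drawn from the classroom data): [quote: “She shut the door without saying a word.”] + [prior knowledge: people often go silent when they are hurt or angry] = [inference: the character is upset and avoiding confrontation]. A completed equation of this kind could then be expanded into a body paragraph, with the quote serving as the evidence and the inference as the analytic claim.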

Data Collection
Our inquiry involved collecting several types of data: (1) an open-ended ques-
tionnaire that Diler and Allie completed regarding their beliefs about writing
and writing instruction, which was supplemented by conversations with
Sarah in weekly project meetings; (2) writing samples and audio-recorded
dialogic writing assessments with students in the classes for which Diler
and Allie assumed instructional responsibility during their student teach-
ing placements; and (3) Diler’s and Allie’s record-keeping sheets from these
dialogic writing assessment sessions.

The Writing Tasks


Allie used the dialogic assessment method with two students, Jessica and
Beverly. Allie knew from conversations with her CT and from her own
experience grading student work that Jessica was typically an A student,
while Beverly, although she showed potential in her contributions to class
discussions, typically earned lower grades due to late completion of work and
lack of preparation for quizzes. Jessica read above grade level while Beverly
read at grade level. Jessica spoke another language at home and her writing
typically showed the influence of her native language, with errors in verb
endings and plural forms.
The task that Allie used for the assessment (see Figure 1) was the
culminating assignment of a unit titled “Questioning the Text and Making
Inferences with Every Day.” The goal for Jessica and Beverly during their
dialogic assessment sessions was to draft the body paragraphs of their essays.
Prior to these sessions, the students had filled out a worksheet that broke
the task down into steps that included (1) formulating their own prompt—or
“essential question,” the answer to which would become their thesis state-
ment—and (2) collecting appropriate evidence from the class text. Students
were also provided with “inference equation” worksheets, described above,
to help them outline their body paragraphs.
Diler used dialogic writing assessment with three students. Two of
the students, Alyssa and Beatrice, were typically academically successful
students, while one, Cory, typically struggled more with his writing. As was
the case for Allie, the writing task that Diler set for her students’ dialogic
assessment was designed with the help of her CT. They created a writing
prompt based loosely on the task typically assigned for the statewide high
school exit exam in ELA. This prompt is depicted in Figure 2. The students
who had volunteered to participate would be retaking the state exam later in
the spring to improve their scores, so the dialogic assessment activity would
provide additional practice for them.
The passage Diler chose for the prompt was one that students had
recently read during their whole-class study of Romeo and Juliet, but follow-
ing her CT’s model for writing task design, she gave the three students the
passage, but not the prompt, ahead of time and told them to read it, write
what her CT referred to as “the gist”—a summary of what the passage was
about (Frey, Fisher, & Hernandez, 2003)—and annotate any writing strategies
they noticed. Diler hoped that the students would arrive at the session with a
strong grasp of the main idea so that they could use the dialogic assessment
session to delve deeply into the composing process.

Every Day Writing Assignment


For the past few weeks, we have been learning about gender through A’s story in Every Day.
Frequently Asked Questions about this task:
What should I write about?
Anything having to do with gender identity or issues that arise from Every Day. If there
is an idea that has been nagging you during our study of Every Day, go with that and see
where it takes you!
Review your question handouts, annotations, and other writing connected to the book.
What topics, issues, challenges from the text do you find the most interesting?
How long should it be?
The length of the writing piece is up to you. Because you are generating and answering
your question, it is up to you to decide whether or not your essay is long enough.
When is it due?
Different drafts will be due at different times. See the calendar below.
Figure 1. Allie’s Assignment

Text-Analysis Response
Your Task: Closely read the following text (pages 4–5), which is an excerpt from Act I, Scene
I, of Romeo and Juliet. After you read, write a well-developed, text-based response of three
paragraphs. In your response, identify a central idea in the text and analyze how the au-
thor’s use of one writing strategy (literary element or literary technique or rhetorical device)
develops this central idea. Use strong and thorough evidence from the text to support your
analysis. Do not simply summarize the text. You may use the margins to take notes as you
read and scrap paper to plan your response. Write your response in the spaces provided on
the following pages.

Guidelines:
Be sure to:
*Identify a central idea in the text.
*Analyze how the author’s use of one writing strategy (literary element or literary
technique or rhetorical device) develops this central idea. Examples include: charac-
terization, conflict, denotation/connotation, metaphor, simile, irony, language use,
point-of-view, setting, structure, symbolism, theme, tone, etc.
*Use strong and thorough evidence from the text to support your analysis.
*Organize your ideas in a cohesive and coherent manner.
*Maintain a formal style of writing.
*Follow the conventions of standard written English.

Figure 2. Diler’s Assignment

While Diler’s task was a simulation of the kind of timed writing as-
sessment characteristic of high-stakes exams, Allie’s was a culmination of a
long process of planning and preparation. Related to this difference was the
process by which the students were expected to have arrived at the central
idea of their text. What the assignments shared was an emphasis on using
textual evidence to support claims and central ideas; however, for Allie the
focus was on literary meaning, while for Diler the focus was on literary
form. Allie’s students were expected to have generated a question that could
be applied to the text but also answered by someone who had not read the
text—called a “Level 3 Question” in the terminology of her classroom. Diler’s
students were to focus on identifying the writing strategies (sometimes also
called “literary elements”) that authors used to convey a central idea.

Implementing Dialogic Assessment


Prior to the dialogic assessment sessions, both Allie and Diler analyzed
the assigned writing tasks and predicted the skills they thought students
would need to complete these tasks, along with the challenges they thought
students might have in completing them. They also created a list of ques-
tions they could ask to determine whether students had these skills or were
experiencing these challenges. Allie’s questions are depicted in Figure 3,
Diler’s in Figure 4.

Skill Questions:
1. How are you going to start? What are you going to do first?
2. What is your essential question?
3. What is your claim?
4. What part or piece of the book are you going to use? Can you tell me what your evidence
is?
5. What are you going to say after your evidence?
6. What prior knowledge helped you to understand the quote/evidence?
7. How are you going to move on to the next paragraph?

Challenge Questions:
1. Are you having trouble starting/interpreting/understanding the task?
2. Are you sure that that is a level 3 question? Can what you’re asking be applied to the
book, but be answered by anyone?
3. Do you think that you are stating your claim clearly? What do you mean? Does your
thesis answer your question?
4. Do you think that your evidence supports or matches your claim? Could you maybe find
some stronger evidence?
5. Do you think you’re done discussing how the evidence relates to the claim? Do you
think your reader will understand how those two pieces of the pie connect?
6. Have you used your knowledge outside the text, your prior knowledge, to connect the
claim and evidence in your discussion?
7. Have you connected the last paragraph to this new one? What can you say to glue these
two paragraphs together?
Figure 3. Allie’s Questions

Beginning
How are you going to start?
You already read the passage, wrote the gist, and annotated it, right? OK, you can begin writing now.
[Ask the following questions if the student appears stuck or doesn’t start]
What is your task?
How are you going to organize your writing?
How many paragraphs are you going to be writing? And what is each paragraph going to be about?
What is your gist of the text?
What is your central idea of the text?
Why did you choose this as your central idea?
What writing element/technique/device are you using? How did you figure out that you wanted to use this device?
How was your process in deciding to choose this device?
Why did you choose this writing strategy? Did you just pick the one that was easiest? Or the one you saw come up the most? The one that made the most sense?
[Ask the following questions if the student appears stuck] What do we have to do with the strategy? Can we just mention it and be done? (need to define/describe the strategy, examples, and how it explains central idea)

Middle
How are you using the text-based evidence to support your thoughts and ideas?
How did you choose your evidence?
Are you connecting multiple pieces of evidence to explain your idea?
As you write, are you finding anything you want to change or catching any mistakes? Are you tempted to just continue with what you have or are you able to revise and change what you have written?
As you write, do you find yourself checking back to reread the prompt?
IF THEY SEEM STUCK IN MIDDLE, OR PAUSE, OR STRAY FROM THE TASK, ASK: Do you want to check the prompt again to remind you of your task?
TOWARD THE END, WHEN THEY DON’T KNOW OR SEEM TO FORGET WHAT TO DO: Do you want to check the prompt again to see what you’ve done so far and see if there’s anything else you still need to do?
If you’re struggling on how to explain something, try thinking out loud in Spanish and then translate it to English.

Last Paragraph
Did you figure out any new thoughts to add to your central idea? Any other analysis that came about from doing the writing, even after you figured out your central idea and device?

Figure 4. Diler’s Questions

The dialogic assessment sessions ranged in length from 25 to 40 minutes. Following these sessions, Allie and Diler both wrote reflective notes on what they learned from the experience about each student as a writer, using record-keeping sheets that Sarah designed, and also on how they would use dialogic assessment in their future teaching. They listened and re-listened to the audio recordings several times and selectively transcribed sections that illustrated the most significant challenges that students experienced.

Sarah then presented them with excerpts from their initial responses to the questionnaire in which they had articulated their beliefs about writing as a skill, writing instruction, and writing development, and asked them to reflect in writing on how, if at all, those beliefs had changed.

Data Analysis
The data that we analyzed for this article included the initial questionnaires, Allie’s and Diler’s reflections on their responses to those questionnaires, the audio-recorded and transcribed dialogic writing assessment sessions, and their reflective notes on these sessions. Keeping in mind the central question guiding our inquiry—What do preservice teachers learn about student writing and students’ writing processes from using an interactive writing assessment method?—we relied primarily on a coding scheme that Sarah
had used in previous research on the think-aloud as an assessment tool to
identify students’ challenges with writing (Beck et al., 2015) as a way of il-
lustrating different aspects of Allie’s and Diler’s evolving understanding of
writing as a skill and domain of knowledge. This scheme had three facets:
writing process (e.g., planning, evaluating, revising, generating); written
product (thesis, structure, cohesion, evidence); and student traits (effort,
engagement, intelligence). We inductively supplemented this scheme as
needed with codes to characterize phenomena that the original scheme did
not capture, such as “confidence” and “originality.”
Allie’s and Diler’s reflections received an additional layer of coding:
we reread these data and highlighted excerpts that illustrated three major
themes: (1) understanding of writing development; (2) conceptual tools; and
(3) practical tools. The conception of writing development as a meaningful
analytic frame derives from Leung’s (2007) assertion that to understand
how dynamic assessment can be useful to teachers, we need to understand
how teachers recognize learning in their subject. We invoke the distinction
between conceptual and practical tools (Grossman et al., 2000) because it
has proven useful for characterizing the intersection of theory and practice
in preservice teachers’ developing knowledge of how to teach writing. To
apply our codes, we first worked individually on sets of data, with Allie and
Diler coding their dialogic assessment session transcripts and Sarah coding
their initial interviews and reflections, and then met in person to
review our codes and resolve disagreements. We also shared materials in a
Google Drive folder, which allowed collaborative commenting on documents.

Findings
Our collaborative reflection and analysis of the various sources of data
converged around two central findings, one related to conceptual tools
and the other to practical tools. The conceptual tools had to do with Diler’s
and Allie’s shift toward interpretive engagement with a source text as a
fundamental component of academic writing at the high school level. The
practical tools involved their use of questions and prompts to help students
develop their writing by supporting this interpretive engagement. Most of
the shifts in Allie’s and Diler’s beliefs about writing development had to do
with the idea that increasingly sophisticated reading was intertwined with
writing development, as did most of the instructional ideas that they took
away from working with the dialogic assessment. They both (1) recognized
that interpretation was the site of most challenges for their students and (2)
figured out how to provide suggestions and feedback to support the students
in overcoming these challenges.

The Interdependence of Interpretive Reading and Analytic Writing


Working in a context where she taught ELA in a school populated with En-
glish Learners, Diler saw writing development in terms of the progression
that ELs needed to go through in order to evolve as writers. In the initial
questionnaire Diler stated that a priority for her class was for students “to
learn to organize writing, actually put thoughts on paper coherently” to ex-
press themselves. She believed that organization was an essential foundation
for students’ writing development. After reflecting on her work with dialogic
writing assessment, however, she saw organizational skills as subordinate to
the skills of identifying topics for their writing and developing their ideas on
these topics, and having confidence in these ideas. She also came to believe
that as a prerequisite for developing these abilities, the students needed to
possess the skills of analytic reading: while they read, they needed to be able
to notice literary devices in a text and then, after reading, to reflect on how
those devices might contribute to deeper analysis. If students didn’t have
sufficiently developed ideas about what they were writing about, she now be-
lieved, organizational skills would be of no use to them. While she embarked
on the dialogic assessment process expecting to focus on organization, she
discovered that she spent much more time focusing on helping them develop
their analysis, and this change in focus led her to see one of her students in
a new light: she learned that Cory, who was considered by the CT (and by
Diler herself, prior to working with the dialogic assessment) to be the far
less proficient writer, actually had more sophisticated and original ideas
than the other two students, once she was able to look beyond his writing as
evidence of his reading comprehension. This pleasant surprise illustrates
how shifting one’s focus in formative assessment can reveal previously un-
recognized strengths in students (Beck et al., 2015).
The shift in Diler’s sense of a developmental trajectory for writing
came from how she saw her EL students rely on a schoolwide structure for
writing—the MEAL paragraph, which stands for Main idea, Evidence, Analy-
sis, and Link—as a composing tool. Prior to working with dialogic writing
assessment, Diler had thought that the best way to identify when writers
were struggling was to note whether they were stuck on how to start writing.
Working with dialogic writing assessment, she realized that, with the MEAL
structure available to them, students were less challenged with starting
because the structure gave them a formula to generate content. At the same time,
however, the structure illuminated new problems, for example, not knowing
what to write about, not being able to identify literary devices in a text, or
not being certain whether a topic was appropriate or important enough to
write about. This led Diler to shift her view of what was most important to
focus on in teaching her students about writing: Whereas at the beginning
of her student teaching placement she had thought that teaching structure
to help students write “in a coherent manner” was most important, after
working with dialogic writing assessment she reflected that a bigger prob-
lem was that “Students have trouble with confidence or just understanding
the text well. If they don’t know what they’re writing about, they won’t be
able to write ‘in a coherent manner’ because that isn’t their priority.” Even
as she recognized that a sufficiently deep understanding of the text was a
prerequisite for strong writing, though, she also saw that dialogic writing
assessment could be a means to “bring about the revelation that students
know more than they think and have much to express.”
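
Because the MEAL structure figures centrally in Diler’s account, a brief gloss may help. A generic MEAL skeleton (our illustration, extrapolated from the acronym rather than reproduced from Navigator’s actual materials) might run as follows. Main idea: a claim about the text; Evidence: a quotation or paraphrase that supports the claim; Analysis: an explanation of how the evidence supports the claim; Link: a sentence connecting the paragraph back to the thesis or forward to the next paragraph.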
Just as the dialogic assessment experience changed Diler’s perspective
on the structural tool that her teaching context provided her, Allie’s experi-
ence with dialogic assessment changed her understanding of the tool that
her teaching context offered for supporting students’ deep analytical rea-
soning—the inference equation. Allie came to understand that the equation
and the writing task that was built around it were not sufficient to allow all
students to develop their ideas. The complexity of articulating interpretive
understanding became a conceptual lens through which she compared the
performance of the two students she worked with: Allie came to appreciate
the extent to which expressing interpretive understanding depends on a com-
mand of the grammar of written language, in the sense of having a range
of options for expressing analytic ideas in writing. Jessica, being a second-
language writer, tended to make grammar errors that reflected interference
from her first language but did not usually hamper her ability to express
herself, whereas Beverly’s challenge was expression itself. Although Beverly
and Jessica could say equally insightful things about the book in conversa-
tion, Beverly had much greater difficulty expressing those interpretations in
writing. This led Allie to challenge one of her original beliefs about writing:
the idea that proficient reading would automatically produce better writing.
At the beginning of her student teaching placement she believed that “the
more you read, the better you write,” and that a student’s writing about
literature was an accurate, complete representation of their understanding
of a text. However, her experience with listening to students think aloud as
they composed led her to conclude that this was not necessarily true. In a
post-assessment reflection she lamented that the “frustrating aspect of only us-
ing one’s writing to assess reading, is that adolescents might struggle with
writing, inhibiting you from seeing all of the wonderful reading that they
have done.” At the same time, she recognized that providing students with
strategic support during the writing process—specifically, support for link-
ing their prior knowledge to texts—could enable students to use writing to
reflect on their reading. After working with dialogic writing assessment, she, like Diler, developed an awareness of the ways in which writing and reading were both distinct and interrelated skills.

To the extent that writing and reading are interrelated skills, it can be difficult for a teacher to discern which skill precedes the other. Writing can facilitate reading, but a certain level of engagement with reading deeply is a prerequisite for composing interpretive prose. When students are caught between having read superficially, but with insufficient strategies to use writing to deepen their understanding of a text, how can a teacher move students past this point? In what follows we present some examples of Diler’s and Allie’s interactions with their students that suggest answers to this question.

Assessing and Facilitating Interpretation through Interactional Strategies
Realizing that writing and reading skills were interrelated went hand-in-hand
with recognizing that dialogic writing assessment was an opportunity to sup-
port students’ thinking by intervening in their writing process. The primary
strategies that Diler and Allie used were “saying back” what students had
verbalized, requesting elaboration and clarification, providing encourage-
ment, and prompting metacognitive reflection. In working with Beverly, for
example, Allie realized that Beverly had a misconception about what consti-
tuted a successful essential question—her teacher’s way of conceptualizing a
thesis. Beverly thought her present question was better because the previous
one had been “boring” and “hard,” but in fact the real problem was that
the new question was not universal. In the excerpt below, Allie successfully
guides Beverly to modify her question so that it is not just about the novel
Every Day but rather about more universal concerns.

Allie: So let’s go back to our essential question, “How would you handle relationships if you were living an impossible life like A?” OK, so what do you think is an issue with our essential question? What does our essential question have to be? Remember our level 3 questions?

Beverly: A question about how your life can connect to the book.

Allie: So what do you think is happening right now with that question? Do you think it is too much in [the novel] Every Day, and not enough about our world around us?

Beverly: Yeah.

Allie: Yeah, because we are talking about the characters, right? So how can we rephrase that so we can make it more about our lives so everyone can answer it, even people who weren’t lucky enough to read the book?

Beverly: Well, you could say, “How would you handle relationships if you were living an impossible life?”

Allie: Or “How *do* you handle . . .” Yeah, that could be a start. Let’s write that.

Allie used Beverly’s already drafted question—“How would you handle
relationships if you were living an impossible life like A?”—not only as a
starting point for refining the paper but also as a way to address Beverly’s
misunderstanding about what constituted a deep and original thesis state-
ment. By saying this question back to Beverly, and prompting her to recall the
requirements for a Level 3 question, Allie was able to reveal what her student
did not understand about the inference equation formula. This was a crucial
piece of information that she was not able to obtain only from studying the
equations they produced. Saying the question back and prompting recall of
the question requirements was also a way for Allie to compensate interaction-
ally for what she saw as a shortcoming in the inference equation formula.
Diler used a different strategy to support students in articulating a
sufficient understanding of the text’s gist and the strategies Shakespeare
used to convey that gist: she requested elaboration. For example, in her ses-
sion with Alyssa, the student was able to recall plot points that the passage
represented but was unsure about major ideas:

So—but—because I really don’t get the gist ‘cause if we talk about Benvolio
and the Montagues, we’re not gonna, uh, Romeo’s family right? But . . . I
mean and we have the prince and the things he does, so I don’t know what’s
the gist, if I’m writing about Romeo, or if I’m writing about the prince,
because I know that Benvolio and Montague are worried for Romeo, but
we see that the prince is like, you know—

As a first step in supporting Alyssa to develop an interpretation, Diler asked
her to elaborate on her perceptions of characters’ emotions and actions:

Diler: How would you try to summarize all this? Or what do you think all of that falls under? So you mentioned Benvolio being worried, you mentioned the prince. What did the prince do for example? He—

Alyssa: He gave orders and put everything in the right place.

Diler: Yeah. So how did everyone seem to act in this passage, this part of the play?

Alyssa: Seems to be mad and worried and sad at the same time.

Diler: OK.

Alyssa was momentarily distracted by something occurring elsewhere in the
room, and Diler repeated Alyssa’s words back to her in an effort to refocus her:

Diler: So you were saying everyone is sad and mad and worried.

Alyssa: Oh yeah.

Diler: And then you said, “Should I say—”

Alyssa: I forgot.

Diler: So yeah, so everyone is sad and mad and worried, so what would be like the central—yeah?

Diler did not finish her thought because Alyssa jumped in with an idea for
a gist statement:

Alyssa: So it can be like their actions, “What are the effects of the results of your actions on the people around you?”

Diler: Good, write that down.

Diler’s questions exemplify formative assessment with immediate feedback:
On the one hand, Diler’s questions assess what Alyssa understands about the
text—for example, “how did everyone seem to act in this part of the play”—and
whether she is able to summarize events. At the same time, the questions
also function to foster, or at least reveal, this skill and understanding. With
Alyssa, Diler used questions that developed her thinking while also assess-
ing it, as in “What would be the central [idea]?” She also guided Alyssa in a
process of progressively more sophisticated questioning, asking first about
actions (what did the Prince do), then about emotional states (sad, mad,
worried), and then finally prompting for the central idea. She also made
statements to aid Alyssa’s memory of what she had already said, using the
“saying back” strategy.

Diler offered a similar kind of support in the following exchange with
Cory, which took place after he crafted the “M” (main idea) part of the first
MEAL paragraph: How a change of attitude of one of the members of the family
could affect the whole family. He had decided which evidence he wanted to
present in paraphrase form but was struggling to express his analysis—the
“A” part of the MEAL paragraph:

Diler: What are you trying to say? Tell me about it.

Cory: The new situation is affecting Romeo’s, it’s affecting in the family, in the way that they see him, weak.

Diler: Oh! How did they see him, you said?

Cory: Weak.

Diler: Good.

Cory: And he’s crying, he’s hiding, and it’s also affecting him because he’s hiding and separate from the real world, he’s only living in his imaginary world and running away from his problems. Now, how the hell am I going to write that?

Diler: That’s perfect actually. What you just said to me is, Romeo’s new situation is affecting him and his family, because—what did you say? How did you say that they see him?

Cory: Weak.

Diler: And what else?

Cory: Well, that’s analysis. [Referring to the “A” in the MEAL paragraph structure] I’m going to write that it’s affecting him because—it’s affecting his family because they don’t know anything about it, they might think it’s probably serious, or if he’s just childish, which he is, and if they can actually help him, which they can.

Diler: So, all of these things that you’re saying, do you think you can write this down?

Cory: I’m gonna show those things, to make an idea about that. I will describe the way that they see him. [He begins composing on the paper, fluently and with minimal hesitation.]

Listening to Cory talk through his struggles in composing, Diler realized that
his level of analysis and insight was beyond what Alyssa and Beatrice—two
students deemed more successful in the CT’s class—had demonstrated in
the same assignment. They had not discussed in class the effect of Romeo's
behavior on his family, so this was a completely original idea. Cory also ar-
ticulated an original angle on Shakespeare’s use of literary devices when he
wrote in his essay that “The speaker uses third-point-of-view to show how the
different characters perspective can make a different reality of what’s really
happening.” Yet his flat tone and diffident manner led Diler to suspect that
he was not aware of how sophisticated his analysis was. In this interaction,
she attempted to both praise him (“good,” “that’s perfect”) and encourage
him to say more; her confidence-boosting tactic of praise functioned simul-
taneously as both feedback and scaffolding prompt. And indeed, following
this interaction he embarked on a sustained composing episode with few
pauses or digressions.
Interactions with the students in the dialogic assessment sessions
weren’t always successful, however. In contrast to Alyssa and Cory, Beatrice
came to the task with a sense of what she already wanted to write about:
her central idea, which was “fighting between the two families and how
it negatively affected Verona.” Diler used questions to prompt Beatrice’s
metacognitive reflection on her interpretive process:

diler: How did you figure out that that was your central idea?

beatrice: I figured it out because it's at the beginning, so I read this, and it's all about when they start fighting each other, and, when the prince says that they must stop fighting because they interrupt the peace in Verona, so I thought that was the central idea.
diler: OK, that’s good. And how did you choose to use a metaphor?

beatrice: The prince used the metaphor, because he says a lot of things, but meaning other things. Also Montague used a metaphor.

However, although Beatrice was able to explain why she settled on this
particular central idea and the literary device of metaphor, several turns
later she decided that metaphor was not a good fit for her central idea:

beatrice: So, I can’t find like a metaphor in this because, these two
paragraphs are about the fighting, and the others, are about um,
Montague . . . Romeo. So I think I have to change the literary device.

After Diler prompted her to reread the speech by Prince Escalus, Bea-
trice decided to use monologue rather than metaphor as her literary device.
This had not been Diler’s intention—rather, she had guided Beatrice to this
passage because of its vivid metaphor (“quench the fire of your pernicious
rage/With purple fountains issuing from your veins"), which could have
been used to support her central idea of “fighting between the two families
and how that negatively affected Verona.” However, Beatrice did not notice
this metaphor, even with Diler’s prompting to look for something that was
“not literal.”
Although Diler wasn't able to support Beatrice in fulfilling her original plans, in reflecting on this interaction she came to suspect that Beatrice's challenge with this writing task arose because she had chosen her central idea without thinking about literary devices.

Developing Students’ Confidence through Interaction


A third finding from Diler’s and Allie’s reflections on their work with dialogic
writing assessment had to do with the importance of building students’ con-
fidence in their interpretations. This was evident in Diler’s interaction with
Cory, reported above, when she combined encouragement with requests for
elaboration. Another instance of this technique occurred at the beginning of their session, when Diler learned that Cory hadn't read the prompt the night before; because of this, she was able to observe his initial, uncensored reactions to the prompt. At first he balked at the idea of using literary elements, claiming he
“didn’t remember this stuff,” and just wanted to write about a central idea,
his “comprehension.” Diler encouraged him to go ahead with this plan,
figuring that he would be more likely to complete the task if he felt he had
some control over it. Once she gave him permission to take this approach,
however, her subtle suggestion that he annotate the passage prompted him
to find the elements he had feared he would not be able to find:

diler: Did you get to read this at all [the passage] before? It’s OK if
you didn’t.
cory: No, but I remember it.

diler: OK. Skim it over.

[He reads quietly]


diler: And if you happen to find a device, any kind of figurative
language or a writing technique, you should mark it—
cory: What is this [referring to the pronoun “he”] talking about?

diler: Oh, “he”? He is Romeo.

cory: I was like [inaudible].

diler: [observing him marking the text] Good, that’s good, see you
found stuff already.

cory: [reading the text] He’s been seen there many mornings, crying
tears that add drops to the morning dew and making a cloudy day
cloudier with his sighs. The morning dew—making a cloud, that’s—a
hyperbole or—something. He cannot just make a cloud out of his
tears.
diler: Yeah, good, that’s a hyperbole because—

cory: –or a metaphor.

diler: A metaphor? Ooh. Yeah, OK.

cory: Yeah, a metaphor.

[Brief interval as he reads this passage in the text: But as soon as the
sun rises in the east, my sad son comes home to escape the light. He
locks himself up alone in his bedroom, shuts his windows to keep out
the beautiful daylight, and makes himself an artificial night.]
diler: What are you thinking?

cory: Hyperbole here because—you’re not going to run from the sun,
from the light. That’s impossible.

Because Cory expressed hesitation and a lack of confidence in his ability to complete the task, Diler figured that she should not pressure him too
much to conform to the expectations of the prompt. However, once she saw
what he was capable of when she took a more subtle and indirect approach,
she shifted her tactic to providing encouragement: “That’s good—see you
found stuff already,” and “Ooh. Yeah, OK.”
Like Diler, Allie discovered how dialogic writing assessment allowed
her to identify when confidence-boosting tactics were necessary. Both of
them came to see confidence in interpretation as an important component
of writing development, and realized that the dialogic assessment method
could be used not only to identify when students lack confidence but also to
support students’ confidence in their writing.
The following example illustrates how Allie offered Jessica encourage-
ment that allowed her to sustain and develop an idea in her writing:

jessica: [reading] Finn is treated differently by Rhiannon when he's in another body because he weighs over 300 lbs. . . . I have some
grammar issues that I’ll correct later.
allie: That’s fine actually, to go back. That’s great actually, to do
that.

jessica: For prior knowledge and combined, I wrote [reading from her inference equation worksheet] so Rhiannon doesn't put her
hand back because—I’m just writing a shorter way—Finn’s like all
the other guys so he doesn’t feel comfortable holding hands with a
fat dude, I guess? [Jessica ends her sentence with a rising inflection,
suggesting uncertainty.]
allie: That’s great. I really like how you’re going along so far. I think
the evidence you’re using is awesome. I think you’re on the right
track.
jessica: [changes “fat dude” to “fat person” as she resumes writing]
And then to add onto that. I think Rhiannon doesn’t hold hands with
people like Finn. So I’m gonna write that?
allie: Dustin maybe doesn’t look like Finn.

jessica: Yeah.

allie: She has a certain type.

jessica: —of, like, standard.

allie: Mmhmm. For sure. Yeah. You’re definitely making that very,
very clear. And that’s something I hadn’t really thought about be-
fore. So that’s awesome, Jessica.

Allie reassures Jessica three times in this brief exchange: first about her
choice to overlook any inaccuracies in grammar for the moment and move
on with her writing, then about the accuracy of her inference that Rhiannon
is uncomfortable with Finn’s appearance, and finally about the originality
of her inference.

Instructional Takeaways
Using knowledge obtained from assessment to plan for subsequent instruc-
tion is an integral component of the formative assessment process. With this
in mind, we considered how what Diler and Allie learned from working with
dialogic assessment would affect their future work in the classroom, either
with these students or with others. Diler’s main instructional conclusion
had to do with literary elements: while these were essential to the writing task she had given them—and to high school ELA writing tasks generally—her students varied in their ability to make use of them to develop their analyses.
Beatrice was limited in her options for finding meaning in the text, within
the parameters defined by the writing task, because metaphor was the only
device she understood well, while Cory was adept at invoking literary de-
vices but did not want to foreground them in his analysis. This led Diler to
think that it would be a good instructional tactic for students like Beatrice
and Alyssa to do some informal writing focused on the literary elements in
a text prior to composing a formal essay, to give her a baseline sense of how
much help students needed with this aspect of analysis before developing
their ideas in a full-length essay. For students like Cory, on the other hand,
a better approach might be to first write down their general thoughts and
emerging ideas on a text, and then encourage them to elaborate on these
ideas through discussion of literary elements. She also realized quickly
that each of her three students approached the task from a unique starting
point: whereas Alyssa came to the task unsure about the gist of the passage,
Beatrice had already settled on an idea, while Cory had not even read the
task beforehand (as she had asked the students to do). This required her
to use different questions from her preplanned list (see Figure 4); it also
heightened her awareness of how important it is to individualize support
and feedback for student writers.
A consequence of Allie’s and Diler’s new understanding of the role of
interpretive reading in writing development was that the practical tools they
saw as helpful to students had more to do with reading than with writing—for
instance, prior knowledge, the inference equation, and literary elements.
Diler and Allie saw their perspective on these tools evolve as a consequence
of using the dialogic assessment method. While Diler came to recognize that
students needed different approaches to working with literary elements to
explore meaning in a text, Allie came to question the effectiveness of the
inference equation as a pedagogical tool for some students, once she real-
ized how much help Beverly needed in working with it. She also began to
doubt whether the conceptual tool of “prior knowledge” was sufficient as
a construct to facilitate her own—and her students’—understanding of the
nature of literary interpretation. After their dialogic assessment session Allie
realized that Beverly had “lots of difficulty” and “needed a lot of mediation in
accessing [and] generating prior knowledge.” She also reflected that “prior
knowledge is a pretty vague and abstract name,” and she admitted that when
she first started her student teaching placement at that school she needed
to have several conversations with other teachers to really understand the
concept. Were this her own classroom, she reflected, she would construct a
different kind of written scaffold to support students’ literary interpretation.
Another category of implications for instructional practice has to do
with refinements to the use of dialogic assessment. Diler realized, after work-
ing with three different students who each had different needs, that not all
of the questions she anticipated being useful (Figure 4) would have been
useful for all of the students, and that the sessions were more useful when
she used questions selectively, in an individualized way. Allie was particu-
larly interested in which kinds of students would benefit most from dialogic
assessment, recognizing that it would be unfeasible to use with all students
regularly. Reflecting on how she attributed her own writing development to
extensive coaching and support from her parents throughout her schooling,
including into her undergraduate years, Allie thought that dialogic assess-
ment could be especially useful as a form of support for students who do not
have access to guidance or mentorship around academic literacy in their
homes or communities. After graduation, Allie took a full-time teaching posi-
tion as a 10th-grade reading teacher at an urban charter school. Once she began working there, she reflected that she would most likely use dialogic assessment with students who can articulate thoughtful analyses of texts in classroom discussion but struggle to express that analysis in writing.

Reflections and Implications


The overarching goal of this inquiry was to understand whether and to what
extent two preservice teachers’ understanding of writing as a skill, and of
their students as writers, would change through experience with dialogic
writing assessment. The changes we observed related to both conceptual
and practical tools. At the conceptual level, we found that Allie and Diler
changed their understanding of academic writing development, gaining
an understanding of the interrelatedness of writing and reading in shaping
how students progress as writers. At the beginning of her student teaching
placement, Allie thought that if students comprehended a literary work,
they would be able to write an essay about it, but this view was challenged
as she observed her students struggling with articulating their interpretive
understandings in the conventions of written language. Diler, for her part,
experienced a shift in her view of the importance of structure in writing.
Whereas at the beginning of her student teaching she considered mastery of
essay structure to be a foundational prerequisite for acquiring more sophis-
ticated writing skills, as she worked with the dialogic assessment method
she came to see structure as subordinate to the ability to articulate analytic
ideas about texts.
These changes in Diler’s and Allie’s understanding of writing provide
some support for Leung’s (2007) claim that in order to understand how dy-
namic assessment can be used to support teachers’ instructional decisions
and student learning, we need to understand how teachers conceptualize
learning in their subject. We also think it is important to understand how teachers conceptualize obstacles to learning and the tools at their disposal to
help students overcome those obstacles. Working with dialogic writing assess-
ment, Allie and Diler came to understand the impact of challenges related to
analytic interpretation and confidence and learned to deploy interactional
strategies that both made students’ challenges more visible and supported
students in working through those challenges. Both Allie and Diler came to
appreciate verbal encouragement as a practical tool, an affective version of
what Baker, Gersten, and Scanlon (2002) have called “procedural facilitators”
for student writing. They also came to realize that pairing encouragement
with requests for elaboration can help students develop their ideas further
because they have confidence that their ideas are worth developing.
By using such interactional tactics as encouragement, “saying back”
students’ ideas, requesting elaboration and clarification, and prompting
metacognitive reflection, Allie and Diler were able to learn more about their
students’ strengths and challenges as writers than they had been able to
learn from the students’ writing alone.
Diler learned that Alyssa and Beatrice were proficient in the organizational aspects of essay writing but needed help developing more sophisticated analyses, while Cory needed help with recognizing the quality of the analytic insights he had before he was open to developing them further. Allie learned to revise some of her assumptions about the students with whom she worked. Most significantly, the dialogic
assessment method revealed to her that (1) Jessica was eager for correction
of the grammatical errors she made as a result of interference from her first
language, and (2) Beverly had thoroughly read the novel and understood it
well, and the inference equation tool was actually the source of her difficul-
ties with writing about it.
The development of instructional responsiveness is crucial to growth
and learning as a teacher. We see dialogic writing assessment as a way for
teachers to practice flexibility and the ability to individualize teaching strat-
egies based on students’ unique needs. Diler’s and Allie’s experience with
this method illustrates how meaningful learning can occur when teachers
have the chance to practice this method. For example, Diler noted that by
the time she worked with the third student she felt less tied to the questions
she had prepared for the assessment and better able to tailor her questions
to surface the challenges that were unique to her third student, Cory. Even
over the relatively short time span of one week, with repeated practice, she
saw her own instructional responsiveness improve.
Working with formative assessments during student teaching experi-
ences can transform preservice teachers’ knowledge about students and
about instruction. Like the research that has followed preservice teachers
engaged in close analysis of students’ writing (Davenport, 2006; Roser et
al., 2014; Shin, 2006; Sipe, 2000), our project revealed how such close work
can deepen student teachers’ understanding of writing and the challenges
it entails. Most importantly, we think, it shows how close work with dialogic
writing assessment can help preservice teachers—and perhaps all teachers—
to assess the value of certain instructional approaches such as inference
equations and MEAL paragraphs in a precise and nuanced way.
One limitation of the inquiry we conducted together is that Allie
and Diler used dialogic writing assessment with only one kind of writing
assignment: literary analysis. While a high-priority genre in high schools
and arguably the most representative genre for academic writing in the
subject of English, it does not represent the full range of genres relevant to
English language arts. That said, dialogic writing assessment is a flexible
approach that is not genre-specific. While strategies such as “saying back”
and encouragement are likely applicable across a range of genres, others
such as “requesting elaboration” seem more specific to expository genres
such as argument, persuasion, and explanation. Sarah is engaged in a long-
term program of research on dialogic writing assessment that explores what
kinds of feedback strategies are most helpful to students, and under what
kinds of conditions. Another aspect of this program of research involves
helping teachers identify which students benefit the most from this method,
because, like most individualized assessments, dialogic writing assessment
is time-consuming and labor-intensive.
For Allie and Diler, awareness of the need for individualized instruction
and attention to the role of discrete skills in students’ literacy development
are longer-term lessons they learned from participating in this inquiry. Since
graduating from her teacher education program Allie has been working as a
10th-grade reading teacher in a charter school that is part of a charter net-
work, in which reading and writing are taught in separate classes. While she
still subscribes to the view that reading and writing development are inter-
twined, she supports the network’s decision to organize literacy instruction
in this way because it allows for targeted work on separate skills. While this
curricular configuration makes it more difficult for teachers to use writing
as a means to deepen understanding of texts, it does allow her to address reading challenges more precisely and recognize the strengths that students
demonstrate in conversational assessments of reading—something that she
sees as not possible when writing is the primary means of assessing reading.
Her work on dialogic writing assessment with Beverly was where she first
became aware of this limitation of teaching reading and writing together.
For Diler, who is now studying for a law degree with a focus on education policy, the experience of working with dialogic writing assessment has led her to see provision for individualized assessment and instruction as a key policy issue. Though no longer working in classrooms, she considers the realization that the writing process involves many subtle nuances, with which each student may struggle or excel, to be an enduring lesson from participating in this project.
our inquiry on Allie’s and Diler’s intellectual and professional development
suggests that intensive experience with formative writing assessment should
be a standard practice in English teacher education.

Notes
1. A note on acronym usage: ESL is the standard term for college-age second-
language writers; we use it to describe programs and classes as well. When referring
to K–12 students, we use the term English Learners.
2. The names of Allie’s and Diler’s schools, and of their students, are pseudonyms.

References
Anderson, L. H. (2000). Fever, 1793. New York: Simon & Schuster.
Atwell, N. (1998). In the middle: New understandings about writing, reading and
learning (2nd ed.). Portsmouth, NH: Heinemann.
Baker, S., Gersten, R., & Scanlon, D. (2002). Procedural facilitators and cognitive
strategies: Tools for unraveling the mysteries of comprehension and the writing
process, and for providing meaningful access to the general curriculum. Learn-
ing Disabilities Research & Practice, 17(1), 64–77.
Bakhtin, M. (1981). The dialogic imagination. (Trans. C. Emerson & M. Holquist).
Austin, TX: University of Texas Press.
Bass, J. A., & Chambless, M. (1994). Modeling in teacher education: The effects on
writing attitude. Action in Teacher Education, 16(2), 37–44.
Beck, S. W., Llosa, L., Black, K., & Trzeszkowski-Giese, A. (2015). Beyond the rubric.
Journal of Adolescent & Adult Literacy, 58(8), 670–681.
Bereiter, C., & Scardamalia, M. (1987). The psychology of written composition. Hills-
dale, NJ: Lawrence Erlbaum Associates.
Biancarosa, G., & Snow, C. E. (2006). Reading next: A vision for action and research
in middle and high school literacy—A report from Carnegie Corporation of New
York (2nd ed.). Washington, DC: Alliance for Excellent Education.

Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York: Carnegie Corporation of New York.
Davenport, N. A. M. (2006). Connecting preservice teachers with students: Using
email to build skills for teaching writing. Journal of Reading Education, 31(2),
13–19.
Dempsey, M., PytlikZillig, L. M., & Bruning, R. H. (2009). Helping preservice teach-
ers learn to assess writing: Practice and feedback in a Web-based environment.
Assessing Writing, 14(1), 38–61.
Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data.
Cambridge, MA: MIT Press.
Ewert, D. E. (2009). L2 writing conferences: Investigating teacher talk. Journal
of Second Language Writing, 18(4), 251–269. http://dx.doi.org/10.1016/
j.jslw.2009.06.002
Flower, L., & Hayes, J. (1981). A cognitive process theory of writing. College Compo-
sition and Communication, 32(4), 365–387.
Frey, N., Fisher, D., & Hernandez, T. (2003). What’s the gist? Summary writing for
struggling adolescent writers. Voices from the Middle, 11(2), 43–49.
Graham, P. (2005). Classroom-based assessment: Changing knowledge and practice
through preservice teacher education. Teaching and Teacher Education, 21,
607–621.
Graham, S., Harris, K., & Hebert, M. A. (2011). Informing writing: The benefits of
formative assessment. A Carnegie Corporation Time to Act report. Washington,
DC: Alliance for Excellent Education.
Graves, D. H. (1994). A fresh look at writing. Portsmouth, NH: Heinemann.
Grossman, P., Valencia, S., Evans, K., Thompson, C., Martin, S., & Place, N.
(2000). Transitions into teaching: Learning to teach writing in teacher
education and beyond. Journal of Literacy Research, 32(4), 631–662.
doi:10.1080/10862960009548098
Haywood, H. C., & Lidz, C. (2007). Dynamic assessment in practice: Clinical and
educational applications. New York: Cambridge University Press.
Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action:
A seamless process in formative assessment? Educational Measurement: Issues
and Practice, 28(3), 24–31.
Hillocks, G. (2002). The testing trap: How state writing assessments control learning.
New York: Teachers College Press.
Johnson, T. S., Thompson, L., Smagorinsky, P., & Fry, P. (2003). Learning to teach
the five-paragraph theme. Research in the Teaching of English, 38(2), 136–176.
Jones, J. (2014). Student teachers developing a critical understanding of formative
assessment in the modern foreign languages classroom on an initial teacher
education course. The Language Learning Journal, 42(3), 275–288. doi:10.1080/
09571736.2014.908941
Keen, J. (2005). Assessment for writing development: Trainee English teachers’
understanding of formative assessment. Teacher Development, 9(2), 237–253.

Kiuhara, S., Graham, S., & Hawken, L. (2009). Teaching writing to high school stu-
dents: A national survey. Journal of Educational Psychology, 101(1), 136–160.
Leung, C. (2007). Dynamic assessment: Assessment for and as teaching? Language
Assessment Quarterly, 4(3), 257–278.
Levithan, D. (2013). Every day. New York: Ember.
McIver, M., & Wolf, S. (1999). The power of the conference is the power of sugges-
tion. Language Arts, 77(1), 54–61.
Mertler, C. (2009). Teachers’ assessment knowledge and their perceptions of the
impact of classroom assessment professional development. Improving Schools,
12(2), 101–113.
Morgan, D. N., & Pytash, K. E. (2014). Preparing preservice teachers to become
teachers of writing: A 20-year review of the research literature. English Educa-
tion, 47(1), 1–6.
Morrison, J. A. (2005). Using science notebooks to promote preservice teachers’
understanding of formative assessment. Issues in Teacher Education, 14(1), 5–21.
National Writing Project & Nagin, C. (2003). Because writing matters: Improving
student writing in our schools. San Francisco: Jossey-Bass.
Norman, K. A., & Spencer, B. H. (2005). Our lives as writers: Examining preser-
vice teachers’ experiences and beliefs about the nature of writing and writing
instruction. Teacher Education Quarterly, 32(1), 25–40.
Nystrand, M. (1989). A social-interactive model of writing. Written Communication,
6(1), 66–85.
Popham, W. (2009). Assessment literacy for teachers: Faddish or fundamental?
Theory Into Practice, 48, 4–11.
Roser, N., Hoffman, J., Wetzel, M., Price-Dennis, D., Peterson, K., & Chamberlain,
K. (2014). Pull up a chair and listen to them write: Preservice teachers learn
from beginning writers. Journal of Early Childhood Teacher Education, 35(2),
150–167. doi:10.1080/10901027.2014.905807
Sable, L., Forbes, T., & Zangori, L. (2015). Promoting prospective elementary teach-
ers’ learning to use formative assessment for life science instruction. Journal of
Science Teacher Education, 26(4), 419–445.
Shepard, L. (2000). The role of assessment in a learning culture. Educational
Researcher, 29(7), 4–14.
Shin, S. (2006). Learning to teach writing through tutoring and journal
writing. Teachers and Teaching: Theory and Practice, 12(3), 325–345.
doi:10.1080/13450600500467621
Sipe, R. B. (2000). Virtually being there: Creating authentic experiences through
interactive exchanges. English Journal, 90(2), 104. doi:10.2307/821226
Sperling, M. (1991). Dialogues of deliberation: Conversation in the teacher-student
writing conference. Written Communication, 8(2), 131–162.
Street, C. (2003). Pre-service teachers’ attitudes about writing and learning to teach
writing: Implications for teacher educators. Teacher Education Quarterly, 30(3),
33–50.

Swain, S., & LeMahieu, P. (2012). Assessment in a culture of inquiry: The story of
the National Writing Project's analytic writing continuum. In N. Elliot & L. Perelman (Eds.), Writing assessment in the 21st century: Essays in honor of Edward M.
White (pp. 45–67). New York: Hampton Press.
Tremmel, R. (2001). Seeking a balanced discipline: Writing teacher education in
first-year composition and English education. English Education, 34, 6–30.
Volante, L., & Fazio, X. (2007). Exploring teacher candidates’ assessment literacy:
Implications for teacher education reform and professional development. Cana-
dian Journal of Education, 30(3), 749–770.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological
processes. Cambridge, MA: Harvard University Press.
Wallace, M., & White, T. (2014). Secondary mathematics preservice teachers’
assessment perspectives and practices: An evolutionary portrait. Mathematics
Teacher Education and Development, 16(2), 25–45.

Sarah W. Beck is an associate professor of English education at New York University and a former teacher of high
school English and college-level writing. An NCTE member
since 2001, she conducts research on writing instruction,
writing assessment, and the development of teacher
knowledge. She can be reached at sarah.beck@nyu.edu.

Diler Cavdar grew up in Queens, New York, attending New York City public schools. She majored in teaching English,
grades 7–12, and minored in creative writing at New York
University. Diler is currently enrolled at UC Berkeley
School of Law and can be contacted at dc2539@nyu.edu.

Allison Wahrman is a 10th-grade literature teacher at Democracy Prep Endurance High School, a charter school
located in East Harlem, New York. She studied English
education as an undergraduate at New York University. She
would like to continue her studies in the field of literacy
education. She can be contacted at avw231@nyu.edu.
