Learning to Teach Writing Through Dialogic Assessment
National Council of Teachers of English is collaborating with JSTOR to digitize, preserve and
extend access to English Education
Learning to teach writing is a complex process influenced by many factors. Formative assessment
holds promise as a place for preservice teachers to gain a better understanding of students’ unique
struggles as writers and of writing as a complex, challenging skill. The authors of this article
describe how working with a dialogic method of formative assessment gave two preservice teachers
unique insights about their students as writers and transformed their understanding of writing
development. We argue for the benefits of incorporating more experience with formative writing
assessment into the preservice education of English teachers.
what makes writing challenging (Sipe, 2000). In the latter case, opportunities
to ask supportive questions were considered by participants to be essential
in gaining this perspective.
Writing assessment is one context in which preservice teachers have
the opportunity to acquire deeper knowledge of students as writers, along
with the responsibility to learn how to make use of that knowledge. As we
discuss in the next section, best practices for educating preservice teachers
about assessment are an important and, according to some, understudied
aspect of teacher education.
individualized and dynamic way. From this wish grew the idea of dialogic
writing assessment (Beck et al., 2015).
Practitioners of formative assessment often ground their work in as-
sumptions that (1) formative assessment has tremendous power to enhance
student learning (Leung, 2007; Morrison 2005; Shepard, 2000); (2) forma-
tive assessment involves coordinating knowledge of content, knowledge
of what constitutes a learning progression, and knowledge of pedagogical
tools (Heritage et al., 2009); and (3) ability is not fixed but malleable and
responsive to instructional support (Shepard, 2000). This third assumption
may be especially important for developing skill in formative assessment of
writing, insofar as preservice teachers are inclined to view writing as a fixed
rather than a malleable skill (Norman & Spencer, 2005).
Dialogic writing assessment is a way for a teacher to witness the fluid
and dynamic nature of writing ability at close range, while simultaneously
supporting development of that ability. In this way it is akin to what is
known as dynamic assessment, an approach to assessment in which the
examiner deliberately attempts to teach the subject a skill that will
improve the subject’s performance (Haywood & Lidz, 2007), thereby
integrating assessment and instruction in the same activity. Central to
the practice of dynamic assessment is Vygotsky’s notion of the zone of
proximal development
(1978), defined as “the distance between the actual developmental level
as determined by independent problem solving and the level of potential
development as determined through problem-solving under adult guidance
or in collaboration with more capable peers” (p. 86). Dialogic writing as-
sessment has a similar conceptual foundation. Our use of the term dialogic
also invokes Bakhtin’s (1981) claim that all speech is dialogic, in that an
utterance reflects not just the person speaking but all of those to whom the
utterance is addressed; in this way written language is considered a social and
cultural construction (Bakhtin, 1981). As Nystrand (1989) argued, “writing,
like all language, is . . . inherently interactive and social” (p. 70). Dialogic
writing assessment, then, is a way of making space for the ordinary practices
of talking and listening in the teacher’s toolkit for assessing writing. Unlike
more typical formative English and language arts assessment tools such as
exit slips or rubrics, it also allows teachers to help students learn to situate
themselves authentically as writers.
To assess learning in a dynamic way requires teachers to imagine
targets for growth. Leung (2007) suggests as much when he asserts that in
Method of Inquiry
Research Participants and Sites
The idea for this project began when Allie and Diler were selected for par-
ticipation in an honors seminar for undergraduate students in a teacher
education program at a private university in a major metropolitan area of the
eastern United States. A requirement for enrollment in this seminar was that
students work with a faculty member on a mentored research project during
their senior year. Sarah was a faculty mentor for this research seminar, and
she selected Allie and Diler to work with because of their interest in writing
instruction and writing assessment. Sarah had developed the concept of dia-
logic writing assessment based on work with experienced certified teachers,
and she was interested in exploring how working with this method of writ-
ing assessment could shape preservice teachers’ knowledge of writing and
writing instruction. The mentored honors project took place over the whole
of Allie’s and Diler’s senior year: during the fall semester, they worked with
Sarah on coding data collected in other teachers’ classrooms and reviewing
published literature on writing instruction and formative assessment. In
the spring semester, they conducted their own dialogic writing assessment
sessions with students in their student teaching placements.
developing reading skills to support students’ analytic writing about texts, and
the unit in which Allie assumed major responsibility for teaching lessons was
focused on making inferences. One of the main strategies that Allie and her
CT used in this unit was something called an “inference equation,” using the
format of a mathematical addition formula: [quote] + [prior knowledge] =
[inference]. Using this formula, students selected a quote from the text, then
recorded the prior knowledge based on their real-life experience that helped
them interpret the quote, and finally composed an inference about what
the piece of evidence implied. The completed equations were intended
to serve as skeletons of body paragraphs in an argumentative essay. During
this unit the whole class read David Levithan’s (2013) Every Day and, after
finishing the book, students were assigned to write an essay in response to
the story. This was the first point in the year in which the whole class had
read a novel together; prior to this they had read novels independently and
short texts (stories and poems) as a class.
Data Collection
Our inquiry involved collecting several types of data: (1) an open-ended ques-
tionnaire that Diler and Allie completed regarding their beliefs about writing
and writing instruction, which was supplemented by conversations with
Sarah in weekly project meetings; (2) writing samples and audio-recorded
dialogic writing assessments with students in the classes for which Diler
and Allie assumed instructional responsibility during their student teach-
ing placements; and (3) Diler’s and Allie’s record-keeping sheets from these
dialogic writing assessment sessions.
dialogic assessment sessions was to draft the body paragraphs of their essays.
Prior to these sessions, the students had filled out a worksheet that broke
the task down into steps that included (1) formulating their own prompt—or
“essential question,” the answer to which would become their thesis state-
ment—and (2) collecting appropriate evidence from the class text. Students
were also provided with “inference equation” worksheets, described above,
to help them outline their body paragraphs.
Diler used dialogic writing assessment with three students. Two of
them, Alyssa and Beatrice, were typically academically successful,
while one, Cory, typically struggled more with his writing. As was
the case for Allie, the writing task that Diler set for her students’ dialogic
assessment was designed with the help of her CT. They created a writing
prompt based loosely on the task typically assigned for the statewide high
school exit exam in ELA. This prompt is depicted in Figure 2. The students
who had volunteered to participate would be retaking the state exam later in
the spring to improve their scores, so the dialogic assessment activity would
provide additional practice for them.
The passage Diler chose for the prompt was one that students had
recently read during their whole-class study of Romeo and Juliet, but follow-
ing her CT’s model for writing task design, she gave the three students the
passage, but not the prompt, ahead of time and told them to read it, write
what her CT referred to as “the gist”—a summary of what the passage was
about (Frey, Fisher, & Hernandez, 2003)—and annotate any writing strategies
they noticed. Diler hoped that the students would arrive at the session with a
strong grasp of the main idea so that they could use the dialogic assessment
session to delve deeply into the composing process.
Text-Analysis Response
Your Task: Closely read the following text (pages 4–5), which is an excerpt from Act I, Scene
I, of Romeo and Juliet. After you read, write a well-developed, text-based response of three
paragraphs. In your response, identify a central idea in the text and analyze how the au-
thor’s use of one writing strategy (literary element or literary technique or rhetorical device)
develops this central idea. Use strong and thorough evidence from the text to support your
analysis. Do not simply summarize the text. You may use the margins to take notes as you
read and scrap paper to plan your response. Write your response in the spaces provided on
the following pages.
Guidelines:
Be sure to:
*Identify a central idea in the text.
*Analyze how the author’s use of one writing strategy (literary element or literary
technique or rhetorical device) develops this central idea. Examples include: charac-
terization, conflict, denotation/connotation, metaphor, simile, irony, language use,
point-of-view, setting, structure, symbolism, theme, tone, etc.
*Use strong and thorough evidence from the text to support your analysis.
*Organize your ideas in a cohesive and coherent manner.
*Maintain a formal style of writing.
*Follow the conventions of standard written English.
While Diler’s task was a simulation of the kind of timed writing as-
sessment characteristic of high-stakes exams, Allie’s was a culmination of a
long process of planning and preparation. Related to this difference was the
process by which the students were expected to have arrived at the central
idea of their text. What the assignments shared was an emphasis on using
textual evidence to support claims and central ideas; however, for Allie the
focus was on literary meaning, while for Diler the focus was on literary
form. Allie’s students were expected to have generated a question that could
be applied to the text but also answered by someone who had not read the
text—called a “Level 3 Question” in the terminology of her classroom. Diler’s
students were to focus on identifying the writing strategies (sometimes also
called “literary elements”) that authors used to convey a central idea.
Skill Questions:
1. How are you going to start? What are you going to do first?
2. What is your essential question?
3. What is your claim?
4. What part or piece of the book are you going to use? Can you tell me what your evidence
is?
5. What are you going to say after your evidence?
6. What prior knowledge helped you to understand the quote/evidence?
7. How are you going to move on to the next paragraph?
Challenge Questions:
1. Are you having trouble starting/interpreting/understanding the task?
2. Are you sure that that is a level 3 question? Can what you’re asking be applied to the
book, but be answered by anyone?
3. Do you think that you are stating your claim clearly? What do you mean? Does your
thesis answer your question?
4. Do you think that your evidence supports or matches your claim? Could you maybe find
some stronger evidence?
5. Do you think you’re done discussing how the evidence relates to the claim? Do you
think your reader will understand how those two pieces of the pie connect?
6. Have you used your knowledge outside the text, your prior knowledge, to connect the
claim and evidence in your discussion?
7. Have you connected the last paragraph to this new one? What can you say to glue these
two paragraphs together?
Figure 3. Allie’s Questions
Data Analysis
The data that we analyzed for this article included Diler’s and Allie’s initial
questionnaires, their reflections on their responses to these questionnaires, the
audio-recorded and transcribed dialogic writing assessment sessions, and
their reflective notes on these sessions. Keeping in mind the central ques-
tion guiding our inquiry—What do preservice teachers learn about student
Beginning
What writing element/technique/device are you using? How did you figure out that you
wanted to use this device?
How was your process in deciding to choose this device?
Why did you choose this writing strategy? Did you just pick the one that was easiest? Or
the one you saw come up the most? The one that made the most sense?
[Ask the following questions if the student appears stuck] What do we have to do
with the strategy? Can we just mention it and be done? (need to define/describe the
strategy, examples, and how it explains central idea)
Middle
How are you using the text-based evidence to support your thoughts and ideas?
How did you choose your evidence?
Are you connecting multiple pieces of evidence to explain your idea?
As you write, are you finding anything you want to change or catching any mistakes?
Are you tempted to just continue with what you have or are you able to revise and change
what you have written?
As you write, do you find yourself checking back to reread the prompt?
IF THEY SEEM STUCK IN MIDDLE, OR PAUSE, OR STRAY FROM THE TASK, ASK: Do you
want to check the prompt again to remind you of your task?
TOWARD THE END, WHEN THEY DON’T KNOW OR SEEM TO FORGET WHAT TO DO: Do
you want to check the prompt again to see what you’ve done so far and see if there’s
anything else you still need to do?
If you’re struggling on how to explain something, try thinking out loud in Spanish and
then translate it to English.
Last Paragraph
Did you figure out any new thoughts to add to your central idea?
Any other analysis that came about from doing the writing, even after you figured out
your central idea and device?
Figure 4. Diler’s Questions
Findings
Our collaborative reflection and analysis of the various sources of data
converged around two central findings, one related to conceptual tools
and the other to practical tools. The conceptual tools had to do with Diler’s
and Allie’s shift toward interpretive engagement with a source text as a
fundamental component of academic writing at the high school level. The
practical tools involved their use of questions and prompts to help students
develop their writing by supporting this interpretive engagement. Most of
the shifts in Allie’s and Diler’s beliefs about writing development had to do
with the idea that increasingly sophisticated reading was intertwined with
writing development, as did most of the instructional ideas that they took
away from working with the dialogic assessment. They both (1) recognized
that interpretation was the site of most challenges for their students and (2)
figured out how to provide suggestions and feedback to support the students
in overcoming these challenges.
working with dialogic writing assessment she reflected that a bigger prob-
lem was that “Students have trouble with confidence or just understanding
the text well. If they don’t know what they’re writing about, they won’t be
able to write ‘in a coherent manner’ because that isn’t their priority.” Even
as she recognized that a sufficiently deep understanding of the text was a
prerequisite for strong writing, though, she also saw that dialogic writing
assessment could be a means to “bring about the revelation that students
know more than they think and have much to express.”
Just as the dialogic assessment experience changed Diler’s perspective
on the structural tool that her teaching context provided her, Allie’s experi-
ence with dialogic assessment changed her understanding of the tool that
her teaching context offered for supporting students’ deep analytical rea-
soning—the inference equation. Allie came to understand that the equation
and the writing task that was built around it were not sufficient to allow all
students to develop their ideas. The complexity of articulating interpretive
understanding became a conceptual lens through which she compared the
performance of the two students she worked with: Allie came to appreciate
the extent to which expressing interpretive understanding depends on a com-
mand of the grammar of written language, in the sense of having a range
of options for expressing analytic ideas in writing. Jessica, a second-language
writer, tended to make grammar errors that reflected interference
from her first language but rarely hampered her ability to express
herself; Beverly, by contrast, struggled with expression itself. Although Beverly
and Jessica could say equally insightful things about the book in conversa-
tion, Beverly had much greater difficulty expressing those interpretations in
writing. This led Allie to challenge one of her original beliefs about writing:
the idea that proficient reading would automatically produce better writing.
At the beginning of her student teaching placement she believed that “the
more you read, the better you write,” and that a student’s writing about
literature was an accurate, complete representation of their understanding
of a text. However, her experience with listening to students think aloud as
they composed led her to conclude that this was not necessarily true. In a
post-assessment reflection she lamented the “frustrating aspect of only us-
ing one’s writing to assess reading, is that adolescents might struggle with
writing, inhibiting you from seeing all of the wonderful reading that they
have done.” At the same time, she recognized that providing students with
strategic support during the writing process—specifically, support for link-
ing their prior knowledge to texts—could enable students to use writing to
reflect on their reading. After working with dialogic writing assessment, she,
like Diler, developed an awareness of the ways in which writing and reading
were both distinct and interrelated skills.
To the extent that writing and reading are interrelated skills, it can be
difficult for a teacher to discern which skill precedes the other. Writing
can facilitate reading, but a certain level of engagement with reading deeply
is a prerequisite for composing interpretive prose. When students are caught
between having read superficially, but with insufficient strategies to use
writing to deepen their understanding of a text, how can a teacher move
students past this point? In what
follows we present some examples of Diler’s and Allie’s interactions with
their students that suggest answers to this question.
Allie: So let’s go back to our essential question, “How would you han-
dle relationships if you were living an impossible life like A?” OK, so
what do you think is an issue with our essential question? What does
our essential question have to be? Remember our level 3 questions?
Beverly: A question about how your life can connect to the book.
Allie: So what do you think is happening right now with that question?
Do you think it is too much in [the novel] Every Day, and not
enough about our world around us?
Beverly: Yeah.
Allie used Beverly’s already drafted question —“How would you handle
relationships if you were living an impossible life like A?”—not only as a
starting point for refining the paper but also as a way to address Beverly’s
misunderstanding about what constituted a deep and original thesis state-
ment. By saying this question back to Beverly, and prompting her to recall the
requirements for a Level 3 question, Allie was able to reveal what her student
did not understand about the inference equation formula. This was a crucial
piece of information that she was not able to obtain only from studying the
equations they produced. Saying the question back and prompting recall of
the question requirements was also a way for Allie to compensate interaction-
ally for what she saw as a shortcoming in the inference equation formula.
Diler used a different strategy to support students in articulating a
sufficient understanding of the text’s gist and the strategies Shakespeare
used to convey that gist: she requested elaboration. For example, in her ses-
sion with Alyssa, the student was able to recall plot points that the passage
represented but was unsure about major ideas:
Alyssa: So—but—because I really don’t get the gist ‘cause if we talk about Benvolio
and the Montagues, we’re not gonna, uh, Romeo’s family right? But . . . I
mean and we have the prince and the things he does, so I don’t know what’s
the gist, if I’m writing about Romeo, or if I’m writing about the prince,
because I know that Benvolio and Montague are worried for Romeo, but
we see that the prince is like, you know—
Diler: How would you try to summarize all this? Or what do you
Diler: Yeah. So how did everyone seem to act in this passage, this
part of the play?
Alyssa: Seems to be mad and worried and sad at the same time.
Diler: OK.
Diler: So you were saying everyone is sad and mad and worried.
Alyssa: Oh yeah.
Alyssa: I forgot.
Diler did not finish her thought because Alyssa jumped in with an idea for
a gist statement:
Alyssa: So it can be like their actions, “What are the effects of the
results of your actions on the people around you?”
Diler: Good, write that down.
Cory: Weak.
Diler: Good.
Cory: And he’s crying, he’s hiding, and it’s also affecting him because
he’s hiding and separate from the real world, he’s only living
in his imaginary world and running away from his problems. Now,
how the hell am I going to write that?
Diler: That’s perfect actually. What you just said to me is, Romeo’s
new situation is affecting him and his family, because—what did you
say? How did you say that they see him?
Cory: Weak.
Listening to Cory talk through his struggles in composing, Diler realized that
his level of analysis and insight was beyond what Alyssa and Beatrice—two
students deemed more successful in the CT’s class—had demonstrated in
the same assignment. They had not discussed in class the effect of Romeo’s
behavior on his family, so this was a completely original idea. Cory also ar-
ticulated an original angle on Shakespeare’s use of literary devices when he
wrote in his essay that “The speaker uses third-point-of-view to show how the
different characters perspective can make a different reality of what’s really
happening.” Yet his flat tone and diffident manner led Diler to suspect that
he was not aware of how sophisticated his analysis was. In this interaction,
she attempted to both praise him (“good,” “that’s perfect”) and encourage
him to say more; her confidence-boosting praise functioned simultaneously
as feedback and as a scaffolding prompt. And indeed, following
this interaction he embarked on a sustained composing episode with few
pauses or digressions.
Interactions with the students in the dialogic assessment sessions
weren’t always successful, however. In contrast to Alyssa and Cory, Beatrice
came to the task with a sense of what she already wanted to write about:
her central idea, which was “fighting between the two families and how
it negatively affected Verona.” Diler used questions to prompt Beatrice’s
metacognitive reflection on her interpretive process:
Diler: How did you figure out that that was your central idea?
However, although Beatrice was able to explain why she settled on this
particular central idea and the literary device of metaphor, several turns
later she decided that metaphor was not a good fit for her central idea:
Beatrice: So, I can’t find like a metaphor in this because, these two
paragraphs are about the fighting, and the others, are about um,
Montague . . . Romeo. So I think I have to change the literary device.
After Diler prompted her to reread the speech by Prince Escalus, Bea-
trice decided to use monologue rather than metaphor as her literary device.
This had not been Diler’s intention—rather, she had guided Beatrice to this
passage because of its vivid metaphor (“quench the fire of your pernicious
rage/With purple fountains issuing from your veins”), which could have
been used to support her central idea of “fighting between the two families
and how that negatively affected Verona.” However, Beatrice did not notice
this metaphor, even with Diler’s prompting to look for something that was
“not literal.”
Although she wasn’t able to support Beatrice in fulfilling her original
plans, in reflecting on this interaction, Diler came to suspect that Beatrice’s
challenge in this writing task came about because she had chosen her central
idea without thinking about literary devices.
Diler: Did you get to read this at all [the passage] before? It’s OK if
you didn’t.
Cory: No, but I remember it.
Diler: [observing him marking the text] Good, that’s good, see you
found stuff already.
Cory: [reading the text] He’s been seen there many mornings, crying
tears that add drops to the morning dew and making a cloudy day
cloudier with his sighs. The morning dew—making a cloud, that’s—a
hyperbole or—something. He cannot just make a cloud out of his
tears.
Diler: Yeah, good, that’s a hyperbole because—
[Brief interval as he reads this passage in the text: But as soon as the
sun rises in the east, my sad son comes home to escape the light. He
locks himself up alone in his bedroom, shuts his windows to keep out
the beautiful daylight, and makes himself an artificial night.]
Diler: What are you thinking?
Cory: Hyperbole here because—you’re not going to run from the sun,
from the light. That’s impossible.
Jessica: Yeah.
Allie: Mmhmm. For sure. Yeah. You’re definitely making that very,
very clear. And that’s something I hadn’t really thought about before.
So that’s awesome, Jessica.
Allie reassures Jessica three times in this brief exchange: first about her
choice to overlook any inaccuracies in grammar for the moment and move
on with her writing, then about the accuracy of her inference that Rhiannon
is uncomfortable with Finn’s appearance, and finally about the originality
of her inference.
Instructional Takeaways
Using knowledge obtained from assessment to plan for subsequent instruc-
tion is an integral component of the formative assessment process. With this
in mind, we considered how what Diler and Allie learned from working with
dialogic assessment would affect their future work in the classroom, either
with these students or with others. Diler’s main instructional conclusion
had to do with literary elements: while these were essential to the writing
task she had given—and to high school ELA writing tasks generally—her
students varied in their ability to use them to develop their analysis.
Beatrice was limited in her options for finding meaning in the text, within
the parameters defined by the writing task, because metaphor was the only
device she understood well, while Cory was adept at invoking literary de-
vices but did not want to foreground them in his analysis. This led Diler to
think that it would be a good instructional tactic for students like Beatrice
and Alyssa to do some informal writing focused on the literary elements in
a text prior to composing a formal essay, to give her a baseline sense of how
much help students needed with this aspect of analysis before developing
their ideas in a full-length essay. For students like Cory, on the other hand,
a better approach might be to first write down their general thoughts and
emerging ideas on a text, and then encourage them to elaborate on these
ideas through discussion of literary elements. She also realized quickly
that each of her three students approached the task from a unique starting
point: whereas Alyssa came to the task unsure about the gist of the passage,
Beatrice had already settled on an idea, while Cory had not even read the
task beforehand (as she had asked the students to do). This required her
to use different questions from her preplanned list (see Figure 4); it also
heightened her awareness of how important it is to individualize support
and feedback for student writers.
A consequence of Allie’s and Diler’s new understanding of the role of
interpretive reading in writing development was that the practical tools they
saw as helpful to students had more to do with reading than with writing—for
instance, prior knowledge, the inference equation, and literary elements.
Diler and Allie saw their perspective on these tools evolve as a consequence
of using the dialogic assessment method. While Diler came to recognize that
students needed different approaches to working with literary elements to
explore meaning in a text, Allie came to question the effectiveness of the
inference equation as a pedagogical tool for some students, once she real-
ized how much help Beverly needed in working with it. She also began to
doubt whether the conceptual tool of “prior knowledge” was sufficient as
a construct to facilitate her own—and her students’—understanding of the
nature of literary interpretation. After their dialogic assessment session Allie
realized that Beverly had “lots of difficulty” and “needed a lot of mediation in
accessing [and] generating prior knowledge.” She also reflected that “prior
knowledge is a pretty vague and abstract name,” and she admitted that when
she first started her student teaching placement at that school she needed
to have several conversations with other teachers to really understand the
concept. Were this her own classroom, she reflected, she would construct a
different kind of written scaffold to support students’ literary interpretation.
Another category of implications for instructional practice has to do
with refinements to the use of dialogic assessment. Diler realized, after work-
ing with three different students who each had different needs, that not all
of the questions she anticipated being useful (Figure 4) would have been
useful for all of the students, and that the sessions were more useful when
she used questions selectively, in an individualized way. Allie was particu-
larly interested in which kinds of students would benefit most from dialogic
assessment, recognizing that it would be unfeasible to use with all students
regularly. Reflecting on how she attributed her own writing development to
extensive coaching and support from her parents throughout her schooling,
including into her undergraduate years, Allie thought that dialogic assess-
ment could be especially useful as a form of support for students who do not
have access to guidance or mentorship around academic literacy in their
homes or communities. After graduation, Allie took a full-time teaching posi-
tion as a 10th-grade reading teacher at an urban charter school. After Allie
began working at this school, she reflected that she would most likely use it
with students who can articulate thoughtful analyses of texts in classroom
discussions but struggle to articulate this analysis in writing.
to surface the challenges that were unique to her third student, Cory. Even
over the relatively short time span of one week, with repeated practice, she
saw her own instructional responsiveness improve.
Working with formative assessments during student teaching experiences can transform preservice teachers’ knowledge about students and
about instruction. Like the research that has followed preservice teachers
engaged in close analysis of students’ writing (Davenport, 2006; Roser et
al., 2014; Shin, 2006; Sipe, 2000), our project revealed how such close work
can deepen student teachers’ understanding of writing and the challenges
it entails. Most importantly, we think, it shows how close work with dialogic
writing assessment can help preservice teachers—and perhaps all teachers—
to assess the value of certain instructional approaches such as inference
equations and MEAL paragraphs in a precise and nuanced way.
One limitation of the inquiry we conducted together is that Allie
and Diler used dialogic writing assessment with only one kind of writing
assignment: literary analysis. While literary analysis is a high-priority genre in high schools, and arguably the most representative genre of academic writing in English, it does not reflect the full range of genres relevant to
English language arts. That said, dialogic writing assessment is a flexible
approach that is not genre-specific. While strategies such as “saying back”
and encouragement are likely applicable across a range of genres, others
such as “requesting elaboration” seem more specific to expository genres
such as argument, persuasion, and explanation. Sarah is engaged in a long-
term program of research on dialogic writing assessment that explores what
kinds of feedback strategies are most helpful to students, and under what
kinds of conditions. Another aspect of this program of research involves
helping teachers identify which students benefit the most from this method,
because, like most individualized assessments, dialogic writing assessment
is time-consuming and labor intensive.
For Allie and Diler, awareness of the need for individualized instruction and attention to the role of discrete skills in students’ literacy development are longer-term lessons they learned from participating in this inquiry. Since
graduating from her teacher education program Allie has been working as a
10th-grade reading teacher in a charter school that is part of a charter network, in which reading and writing are taught in separate classes. While she
still subscribes to the view that reading and writing development are inter-
twined, she supports the network’s decision to organize literacy instruction
in this way because it allows for targeted work on separate skills. While this
curricular configuration makes it more difficult for teachers to use writing
Notes
1. A note on acronym usage: ESL is the standard term for college-age second-
language writers; we use it to describe programs and classes as well. When referring
to K–12 students, we use the term English Learners.
2. The names of Allie’s and Diler’s schools, and of their students, are pseudonyms.
References
Anderson, L. H. (2000). Fever, 1793. New York: Simon & Schuster.
Atwell, N. (1998). In the middle: New understandings about writing, reading and learning (2nd ed.). Portsmouth, NH: Heinemann.
Baker, S., Gersten, R., & Scanlon, D. (2002). Procedural facilitators and cognitive strategies: Tools for unraveling the mysteries of comprehension and the writing process, and for providing meaningful access to the general curriculum. Learning Disabilities Research & Practice, 17(1), 64–77.
Bakhtin, M. (1981). The dialogic imagination (C. Emerson & M. Holquist, Trans.). Austin, TX: University of Texas Press.
Bass, J. A., & Chambless, M. (1994). Modeling in teacher education: The effects on writing attitude. Action in Teacher Education, 16(2), 37–44.
Beck, S. W., Llosa, L., Black, K., & Trzeszkowski-Giese, A. (2015). Beyond the rubric. Journal of Adolescent & Adult Literacy, 58(8), 670–681.
Bereiter, C., & Scardamalia, M. (1987). The psychology of written composition. Hillsdale, NJ: Lawrence Erlbaum Associates.
Biancarosa, G., & Snow, C. E. (2006). Reading next: A vision for action and research in middle and high school literacy—A report from Carnegie Corporation of New York (2nd ed.). Washington, DC: Alliance for Excellent Education.
Kiuhara, S., Graham, S., & Hawken, L. (2009). Teaching writing to high school students: A national survey. Journal of Educational Psychology, 101(1), 136–160.
Leung, C. (2007). Dynamic assessment: Assessment for and as teaching? Language Assessment Quarterly, 4(3), 257–278.
Levithan, D. (2013). Every day. New York: Ember.
McIver, M., & Wolf, S. (1999). The power of the conference is the power of suggestion. Language Arts, 77(1), 54–61.
Mertler, C. (2009). Teachers’ assessment knowledge and their perceptions of the impact of classroom assessment professional development. Improving Schools, 12(2), 101–113.
Morgan, D. N., & Pytash, K. E. (2014). Preparing preservice teachers to become teachers of writing: A 20-year review of the research literature. English Education, 47(1), 1–6.
Morrison, J. A. (2005). Using science notebooks to promote preservice teachers’ understanding of formative assessment. Issues in Teacher Education, 14(1), 5–21.
National Writing Project & Nagin, C. (2003). Because writing matters: Improving student writing in our schools. San Francisco: Jossey-Bass.
Norman, K. A., & Spencer, B. H. (2005). Our lives as writers: Examining preservice teachers’ experiences and beliefs about the nature of writing and writing instruction. Teacher Education Quarterly, 32(1), 25–40.
Nystrand, M. (1989). A social-interactive model of writing. Written Communication, 6(1), 66–85.
Popham, W. (2009). Assessment literacy for teachers: Faddish or fundamental? Theory Into Practice, 48, 4–11.
Roser, N., Hoffman, J., Wetzel, M., Price-Dennis, D., Peterson, K., & Chamberlain, K. (2014). Pull up a chair and listen to them write: Preservice teachers learn from beginning writers. Journal of Early Childhood Teacher Education, 35(2), 150–167. doi:10.1080/10901027.2014.905807
Sable, L., Forbes, T., & Zangori, L. (2015). Promoting prospective elementary teachers’ learning to use formative assessment for life science instruction. Journal of Science Teacher Education, 26(4), 419–445.
Shepard, L. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4–14.
Shin, S. (2006). Learning to teach writing through tutoring and journal writing. Teachers and Teaching: Theory and Practice, 12(3), 325–345. doi:10.1080/13450600500467621
Sipe, R. B. (2000). Virtually being there: Creating authentic experiences through interactive exchanges. English Journal, 90(2), 104. doi:10.2307/821226
Sperling, M. (1991). Dialogues of deliberation: Conversation in the teacher-student writing conference. Written Communication, 8(2), 131–162.
Street, C. (2003). Pre-service teachers’ attitudes about writing and learning to teach writing: Implications for teacher educators. Teacher Education Quarterly, 30(3), 33–50.
Swain, S., & LeMahieu, P. (2012). Assessment in a culture of inquiry: The story of the National Writing Project’s analytic writing continuum. In N. Elliot & L. Perelman (Eds.), Writing assessment in the 21st century: Essays in honor of Edward M. White (pp. 45–67). New York: Hampton Press.
Tremmel, R. (2001). Seeking a balanced discipline: Writing teacher education in first-year composition and English education. English Education, 34, 6–30.
Volante, L., & Fazio, X. (2007). Exploring teacher candidates’ assessment literacy: Implications for teacher education reform and professional development. Canadian Journal of Education, 30(3), 749–770.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Wallace, M., & White, T. (2014). Secondary mathematics preservice teachers’ assessment perspectives and practices: An evolutionary portrait. Mathematics Teacher Education and Development, 16(2), 25–45.