

Student Assessment Project

Joseph O. Schuster

EDU 325

Introduction

In this project, the Dynamic Indicators of Basic Early Literacy Skills (DIBELS)

test was used by a pre-service educator to practice utilizing Curriculum Based

Measurements (CBM) in order to assess student literacy. For the purposes of this

assessment, one elementary student was chosen for DIBELS administration: “Cooper.”

Cooper is a seven-year-old first grade student from a small public elementary school in

an urban, relatively low socioeconomic area. The DIBELS test was administered by the

pre-service educator, acting as the “assessor,” in two sessions over two separate days. The Beginning portion of the first grade DIBELS assessment was administered on Thursday, February 28, 2019, and the Middle and End portions were administered three weeks later on Thursday, March 21, 2019. The data resulting from the assessment served two main objectives: first, to identify two specific areas of literacy in which Cooper needed to grow and to determine a strategy to help him address each area; and second, to create a lesson plan that provides instruction on one of those skills using the chosen strategy.

Before the DIBELS assessment was administered, a background interview was

conducted with Cooper’s homeroom teacher, inquiring into Cooper’s strengths and

weaknesses, as well as his present levels of academic, social, and behavioral

performance. Based on this interview, the assessor learned that Cooper lives with his

Mom, Dad, and eight-year-old sister. Cooper’s parents want him to be able to read by the

end of the school year. Currently, Cooper’s grades are around average: he usually earns Cs and scores 70%-80% on tests in his reading and mathematics classes. In regard to his Present Level of Performance (PLOP), Cooper’s teacher reported

that he is at a 1st grade level in reading, spelling, writing, and math. Cooper has an IEP

for academics, with accommodations such as small group instruction, repeated and

simplified directions, directions read aloud, letter and word charts for reading instruction,

and number line charts for mathematics.

Academically, Cooper’s strengths include the ability to identify all familiar letter

sounds and to blend those sounds into words. Cooper’s teacher reported that with

repeated reading, Cooper can read a sentence. He can also add and subtract with digits up

to 20 with the use of manipulatives and a number line. One major area of academic

improvement for Cooper identified by his teacher is a need to practice sight word

recognition in the hopes of improving his reading fluency.

Behaviorally, Cooper has many strengths. His homeroom teacher says that

Cooper does not have many issues with behavior and therefore does not have a Behavior

Intervention Plan (BIP). His teacher describes him as kind, honest, and polite to peers as

well as adults. She says that he is a joy to have in class, and overall a very happy child.

She also says that Cooper is one of her favorite students, and “makes each day

enjoyable.” If there is any room for improvement in the behavior department, Cooper’s

teacher would say that sometimes he can be stubborn. In addition, Cooper has a very

short attention span.

Socially, Cooper is a very successful student. According to his teacher, he is

consistently kind and polite to peers. He is well liked by other students and has made

friends at school. He is friendly to teachers from other grades and classes, knows many

of their names, and will often wave to them or greet them by name in the hall. Cooper’s

teacher noted that he likes to have individual attention. He finds helping others

(especially adults) motivating. Finally, Cooper’s teacher reported that he usually keeps a

neat folder and desk area, and does not exhibit any deficiencies in organization or

decision-making.

Procedures

I chose to work with Cooper for the DIBELS assessment because I was Cooper’s

reading tutor for a previous field experience and had already established a positive

rapport with him. I reached out to the cooperating teacher from the past field experience

and asked if it would be all right if I came in to work with Cooper again and administer the

DIBELS assessment. The cooperating teacher was very happy to accommodate me and

quickly helped me set up a day and time that I could come into the school. The teacher

made sure that these days and times would be convenient for Cooper to be pulled out of

his regular classes for the assessment. Two twenty-minute sessions on two separate days

were blocked off, first for the administration of the Beginning portion of the assessment,

and second for both the Middle and End portions of the assessment to be administered

together.

On my first day at the school, I gave Cooper’s teacher a copy of background

questions about Cooper. We decided that she would keep this copy and take time to look

over the questions on her own. Then, she could share her thoughts with me on the second

day of the assessment. I then took Cooper to a large rectangular table with two chairs in

a quiet resource room next to the school computer lab. I placed Cooper’s chair at the

head of the table and arranged my chair to the right side of his for the best possible angle

for administering the DIBELS test. Before jumping into the assessment, I made sure to

begin the twenty-minute session by re-establishing my connection with Cooper through

friendly conversation. For example, I talked with Cooper about his recent birthday

celebration, his new “All-Star” shoes, and his favorite TV show, SpongeBob

SquarePants.

Soon, I transitioned into explaining the assessment to Cooper. I prefaced this

explanation with the thought that last semester, I had helped Cooper with his reading; this

semester, however, I needed his help to complete my college project by taking some short

tests. After Cooper agreed to help me, I told him that he would be taking three short tests

today and gave him a slip of paper with three circles on it, each one labeled “Test

1,” “Test 2,” or “Test 3.” I explained to Cooper that after he finished each short test, he

could choose a sticker to put in the center of the respective circle on the slip of paper to

keep track of his progress. I knew from my past field experience with Cooper that he has

a very short attention span and would require this additional motivation, despite the

probes being only 1 minute in length.

After this explanation I began the assessment, using the DIBELS Next First

Grade Student Materials packet, the DIBELS First Grade Scoring Booklet, and my phone’s stopwatch function to time the assessment. Although there were many times during the

assessment when Cooper attempted to ask a question or cause some kind of distraction, I

was able to redirect him by using the unlimited “Keep Going!” prompt within the

DIBELS assessment guidelines. When all three probes were completed, I used the

remaining time to draw a SpongeBob scene with Cooper and then to walk with him back

to his classroom. This process was repeated a few weeks later, only this time I

administered the Middle and End portions of the assessment. There were no other noteworthy differences between this second session and the first. However, Cooper’s

teacher was not present on the second day of the assessment, and so rather than

completing the background interview in person, she was happy to respond to the

questions via email.

After leaving the school, I finished filling out all portions of the scoring booklet

and analyzed the results of the assessments by manually calculating the DIBELS

composite score and comparing it to grade-level benchmarks. Based on these scores, I decided on the following two areas of intervention in reading: segmenting (or phoneme

segmentation fluency) and blending. A detailed rationale behind this decision and the

strategies that were chosen to accompany them are provided in the “Areas Targeted for

Improvement” section.

Assessments Given

At the DIBELS first grade level, four different assessments were

administered to Cooper: Letter Naming Fluency (LNF), Phoneme Segmentation Fluency

(PSF), Nonsense Word Fluency (NWF), and finally DIBELS Oral Reading Fluency

(DORF).

Letter Naming Fluency is a short, direct measure (often called a probe) of a

student’s automaticity in naming letters (Good, 2002, p. 48). The purpose of the LNF

probe is not to determine which specific letters the student knows or not, but rather to

measure their fluency (Good, 2002, p. 48). The assessment is administered by showing

the student a page of randomly listed uppercase and lowercase letters. The assessor asks

the student to name the letters one by one. The number of letters named correctly is

recorded. Although LNF is not considered a basic literacy skill, the data collected from

this assessment is important because it provides insight into the student’s reading

development and facilitates the achievement of other early literacy skills and subskills

(Good, 2002, p. 48). For example, letter name knowledge is highly connected to letter-

sound knowledge, and research indicates that knowing letter names provides multiple

benefits for acquiring letter-sound correspondence (Clemens, Lai, Burke, & Wu, 2017, p.

In addition, students usually learn letter names before sounds, and so letter-name knowledge gives

students a foundation upon which they can more easily build new literacy information

and skills (Clemens et al., 2017, p. 273). Finally, LNF serves as an “at-risk” indicator

and helps predict future reading achievement (Clemens et al., 2017, p. 273).

Phoneme Segmentation Fluency (PSF) is a measurement of phonemic awareness

(Good, 2002, p. 55). It assesses a student’s ability to segment a spoken word into its

component sound segments. The administrator reads a word aloud, and the student is

asked to say the sounds in the word. Correct sound segments are underlined and tallied

by the administrator, with partial credit given for partial segmentation (Good, 2002, p.

55). This assessment is important because research shows that phonemic awareness is a

very strong predictor of future reading achievement (Abbott, Walton, & Greenwood,

2002, p. 23). Some researchers say that phonemic awareness is an even more powerful

predictor of reading progress than IQ, and can predict levels of reading and spelling

achievement as many as eleven years in the future (Smith, 1998, p. 24). It is very

important to catch weaknesses or delays in phonemic awareness as early as possible in

order to provide timely intervention (Smith, 1998, p. 24).

Nonsense Word Fluency (NWF) measures both the alphabetic principle

(knowledge of basic letter-sound correspondence) and basic phonics (the ability to blend

letter sounds into consonant-vowel-consonant and vowel-consonant words) in students

(Good, 2002, p. 66). Nonsense words, also called “make-believe” words or “pseudo-

words,” are used for this probe so that a student does not rely on their word recognition

skills, but rather on their phonics skills (Good, 2002, p. 66). Using standardized

procedures, the student is given a sheet of randomly listed VC and CVC nonsense words

and is asked to do their best to read the words, either as whole words or by saying sounds

in the word (Good, 2002, p. 66). The assessor underlines each correct letter sound spoken

by the student, both those spoken in isolation and those blended together (Good, 2002, p.

66). The NWF probe is important because several studies have shown that there are

strong, positive relations between gains made in NWF and oral reading fluency (ORF)

and comprehension scores (Fien, Park, Baker, Smith, Stoolmiller, & Kame’enui, 2010, p. 635). In addition, the NWF probe is a direct measure of the student’s grasp of the alphabetic principle, which is a necessary skill for early literacy (Fien et al., 2010, p. 636).
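To make the CLS and WWR distinction concrete, here is a minimal scoring sketch in Python. It is illustrative only and is not the official DIBELS scoring procedure; the pseudo-word and the response shown are hypothetical.

```python
# Illustrative NWF-style scoring for a single nonsense word (not the
# official DIBELS procedure). CLS credits each correct letter sound,
# whether spoken in isolation or blended; WWR credits the word only
# when it is read correctly as a whole unit.

def score_nonsense_word(target_sounds, response_sounds, read_as_whole_word):
    cls = sum(1 for t, r in zip(target_sounds, response_sounds) if t == r)
    wwr = 1 if read_as_whole_word and cls == len(target_sounds) else 0
    return cls, wwr

# Hypothetical response to the pseudo-word "sim": all three sounds given
# correctly, but only in isolation, so CLS = 3 and WWR = 0.
print(score_nonsense_word(["s", "i", "m"], ["s", "i", "m"], False))  # (3, 0)
```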

DIBELS Oral Reading Fluency (DORF) measures many skills, including

accuracy and fluency of reading a connected text, word attack skills, advanced phonics

skills, and reading comprehension (Good, 2002, p. 79). The actual administration

consists of two parts: oral reading fluency and passage retell. In the first part, students are

given a grade-level text and asked to read for 1 minute, while the administrator notes all

errors (Good, 2002, p. 79). This process is repeated with two other reading passages and

the administrator calculates a median score of the three based on how many words were

read correctly (Good, 2002, p. 80). The administrator also rates the quality of the student’s

retell. This data is important because some students can speed-read without

comprehending what they are reading, and this tendency should be identified by

educators (Good, 2002, p. 80). DORF also provides information on the student’s oral

language skills and overall reading proficiency.
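As a concrete illustration of the scoring just described, the sketch below computes a words-correct score for each one-minute passage, the median across the three passages, and an accuracy percentage. This is a minimal sketch under the assumption that accuracy is words correct divided by words attempted; all of the counts shown are hypothetical, and the official DIBELS scoring materials remain the authority.

```python
# Minimal DORF-style scoring sketch (illustrative; not the official
# DIBELS scoring tool). All counts below are hypothetical.
from statistics import median

def words_correct(words_attempted, errors):
    # Words read correctly in the one-minute window for one passage.
    return max(words_attempted - errors, 0)

def dorf_score(passages):
    # Median words-correct across the passages administered.
    return int(median(words_correct(a, e) for a, e in passages))

def accuracy_percent(correct, errors):
    # Accuracy = words correct / words attempted * 100 (assumed definition).
    attempted = correct + errors
    return 0.0 if attempted == 0 else 100.0 * correct / attempted

passages = [(52, 7), (48, 10), (55, 6)]   # (attempted, errors) per passage
print(dorf_score(passages))               # 45
print(round(accuracy_percent(45, 7)))     # 87
```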

Results & Analysis

Cooper got an LNF score of 6. The LNF assessment is only administered at the

beginning of the year (between months 1-3) and does not have a benchmark. This is

because letter naming is not directly linked to one of the five core components of basic

literacy and is therefore not seen as a necessary skill for achieving desired reading

outcomes (Good, 2002, pp. 47-48). However, the LNF score is a component of the DIBELS

composite score for the beginning of the year, and therefore contributes to whether or not

the student makes benchmark. Two LNF response patterns were exhibited by Cooper

during administration. First, Cooper often said the letter sound instead of the letter

name, even after the one allotted “Try to say each letter name” prompt. Second, there

were several points during administration when Cooper became distracted or exhibited

minimal effort to stay on task. This may have affected the results, despite frequent

use of the unlimited “Keep going” prompt.

In the PSF assessment, Cooper achieved a score of 9. This puts Cooper well below benchmark and shows that he will need intensive support in this area. It also places him

below the cut point of risk, with only a 10-20% likelihood of achieving early literacy

goals in the future. This means he would be in Tier III for RTI. Besides exhibiting

attention difficulties, Cooper also showed a PSF response pattern of only saying the

initial sounds of words. For example, for the word “count,” Cooper only said the /k/

sound, even after being reminded to “Say all the sounds in the word” as the assessment

allows.
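The benchmark logic referenced throughout this section (benchmark, cut point of risk, and the corresponding RTI tier) can be summarized in a small sketch. The threshold values below are placeholders rather than the published DIBELS Next goals, which should be taken from the official scoring materials.

```python
# Sketch of the three-way DIBELS-style classification (illustrative only).
# Threshold values are placeholders, not the published DIBELS Next goals.

def classify(score, benchmark, cut_point):
    if score >= benchmark:
        return "At or above benchmark: core support (Tier I)"
    if score >= cut_point:
        return "Below benchmark: strategic support (Tier II)"
    return "Well below benchmark: intensive support (Tier III)"

PSF_BENCHMARK = 40   # placeholder goal
PSF_CUT_POINT = 25   # placeholder cut point of risk

print(classify(9, PSF_BENCHMARK, PSF_CUT_POINT))
# Well below benchmark: intensive support (Tier III)
```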

The NWF assessment was administered in three sections, one for the beginning,

middle, and end of the year. Cooper was not able to read any of the nonsense words as a

whole and therefore scored a 0 for the Whole Words Read (WWR) score for all three

sections of the year. However, his Correct Letter Sounds (CLS) score fluctuated. In the

beginning of the year Cooper received a CLS of 19 which put him below benchmark, but

still above the cut point for risk, and therefore likely to need strategic (rather than

intensive) support. However, in both the middle and end of the year assessments

(administered on the same day) Cooper got CLS scores of 7 and 23, respectively, putting him well below benchmark and in need of intensive support. Again, Cooper is below the

cut point of risk, with only a 10-20% likelihood of achieving early literacy goals. In

addition to attention difficulties, Cooper showed an NWF response pattern of saying the

correct sounds, but not recoding, even after being given the allotted “try to read the words

as whole words” prompt.

The DORF assessment was administered for both the middle and end of the year.

Cooper achieved DORF scores of 2 and 4, respectively, on the first passage. Because

he did not get more than 10 words correct in the first line, his assessment was

discontinued and he was not administered a retell or passages 2 and 3. These low scores

again put Cooper at well below benchmark and in need of intensive support (Tier III).

Added to his NWF score from the middle of the year, the DORF score of 2 gave Cooper

a middle of the year DORF Accuracy Percentage of 15%. This percentage converts to an

accuracy value of 0 and a composite score of 9. For the end of the year, Cooper got a

DORF accuracy percent of 21%, also converting to an accuracy value of 0 and a



composite score of 4. Both of these composite scores are well below benchmark and

signal Tier III intervention. See the graph below:

Graph 1. Cooper’s performance on DIBELS compared to benchmark averages and cut

points of risk for each assessment.

Areas Targeted for Improvement

The literacy areas of segmenting (PSF) and blending were chosen for Cooper to

improve upon. These two skills were chosen because, based on his DIBELS assessment scores, Cooper was well below benchmark in both of these categories. In addition,

segmenting and blending are considered the two most crucial skills for building solid

phonemic and phonological awareness (Tankersley, 2003). They are the building blocks

of reading. Therefore, in order for Cooper’s DORF score to significantly improve, it

would first be necessary for his foundational skills in both PSF and blending to improve.

Segmenting

Segmenting refers to the skill of breaking up the pronunciation of a word into

individual sounds, or phonemes (Slavin & Madden, 2005, p. 22). Therefore, this skill is

very closely related to phonemic awareness. Phonemic awareness can be defined as the

ability of a student to hear and manipulate phonemes in spoken words, and does not

directly require knowledge of written letters, letter names, or syllables (Reutzel & Cooter,

2012, p. 107). PSF helps to measure this skill in students, because the assessor does not

show the student any written words, but rather pronounces words and asks the student to

segment or divide the words into individual phonemes (Good, 2002, p. 55).

The strategy chosen to help Cooper improve upon this skill (and therefore obtain

a higher PSF score and become more proficient at related literacy skills) is called “Using

Your Body to Break Words Down,” as presented in the Success for All Foundation’s Tutoring Manual (Slavin & Madden, 2005, p. 136). This strategy, also

sometimes known as simply “Break-It-Down” or “Tap-It-Out,” involves students using

their hands, feet, pencil, or any rhythm-making object to segment (or break down) a word

into its individual phonemes (Slavin & Madden, 2005, p. 135-136). For example, if the

teacher speaks the word “man” aloud, a student could use their hands to clap as they

segment the word into its phonemes: “/m/ [clap], /a/ [clap], /n/ [clap]”. This strategy is

often successful because it adds a physical element to learning. It is also known as a very

flexible strategy, as students could use any form of percussion to practice it. For

example, students could clap, snap, pat, stomp, tap, or even use drums, xylophones, or

maracas (Tankersley, 2003). Student progress on segmenting can be easily monitored

both formally and informally using this strategy due to its visible and audible nature.

After two weeks of instruction and practice using this skill, Cooper would be given

another PSF probe and any improvement would be recorded on a graph.
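To illustrate the progress-monitoring step described above, here is a minimal sketch that plots probe scores over time against a goal line. All of the weeks, scores, and the benchmark value are hypothetical; in practice the scores would come from repeated PSF probes and the goal from the DIBELS benchmark tables.

```python
# Hypothetical progress-monitoring sketch: plot PSF probe scores over time
# against a placeholder goal line. All values are invented for illustration.
import matplotlib.pyplot as plt

weeks = [0, 2, 4, 6]            # weeks since the intervention began
psf_scores = [9, 14, 21, 28]    # hypothetical correct sound segments per minute
goal = 40                       # placeholder goal, not a published DIBELS value

plt.plot(weeks, psf_scores, marker="o", label="PSF probe score")
plt.axhline(goal, linestyle="--", label="Goal (placeholder)")
plt.xlabel("Weeks of intervention")
plt.ylabel("Correct sound segments per minute")
plt.title("Hypothetical PSF progress monitoring")
plt.legend()
plt.show()
```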

Blending

Blending, in contrast to segmenting, is more closely related to the larger, umbrella

skill of phonological awareness, which involves knowing written letters and manipulating

units of sounds larger than phonemes (Reutzel & Cooter, 2012, p. 107). Blending, unlike

phonemic awareness and segmenting, is not intuitive, and requires explicit explanation,

modeling, and guided practice by the instructor (Reutzel & Cooter, 2012, p. 156).

Blending can be defined as the ability to blend the sounds of letters in words into

approximate pronunciations (Reutzel & Cooter, 2012, p. 156). This area was picked for

improvement because Cooper needed to use this skill for DIBELS, especially in the NWF

assessment. When reading through the nonsense words, Cooper only pronounced the

individual, correct letter sounds (CLS) and was unable to blend the sounds together to

read the whole nonsense words (WWR).

The strategy chosen to help Cooper with blending is sequential blending coupled

with the use of word box graphic organizers. Sequential blending is a specific strategy in

which a reader scans each letter in a CVC word moving from left to right, says the sound

of each letter in sequence, and then says those sounds in sequence faster and faster until

they are blended together into a word (Reutzel & Cooter, 2012, p. 156). This strategy

requires explicit modeling and guided practice for students to be able to achieve this skill

on their own, and is made much easier with the use of a word box graphic organizer

(Reutzel & Cooter, 2012, p. 156). Word boxes are a strategy in which students place plastic

letters into divided sections of a drawn rectangle (boxes) as each sound is spoken

(Joseph, 2018, p. 307). The goal of this strategy is to help students with phonological

decoding, and research has shown that it has been effective, even for students with autism

(Joseph, 2018, p. 303). Part of the reason word boxes have proven effective is that they involve a multisensory process, requiring the student to hear the sounds, look at every letter that represents the sounds, note where the letters fall sequentially in a word, and arrange the letters into their respective positions by physically sliding them

into the boxes (Joseph, 2018, p. 304).

Therefore, Cooper would not only be taught how to sequentially blend through

direct instruction, but he would also be given the word box tool. Mastery of both aspects

of this strategy should greatly improve not only Cooper’s NWF scores, but also his

DORF reading scores, as he would be better able to sound out unfamiliar words. Cooper’s

progress would be monitored by re-administering the NWF test (or similar probes) and

graphing the results after a week or two of sequential blending instruction and practice

using the word boxes.

Conclusion

Overall, this project proved to be a success. Looking back, I feel satisfied with all of the components of the project that I completed in order to write this paper: getting to

know my student through a teacher interview, administering the DIBELS assessment,

scoring the assessment and interpreting the results, researching the importance of each

section that was administered, picking two skills for the student to improve upon, finding

two strategies to help with improvement, and finally creating a lesson plan for one of

these strategies.

I found two aspects of this project to be especially important to my conception of CBMs as I move forward as a pre-service educator. First, I

now fully understand the importance, or even the necessity, of administering CBMs to

students. Before I began this project, I sometimes considered the administration of strictly standardized, scripted assessments to be a waste of time. But after

administering one myself and doing research on what the DIBELS test measures and the

information it provides on the student, I now see how valuable the data collected from

such assessments can be, and how easy they can be to administer. I now think that

frequent use of CBMs such as DIBELS should be an essential part of any successful

classroom. The second aspect I found especially important is the realization that the

main purpose of CBMs is to inform instruction. Sure, it is important to use the data to

see and graph progress, but it ultimately needs to be used to make instruction more

effective. Although the lesson plan I created is theoretical, it allowed me to apply the

knowledge I gained from the data to the instructional setting and think about how I would

teach Cooper using what I learned from DIBELS.



Bibliography

Abbott, M., Walton, C., & Greenwood, C. R. (2002). Phonemic awareness in

kindergarten and first grade. Teaching Exceptional Children, 34(4), 20-26.

Clemens, N. H., Lai, M. H. C., Burke, M., & Wu, J. (2017). Interrelations of growth in

letter naming and sound fluency in kindergarten and implications for subsequent

reading fluency. School Psychology Review, 46(3), 272-287.

Fien, H., Park, Y., Baker, S. K., Smith, J. L. M., Stoolmiller, M., & Kame’enui, E. J.

(2010). An examination of the relation of nonsense word fluency initial status and

gains to reading outcomes for beginning readers. School Psychology Review, 39(4),

631-653.

Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic indicators of basic early literacy skills (6th ed.). Eugene, OR: Institute for the Development of Educational

Achievement.

Joseph, L. M. (2018). Effects of word boxes on phoneme segmentation, word

identification, and spelling for a sample of children with autism. Child Language

Teaching and Therapy, 34(3), 303-317.

Reutzel, D. R., & Cooter, R. B. Jr. (2012). Teaching children to read: The teacher makes

the difference (6th ed.). Boston, MA: Allyn and Bacon/Pearson.

Slavin, R. E., & Madden, N. A. (2005). Success for all tutoring manual (3rd ed.).

Baltimore, MD: Success for All Foundation.

Smith, C. R. (1998). From gibberish to phonemic awareness: Effective decoding

instruction. Teaching Exceptional Children, 30(6), 20-25.



Tankersley, K. (2003). Threads of reading: Strategies for literacy development.

Alexandria, VA: Association for Supervision and Curriculum Development.
