MAP June 2010 Teacher Comments

MAP in Seattle Schools June 2010

What would help you build on the concepts you have learned so far
related to student growth?
1. Getting rid of the MAP test would help me increase student growth.
2. Need more training to understand the various reports.
3. How the scores on the MAP relate/correlate to the MSP
4. Please address the issue of accomplished students not seeing the same growth or a more
limited range if they are assigned to the math continuum recommended by the district.
5. I would like student growth based on strands to be easy to interpret. I had to do it manually.
6. If students went up from fall to winter then down again in the spring, I would like to see them
credited for the gains they made looking at all three scores not just the spring score. Some
kids have a tough day or test fatigue by the 3rd test and show negative gains although they
did show growth in the winter score.
7. A greater understanding of how MAP aligns with state standards and Everyday Math
8. Our curriculum does not dovetail with the concepts being evaluated on the MAP. We need
more intentional instruction of grammar, Latin roots, literary elements and devices, etc.
9. Strategies and materials to help individual students with personal growth and goals.
10. I feel like I need more support from the district with curriculum resources to address specific
strands of MAP. What's available to help kids practice certain skills (homework)?
11. Access to district-created summaries, reports, and details.
12. Long term real student growth has nothing to do with MAP
13. To have access to the data so we can slice and dice the data ourselves
14. Discussion regarding changing instruction to meet the needs of kids. I understand this, but the
staff is challenged by this concept.
15. The MAP testing with assistance does not go to the depth of the other children's reports. It
just shows number sense, which does not correlate with going further into analyzing the data,
something the regular tests do for non-assisted children.
16. how to account for data abnormalities.
17. More detailed information about what the results mean.
18. Finding students' weaknesses and strengths in detail helps me to adjust my instruction.
19. It is very hard to use the data from my kindergarteners to inform instruction because fall,
winter and for most kids spring, the data truly only showed me their ability to use a computer.
The results only accurately represented (when compared to other assessments used) 5-10%
of my class and only with their spring results.
20. The message from the MAP Trainers to teachers has been that the MAP scores will not be a
"high-stakes" test. However, it is being used for such items as the Advanced Learning
eligibility. Can we please change how we are messaging this to teachers? Just need a
consistent message.
21. The problem is that the data is presented by the only person trained (and she is a math
person). She showed me the scores, but cannot interpret them. I need effective training.
22. More training about what "typical growth" means,

info on which other SPS and national schools participate, i.e., where are the "typical" students
coming from?
23. I think I have a strong understanding of student growth and achievement
24. reports don't correlate to grade-level skills
25. My students were too low to score significantly on this test. The test results were not that
helpful.
26. Student growth is not showing - it's going in the opposite direction.
27. Access to district-created reports.
28. I haven't seen correlation/causal relationships between my effort and student growth.
Sometimes I see the inverse occurring.
29. not applicable
30. This year was hard considering students who were allowed accommodations didn't receive
them until spring in my school, so of course there was a huge jump in scores. It was also hard
because there is no way for us to see any answers, and for many questions I observed in 3rd
grade I wasn't even sure what the answer was.
31. dump map testing and use Iowa Tests
32. These district-created summaries should be very helpful and are most welcome, but seem
expensive for each district to produce. Is NWEA being encouraged to provide these? As they
are new and are not easily accessible for every teacher in a central site (Principal gatekeeper),
easy access is currently a problem. Is there a server that can be accessed by every teacher in
every building via a password for their own reports?

Also, an understanding of winter scores. They seem to be a bit wild and not as accurate as Fall
and Spring. Also, if students guess accurately, it can bump their scores up by 10-15 points;
subsequent tests then reflect reduced scores, which makes the scoring seem less accurate.
NWEA needs to figure this out better. One of the college boards gives a partial negative point
for missed items and no deduction for a question left blank. Maybe NWEA needs to go to this
kind of scoring.
33. It would help if the district could send someone able to explain the data to teachers, to model
for us what the letters and numbers mean, so that we can begin to wrap our heads around this
test. Until then, it will remain vague and incomprehensible to teachers, students and parents
alike.
34. It would be informative to have access to data from students who are not in my class so that I
can make direct comparisons between students in regular math, those in advanced math, and
those in math with support. It would help me determine if each student is placed appropriately.
35. I said "neutral" above because I don't think I saw those reports.
36. more information
37. We did not do the fall MAP in K. We only did the winter, and the growth comparison was not
shown for this. So that was not helpful.
38. training

time to collaborate with other staff


39. the website could desperately use a designer to help with user-friendliness as well as overall
aesthetics =)
40. To understand how MAP relates to state standards and district-mandated curriculum.
41. I need to spend more time and have some of the data clarified.
42. The reports you talk about above were not created by the district. We had to create them and
then we did not have much time to use the data to inform instruction. The class strand would
have been very helpful.
43. Not having the range for the subcomponents. Being able to compare MAP data with other data
on kids.
44. questions shown to teacher based on range
45. MAP Reading scores for Deaf and Hard-of-Hearing students do not appear to be accurate
46. Access to the test results and more training on how to interpret the results.
47. More training.
48. I feel this is a biased test and the district is basing too much on this data.
49. more clarity about vocabulary

data that compares to standard, not to district wide growth


50. Growth data is the most crucial. We get that from district reports.
51. To show ALL students in a class's progress at once for all three tests taken
52. n/a
53. none
54. Recognition by the district that not all growth is shown on a multiple choice exam.
55. To move the final testing window to June so that we can use the data suite provided by the
company.
56. I would like to know which EALRs are addressed in each question.
57. We are finding that our advanced learners across the board did not meet typical growth
because they scored so high in the fall and it looks like they did not learn anything--even
showed negative growth.
58. You cannot define "student growth". It is different for each child at every grade and age level. If
you believe that you can evaluate a TEACHER based on the results of the MAP test, or any
other test, you are dead wrong. I have 30 years teaching experience and know what I am
talking about. Please do not think that the MAP test is any sort of "administrative savior" that
will make teachers better at what they do. You need to change the fundamentals of learning
and teaching in order to help students learn better and more.
59. na
60. What other instructional interventions/ changes could be recommended for specific ability
groups.
61. This has not helped me at all as related to student growth.
62. A uniform set of materials, information and speaking points to use when discussing data with
families, esp. how their child's score is related to district averages, and what to do when they
are below average.
63. Can we compare students and grades in graph format to show teachers trends?
64. I don't think MAP has given an accurate picture of some of my older students with autism.
65. More information and easier access to the many subtests in the primary reading and math
66. I would like to know HOW the concepts are tested. How is writing assessed when there is no
writing? How is reading assessed when there is no reading? If the test does not test what we
teach, then it won't be an accurate measure of student growth. My students' growth on the
MAP does not always correlate to their growth in class. This is frustrating to me.
67. It would be incredibly helpful to have more information on MAP -- to see and have the
opportunity to take an actual MAP test. As it stands, I have never laid eyes on the test, know
very little about it, and have been given virtually no PD on how to make it relevant for my
students and my practice.
68. write the concepts in parent-student friendly language.
69. ?
70. District-funded release time to analyze reports and plan differentiated instruction. Tailored
professional development about differentiated instruction at the high school level.
71. We need concrete info to give parents - they should also be trained on how to read this data. I
have no idea how to explain these results, and I received almost no usable results due to
inconsistent and/or incomplete testing of my class.
72. Having reports customized to my specific students. I see students from every class in the
school. It is VERY time consuming to have to go through each teacher's reports to find my
students. I'd like to be able to set up my groups once a year and print custom reports for the
students I see.
73. I teach math (CMP2) and would appreciate support in using student MAP data to differentiate
our curriculum to meet individual student needs. This needs to be from someone who has a
strong understanding of the CMP2 curriculum and of the MAP strand data. Thanks
74. Accessing just the bilingual student data
75. An understanding of WHY if student growth is listed as 2 or 3 points and the "standard
deviation" is 2 or 3 points both up and down - how can that be called growth?
76. having copies of similar questions on the MAP test, specific examples; the strands are too
general
77. It would be helpful to see specific questions that the majority of the class missed.

Also, it would be more efficient to have the students' scores assigned a grade level along with
the RIT and lexile. It is more meaningful to the families.
78. show it in a chart form or graph
79. MAP is a waste of time
80. we need to be able to use the big fold out thing which we were not given by the district for
every teacher; only the MAP personnel rec'd those
81. I still need to understand what skills align to what RIT range, and how that correlates to grade
level.
82. To learn about typical growth for special education, ELL students and other special
populations.
83. The data is too confusing to easily interpret for parents and even for teachers when trying to
organize strand data.
84. simpler explanation/model of growth reports - it's very statistical and complicated
85. Information on the student goal process. Is there going to be a larger focus on creating goals
with students using the NWEA student goal worksheets?
86. Taking skills/strand data - looking at it through the lens of "next steps" for this child
87. n/a
88. Student Growth is a meaningless concept when one cannot articulate for students how to
raise their RIT scores. The alignment of skills to scores is quite fuzzy. The RIT scores do not
indicate which skills students are lacking.
89. more about the connection between the curriculum calendar (math) and MAP, more on how to
use MAP to inform instruction
90. Knowing it wasn't being tied to my evaluation.
91. What can I do to improve reading scores and how can I find resources that meet my students
where they are in reading so that I know that they are getting the concepts?
92. Giving more time and assistance to teachers to teach and making classroom sizes smaller
93. I need more inservice on interpreting MAP data. We had a brief training and then were told to
go print and look at our data. Our school just started using MAP in the spring, so perhaps
other schools have had more training.
94. Protocols for student goal setting
95. I think having the time to plan lessons targeted at strand areas to see if my instruction helps
improve students' scores. As of now, students' scores may have improved, but I cannot point
to specific instructional practices that affected specific strand components. I think once I am
able to do this and share it with students and families, students will have more ownership in
their learning. They have to see the results as related to specific practice.
96. how levels relate to the report card, aligned to curriculum, etc.
97. I do not have access to MAP scores for students I work with (I am a related service provider -
SLP) - and feel they could be VERY useful if I did have access.
98. Agreed upon lesson ideas for particular strands according to MAP, like Text Components, and
Reading: Multiple Purposes
99. how to connect the results to the real world. How to connect the results to curriculum and
instruction.
100. I think just taking the test more times will allow teachers to better understand the growth data-
more practice.
101. Now, how to use the strands and relate them with our GLE's.
102. It would help me to understand how putting advanced curriculum in the tests is actually
helping my advanced students. It is not a test of their ability (reading comp), but a test to
reflect the curriculum way beyond their level.
103. I would like to know how to read the data for ELL students.
104. Being able to explain (simply) what a RIT score is and how it is determined.
105. I would like to learn how to print out the individual reading lists for each student.
106. We didn't do the fall assessment so that made it so there was no measurable data. In K I am
wondering if we could also see growth winter to spring, the reason for this is that in October
many of their computer skills are low, and it may not accurately reflect their levels. By winter
they have been more exposed to computers and are able to perform at a higher level. I do
wish we could have data that showed growth during these four months (winter-spring) for
kinder students.
107. Having a bit more time to think about how I can apply what I've learned, and having more time
to discuss findings with my colleagues. I don't think we have enough time to think about what
the MAP data means in terms of day to day teaching practices/principles and/or pedagogy.
108. None.
109. reports for kindergarten children
110. I would like a bit more about how to interpret extreme growth (15 points or more). Can I
correlate growth with specific areas of mathematical learning? How?
111. strategies for working with students of different abilities who need to target the same strand in
their work.
112. Having some ready made materials in order to work on certain skill sets that are needed for
students to work on as homework. I target them in class for these skills but it would be nice to
have some ready made items to send home for parents to help their child too.
113. More trainings; more district-provided teaching curriculum that is aligned to what students are
being tested on; release time with subs for the classroom so that teachers can get together to
discuss results and instructional strategies.
114. Find specific areas for SDI
115. Timely results
116. to see the combination of HSPE/IOWA/MAP/SRI and any other scores all on the student's
source page. As all tests have weak areas, seeing at least 3-5 different assessments on one
page would inform decisions better than depending on just one test to show a snapshot of a
student's learning.
117. NA
118. Knowing what the actual questions are and what area they each relate to. Access to these
would be incredibly helpful.
119. Include all the other standardized tests that students have taken
120. a curriculum that is tied to the MAP assessments
121. Haven't had students use them yet. Special Ed
122. Real time to analyze the data
123. I would like to know how to use this information to support my mentorship students in a
realistic/applicable way. I would like to know how to take information I've received and why it
is relevant / how to make it relevant for my Spanish language classes. If it's just to inform me
that students are average or not in reading/math that's not really relevant. I can already tell
their strengths by the assessments I use in class. How should this data inform my practice?
124. I teach seniors who have no MAP data. It would help to have students with MAP data.
125. Watch data over time. Small group discussions.
126. I am still learning and don't have a suggestion yet.
127. N/A
128. A better understanding of how the questions that are asked reflect what is actually taught.
Many of the students had questions that are not part of the Essential learnings of my grade
level
129. I have a special education 4B classroom. Only one or two students are able to participate. I'd
like to know how my students compare to other 4B students.
130. Continued training
131. I would like more support in understanding how much of the test scores can be impacted by
test issues - for example, how much of a student's score is considered when they have little
computer experience.
132. Have them apply to the subject I teach.
133. Learn more about how our curriculum relates to what is on the test--
134. More strand data comparisons with other schools.
135. My students took different tests in Fall/Winter and Spring. I am unable to see growth patterns
in my results.
136. How to incorporate all the things students need in a particular strand when that strand is a
"weakness"
137. Knowing what was on the test and how the test works
138. I would like more TIME to look at things with my grade-level team
139. how is the RIT target determined? I need to understand this to know how to interpret it.
140. Item analysis of tests so I know exactly what skills each student needs to work on.
141. I still do not really understand this!
142. I would like a better understanding of the validity of MAP scores for Special Ed. students who
showed negative "growth" according to MAP scores, but who showed very positive results with
another computer generated test (Read 180 SRI test).
143. I am still really confused how to get to the kindergarten data, and then how to read the data to
help me inform instruction.
144. Have professional development time to work on the data.
145. More time to analyze the data to inform instruction
146. I need to delve deeper into the strands my students are struggling in. This is a HUGE task,
which I'm committed to but need time and guidance to do so. Please keep in mind that I'm still
learning about writer's workshop, reader's workshop and EDM -AMONG OTHER THINGS.
This is a lot of learning to do!
147. Line graph showing typical growth for AT GRADE LEVEL student vs typical growth for below
grade level, etc. - a visual to compare what type of growth is needed in lower performing to
meet their peers
148. More discussion about relating MAP to other assessments.
149. TIME
150. more training before we're in the full swing of the school year.
151. More familiarity.
152. none
153. More time to reflect with staff and discuss strategies
154. Professional development in classroom strategies to improve student scores
155. It would be easier to generate more specific instructions if teachers are able to see the actual
problems that individual students had attempted and/or missed rather than just knowing the
students being high/low/average in the strands. It is important to know how and why the
students made the errors so we can help them correct their mistakes or reteach the concept if
necessary.
156. can't think of anything in particular
157. My students can be "all over the place." A student can be in the 38th percentile in the fall, the
89th percentile in the winter and the 42nd percentile in the spring. How do I
understand/explain this?
158. more DesCartes training
159. Better communication/letters for families who are homeschooling.
160. Some understanding of how MAP relates to ongoing curriculum alignment -- i.e., how do
in-class, agreed-upon assessments relate to MAP, particularly where data-gathering is concerned.
161. If the MAP testing directly related to the science content that I currently teach and/or targeted
learning skills linked to my students' growth in science concepts
162. It would help me if the strands were broken down even more. For example, if a large majority
of my students didn't score well in number sense or problem solving, I would want to know
what specific areas. Is it calculation, order of operations, etc.?
163. Time built into the staff development time to analyze data--it takes time, so it would be helpful if
we could use some of our PD time to do this.
164. HOW DOES MAP CONNECT AT ALL WITH WHAT WE DO IN THE CLASSROOM? IT
LOOKS LIKE ANOTHER PRI, IF ANYONE STILL REMEMBERS THAT WASTE OF TIME
AND MONEY.
165. We need to see a sampling of the questions students answer. It is very hard to inform
instruction based on a score when the type of question, vocabulary, or concept presented is
never seen by the teacher.
166. Nothing
167. I'm still having difficulties interpreting and helping my families interpret the MAP score. To
many of us, it is just a number. Also, if you have a student who is performing well above grade
level at the beginning of the year, the student growth does not seem to adjust for this -- i.e., their
MAP score may not increase as much as an average (or below average) student.
168. more time to study, ask questions regarding the reports
169. I have no idea how MAP reflects/informs on my science teaching
170. How MAP aligns with state standards - the actual curriculum we use on a daily basis
171. Although the introductory trainings worked with the Train the Trainer model, the intricacies of
differentiation were too complicated to effectively replicate this model by the time we got to the
Spring. This is the core idea of making MAP scores inform instruction and it was the weakest
part of the PD.
172. I would like MAP testing available at my school site (Interagency) so I don't have to take
students off-campus to complete the testing. Makes for lower completion rates.
173. N/A
174. how to actually use the data without spending hours doing differentiation ladders, which are
too time consuming and a waste of my time.
175. We would like to have a report that is based on whether the students are below grade level, at
grade level, or above grade level, so we can clearly see which students need interventions.
176. How do we say that there are many circumstances that would result in a lowered test score?
Since this tests what a kid knows, not what they've been taught, they cannot know less than
they did on the previous MAP. If we can slough off any drops in scores to a variety of
circumstances, it seems like the same fluidity would apply to any increase (but especially
small increases) in the scores.
177. How to incorporate MAP analysis into my instruction.
178. I really do not support this program, so it is difficult for me to talk about "building" on it
179. It would be helpful to have one copy of the DES CARTES strategies for each grade level.
180. We do not need testing this often! Students do not take these tests seriously, and they take
time away from student learning. My students take three or more OTHER standardized tests
every year. MAP gives erroneous or superfluous data.
181. Have a teacher show how she/he uses MAP data.

Seeing how MAP results correlate with MSP results this year.

Info on how science, social studies, arts, etc teachers can use math/reading data to inform
their instruction.
182. Training on the terminology
183. More trainings offered. I took over this class in the winter and more support was not offered to
new teachers who came in mid-year.
184. To have the test data reflect what the curricula and state standards expectations are.
185. more time to look over tests and collaborate - not at a staff meeting after school when all are
not interested.
186. Make sure all teachers receive reports on their students
187. Probably the rationale of testing three times a year when the best indicators of growth come
with human interaction with teachers and other students.
188. beginning info session
189. We need more PD on how to interpret the MAP and how to use the scores to assist with
instruction. Unfortunately, my school has been in the dark on many parts.
190. I do not see the MAP results. When I have asked for access, I was told that there was no
way for me to sign in to get results, but that if I wanted them I should see the teachers.
191. More consistent data with fewer outliers. A better understanding of why some students go
backwards in their scores.
192. Knowledge of what ranges typically developing students are in at certain grade levels.
193. I really need to understand the RTI measurements. I also need a more spreadsheet-friendly
way to export the data so I can break down the numbers that are divided by skills and
manipulate the work in ways that are actually helpful and don't waste tons of paper.
194. I would like to continue the training and continue using the MAP results to inform instruction.
My students showed tremendous growth this year ranging from 3-28.
195. Knowing specific questions that were wrong
196. I would like to understand what concepts my students are missing. Last year on the district
math tests we were given scores that outlined students' responses to individual questions as
well as to content strands but on the MAP data we are only given a single number that does
not give much information.
197. See the entire test my K, 1 and 2nd grade students take.
198. continued professional development
199. To be able to see what the questions look like for each student!
200. More time and help accessing the DesCartes.
201. I would like to eliminate the use of norm-referenced tests and instead seek out a criterion-
referenced test (based on OUR state standards) to track student growth.
202. More time to analyze results and collaborate with team members to use results more
effectively regarding instruction
203. direct relation to curriculum taught in my classes and more knowledge to the students and
parents about the process and importance
204. Tests and curricula should be aligned and related.
205. Curriculum aligned with the test. It is unethical and unprofessional to test children on material
they are not being taught.
206. It would be helpful to include in the fall report, the typical RIT growth number for each
individual child. For instance, either the typical spring RIT score or the actual number of
points.
207. Authentic Assessments
208. hard copy/email information on how to improve/encourage students' academic growth.
209. I had to create my own excel spreadsheet to get all information about my class performance,
growth versus expected growth, and percentiles comparing fall winter and spring on one sheet
in one place, including averages for my class as a whole. The NWEA reports were helpful to
give to parents, but not for my overall performance. At this point with the test so new, we have
little to go on about what it means when individuals slip in performance and percentile,
sometimes dramatically. Every teacher has a few of these. How reliable is this as a strong
indicator that this child is having a problem? No training, literature, or experience has yet
informed me how to deal with such data.
210. I can't figure out how to get the reports I want off the website. There are so many instructional
suggestions for each RIT band and each sub-category that it would be impossible to use the
data to plan instruction for a particular category; there are not enough trends.
211. There seem to be discrepancies between the state EALRs and what the students are tested
on in MAP. There needs to be a stronger connection to the grade level standards.
212. More Professional development to interpret data and to come up with strategies to inform my
instruction
213. Not certain
214. How to use it to inform instruction if students have widely varying needs.
215. It would be very helpful to have a list of just the ELLs in my building and their results and their
strengths/weaknesses by strand. I have been sorting through the whole school's data.
216. Not much, because these tests are not relevant to ELL students, of which my school is heavily
made up. They don't understand the questions properly. In addition to that, I teach 1st
graders who are not always able to really focus on what they're asked, and only think about
being "first to finish". I know that for a fact. My highest math student spent the shortest time of
anyone on that test, and went down in her score. She has no patience or focus to really do
well.
217. a more effective assessment tool that connects to my curriculum and state standards
218. How do we push our students that are already reaching the 98th-99th percentile?
219. It is confusing with all the numbers, and we didn't receive proper training on it.
220. Just more time to process what we've already been exposed to with our teachers.
221. I don't know what you mean.
222. More paid time to understand how to really use this information.
223. I need to understand exactly how MAP relates to the curriculum we are using to teach our
students, aside from general strand information.
What would you like to share with other teachers and central office
staff about how you're using MAP data?
1. I find the MAP test a useless waste of time. I have no interest in teaching to the MAP test.
2. MAP data is very helpful in teams
3. The students like to have targeted areas to work on within a broad whole language
approach.
4. I'm looking for trends and for areas of strength and weakness. I am very, very concerned
about ELL performance and how many of my students are not able to access the math due
to past math failures (one couldn't read the word "quarters" but is able to multiply fractions
fluently in class)
5. I do not feel that MAP data is very useful
6. Parent conferences (formal and informal), grade level meetings, SIT and 504 meetings,
adaptations in instruction
7. I share the data with students and parents. I look closely at data in order to inform my
instruction. (But am not always certain how to do this with our current curriculum.)
8. MAP data is a great overview of student progress and helps immensely with students making
personal goals and taking responsibility for their learning.
9. Our school is not using it. We are more interested in the long term personal and academic
growth and understanding of self and others.
10. Give us access to the data
11. DesCartes pages are very confusing and not necessarily helpful for teachers. It would be
nice if each strategy/focus area came with an example so teachers knew exactly what is
meant.
12. nothing
13. Communicating with parents about the weaknesses and strengths of the students is important.
14. does not help with placing students into reading groups as DRA does
15. The MAP test does not need to be given 3 times. We have shown with our students that
giving the test 3 times simply creates more fluctuation in how the students performed. The
test should be given at most, at the beginning and end of the year. The middle data point is
completely worthless.
16. I do not think that the results for grades K-2 should be sent home.
17. I hold individual goal-setting conferences with my students to target area(s) of focus for them
for the next trimester.
18. I use it as a general indicator of student achievement and a guide for me as I work
individually with students. This is only one measure and I think we are testing too many
times in the school year. I believe that MAP testing has a place in talking with students and
informing instruction but it should not hold the premier place, and at this point it does.
19. N/A
20. I'm not using it.
21. Very useful in conferring one-on-one with students around their reading. Motivating for kids.
22. I would like to see the district set a firm policy across the schools about students who score
low on MAP and MSP. A matrix of "next steps" should automatically be followed, requiring
additional data (not grades; they belie ability) and a program for intervention, be it additional
tutoring or what. I am tired of having data go unaddressed because there isn't a system in
place to address what the data is telling us.

Additionally, students in 8th grade care more about their test results when they know HS
placement may be based on their results. I had 10-20 kids want redos when they found this
out. All but 2 did better on the second round.
23. Figure out a way to include self-contained special ed students in the testing window.
24. I am not using it
25. In my opinion, MAP testing places an undue strain on personnel and resources.
26. I didn't use it very often because many of my students should have had accommodations from
the beginning of the year. Also, due to the high poverty in my classroom, most of my students
don't even know how to navigate a computer very well.
27. I think until we really understand these scores better + the NWEA process in general, this should
be teacher-to-student info or teacher-initiated sharing in the context of a parent conference. I
don't think this data should be going out to parents generally or be reflected on eSIS. MSP
fine--it's state-normed & produced. And if we did a nationally-normed ITBS etc. with a
proven track record that would be very helpful for giving us information vis a vis students
nationally. This is not something that we can totally explain yet to parents. So it adds hours
to our already overly full workload to have parents calling us for explanations. Perhaps the
district could formulate a generic explanation to parents. In fact, good to put on the SSD
website: "We are currently piloting an interim assessment and we are learning what the
results mean. You may be contacted by your child's teacher to discuss these results in the
context of his or her overall school work, but they will not be posted until we have more
experience with the assessment." This is really a teacher tool especially until we understand
it better. There are items that don't match our current Reading/LA curriculum in particular
which is student centered rather than content driven... with mini-lessons to improve student
reading & writing but not the kind of adult/college genre questions we are seeing if a student
were doing a course in particular literary periods, e.g., Romanticism, Surrealism,
Enlightenment. Not at middle school anyway.
28. How does this test work and how is it useful?
29. I'm not. In high school classes there is no way to evaluate and differentiate for 120+ students
without time being allotted to do so. And even if there were, why should math and LA
teachers be stuck with this extra work when science, world languages, and elective teachers
are exempt? We're tired of getting all the work and all the blame when we only see kids for
less than 1 hour per day!
30. We are going to need more training for this to be more than a huge waste of time. It wasn't
easy to access this data, and we need real training on how to interpret it.
31. You MUST make the website so that we can search by student. Teachers BESIDES the
English and math teachers find the NWEA website VERY, VERY difficult to use.
32. Students are reading just right books at their level.
33. I'm concerned with the reliability of the data
34. I love the concept of triangulation, and I think that many of the concerns I heard voiced over
and over from other schools' MAP teams could use this mantra when viewing their data.
There was a bit of hostility that ended up cutting into more meaningful discussions.
35. That this should be for staff NOT students and parents!
36. Math MAP data is the most helpful, but for reading it is not helpful at all. I find the
information for informing instruction to be too general to work for my students. I was
disappointed I was unable to pinpoint exactly what they missed on the test.
37. n/a
38. Maybe
39. focus on small groups needing additional instruction in areas of weakness
40. It informs instruction, but how do we teach to it? Some of the questions bore no relevance to
the curricula.
41. I have not seen the (district-created) reports.
42. Schools need to take lots of time to review the data. Giving the individual data to teachers
first, then giving them class summaries and growth data, seems to help them focus more on
trends.
43. Comparing to oral reading assessments and WASL/MSP data and showing the correlation.
Most of the time there is a direct correlation.
44. We are not using MAP data much at all because it is very time consuming and does not
seem very relevant, especially at primary grades.
45. Data seems too complicated for teachers and students. Not very helpful. We should stick to
the folks who know testing and make a living at it--the College Board.
46. The extent to which we are already using MAP data to make significant decisions about
teachers, schools, and students is alarming.
47. It has been helpful for me to see the growth in my students. Even if they are not at grade
level, it was important to see their growth and that they are progressing.
48. we have developed a whole system around the use of MAP data in planning for reading
instruction. Happy to share it with others.
49. I would like to know how this information connects to existing test data, such as DRA
and MSP data.
50. na
51. It is unclear to me how the MAP data is useful and therefore I do not use it.
52. I make my own database with students' scores from the last two years. I find it useful and
informative when I can see and measure a student's progress across more than one grade
level.
53. I am using it to help me develop sustainable, measurable IEP goals.
54. I have used MAP scores and goal descriptors as ONE means to write IEPs. However, I use
multiple means of formative and summative assessments.
55. I use the primary subtests. They are much more informative than the survey tests three times
a year.
56. I have found it helpful in creating book clubs in my middle school classroom.
57. It is only one measure of student growth.
58. Nothing at this time. Just learning.
59. ?
60. I'm not using it to plan instruction because I don't have time.
61. Nothing
62. Once we get the 8th grade data, it will be used to inform us for the summer Bridges program
and Fall placement.
63. Because most of the probability for high school gets covered in third-year advanced algebra,
those scores have wild variation and impact too much of the overall result.
64. It's a learning working process
65. ridiculous waste of time and money
66. I am using MAP data to measure IEP progress, rather than having to administer a new and
different test quarterly.
67. I'm a special ed. teacher and have found the MAP progress to be inappropriate for my
student population.
68. As a targeting tool for building interventions (remedial classes, etc) - combined with other
data it helps show where students are at
69. I don't believe the assessments are consistently valid.
70. Need lots more time!
71. We specifically use it to inform systematic and flexible groupings for intervention and small
group teaching in the classroom
72. n/a
73. Our school primarily uses MAP data for helping to determine which students take math or
reading support classes and which level of math class they should take.
74. helps me push kids to work harder to improve their skills...student accountability
75. not applicable to my subject area
76. nothing
77. taking strategies - use of paper and pencil for math

student goal setting

intervention placement
78. Do other teachers of self-contained students set goals with their students?
79. I have a long way to go, but I feel like I am moving in the right direction. This needs to be a
collaborative effort between teachers and students. Also, I believe it can be meaningful to
families once we can provide more information about how instruction is tied to results.
80. Kindergarten should not be tested. Developmentally inappropriate. We are testing their
computer skills, not their reading and math skills.
81. Talk with students about scores and skills, create plans, stress intrinsic incentives.
82. I dislike MAP testing strongly. I think it is not a functional tool and therefore a waste of
instructional minutes and a great deal of money.
83. I am using the reports to focus teacher professional development (which areas students
are performing well in and which areas may need more support). Then the teacher
meets weekly with the coach to plan and implement for those areas, as well as grade-band
PD.
84. MAP is only one data point. It does not match my observations of student ability based on
running records or CBAs.
85. I am not using MAP data because it tells me less than other gauges in the classroom
86. We should look at comparing ELLs with other districts that are using the test.
87. Teachers, for the most part, are not using MAP to drive instruction (they have too many other
things to do than to analyze this data). The school psychologist has used some data to write IEP
reports. The ELL teacher has used the results to track her students' reading gains.
88. Sometimes a student should be allowed to retake the test, i.e., if they just came back into the
country or they spent less than 15 minutes on the test. This test should be more than
just a one-day measurement.
89. I'd like to see how others are sharing MAP data with parents, and if parents are finding the
MAP data meaningful. I'd also be interested in seeing how MAP data can be incorporated
into IEPs.
90. The data themselves seem to call into question the validity of this assessment.
91. nothing
92. I keep these statistics as a point of reference during parent/student/counselor/teacher
meetings.
93. Student progress
94. Using MAP data to help form my groupings and to look at what strands I need to target to
help students grow.
95. MAP data reinforced the specially designed instruction for my students, BUT it was NOT an
accurate display of their growth throughout the year. Students with special needs have
different growth norms than typically developing students. Analyzing the MAP data was
helpful with 50% of my students but was not helpful for the remaining half due to the
inaccurate test results (kids just clicking, refusal, etc.).
96. Frustrating when data not available upon completion.
97. At the high school level it really is not an assessment that tells us new information. Looking at
student attendance and MS grades is a better indicator of motivation and grade-level
competencies.
98. NA
99. I create an Excel file of all of my students and their fall, winter and spring scores and include
a column that shows how many points they went up by from fall to spring (or winter). It is
helpful to arrange the students by their fall score and see which of my students tended to
increase their scores the most. When I did this after the winter test, I found that I was really
reaching my lowest students, but may not have been challenging my higher students
enough. I changed my practice a bit, and saw better results at the high end in the spring
results.
100. I am going to see how DRA scores relate to reading results on the MAP (I haven't given the
DRA yet.)
101. It is still very disconnected.
102. I like its usefulness for showing progress without comparison to others.
103. It's a lot of data to sift through at first glance
104. Cannot yet apply data to students.
105. This is too much testing. RTI experts have told me that by the high school level, we have
much of the data already. To take valuable class time away from the students for the
purpose of testing is ridiculous and counterproductive. Students stop taking the tests
seriously so the data is not helpful.
106. N/A
107. I have done two differentiation ladders, but can't see how I could possibly do this on a
regular basis. It is very time-consuming.
108. Look at the data and do activities to support student learning in the areas indicated.
109. I share scores with my families, but emphasize that the scores do not always directly
correspond to their success in class.
110. I have not used the data.
111. The reading data is not helpful for K/1 level readers. I can look at a score, know how a
student compares to average, but have no concrete information on what kind of reader they
are. What mistakes do they make? What strategies do they use or not use? What personal
connections are they making from the reading they are doing? Answers to these questions
are what inform my instruction. I can get these answers only by listening to students read.
The DRA is a significantly more effective and useful assessment for K/1.
112. Let's use other testing to evaluate students--not just MAP. I know there are other tests, but
MAP is gaining importance.
113. I use the MAP data to help students set individual goals for reading improvement and
specific areas they need to work on in their reading.
114. I work with my students in small groups in reading and math daily. The information that I gain
through instruction is more valuable and more valid than the MAP test.
115. To inform how direct intervention improves student performance on the MAP
116. It is nice to have another point of data. I am a special ed teacher and it is helpful to have
more information to see how my students are growing.
117. I am using it to inform instruction
118. I look at it, but it is really not that helpful. I get more information from EDM checklists,
spelling inventories, phonics assessments, guided reading, etc. The Lexile score for my
second graders didn't even match what I am getting from the DRA (which I know is
accurate, because I have been trained in the correct way to administer it and in what 'typical
mistakes' teachers make). I get so much more information working with my students
than from MAP data. Plus, do you really expect me to print off 15 pages, tape them together,
strap them to my wall, and go student by student in planning? I have more efficient ways to
do this, and I don't even trust that the data is accurate in informing me what my students'
abilities are.
119. There should be a larger testing window. Testing students in the morning in one testing
window, as I did in the winter and fall, and then testing them the last half hour of the day, as
I did in the spring, is a variable we should be able to control with a larger testing window.
120. The tests take too long and take too much time out of my students' learning time.
121. It helps me see where I need to focus future instruction.
122. It is good for finding extension activities for my high reading group.
123. nothing at this time
124. It's great to have data to support teacher input about individual students - especially when
talking with parents.
125. There is so much data that it is difficult to decide where the focus should be. Mostly the
reports just tell me what I mostly already know: that my class is diverse with a wide range
of abilities.
126. Transitional kindergarten students have little ability to use computer equipment, which affects testing.
127. I use MAP data as one piece of my triangulation to understand a student's academic
strengths and weaknesses. I believe the district must do the same when grading me, my
school, and/or my students. For example, I have many, many students who showed 1.5
years' growth in their DRA or TC reading assessments, but not in MAP. This is significant! I
hope such growth will not be ignored.
128. Not terribly useful for informing instruction for highly gifted kids--the lack of an "I don't know"
option leads them to make educated guesses, and therefore they score at a much higher
level than they are able to perform at independently when not given a multiple-choice
problem (esp. for math). However, MAP data has helped me to identify strands where my
entire class needs more support.
129. When we put all our student data on one "histogram" - we could see how we need to better
support particular grade levels as a whole
130. We are using it as one data point. In reading it has helped keep us accountable for our
running records. I would like to find another data point for math.
131. I am extremely concerned about learning style differences among students. I observed
students who were highly motivated to work through 52 math questions, reasoning their way
through each problem. I observed other students who wilted partway through. Many
students need feedback along the way. Many students need a social context in which to do
their best. Some students need the option of skipping a question, to go back to it later.
Some students don't have enough attention span to stick with it for 52 questions. I don't feel
that the MAP fall-to-spring scores accurately gauge how much many students learned.
132. Using the Ladders is really helpful
133. I find it interesting as just one measure of growth, along with class work, student
performance, and the understandings they show in the whole class makeup. I still think no
test can show what the teacher knows and sees daily throughout a year. It is only another bit
of information to understand what a child may know.
134. Not much. I'm not a fan of the MAP at this point.
135. Looking at growth patterns.
136. none
137. I have used it to speak to the correlation between students who work hard in school and do
well on the MAP test.
138. These test results are far too vague. For lower reading comprehension scores, there is no
data on whether the child has more trouble with main idea, sequencing, details, etc.
Same with word recognition--no useful specifics. True in math as well. The results are not
individualized.
139. can't think of anything in particular
140. Until we create some schoolwide plan for using MAP data, it's not terribly useful.
141. It would be extremely helpful if there were a line showing the student's progress, or lack
thereof, between semesters. That way teachers could quickly assess who is making progress or
meeting the state standard and who is not.
142. As one form of multiple assessments. It doesn't give the overall view of what students know
and understand, but it can give us a piece of the puzzle.
143. triangulating with other data
144. IT'S JUST A SET OF NUMBERS - IT DOESN'T RELATE TO DETAILED
COMPREHENSION OF READING A LONGER PASSAGE, OR SHOWING HOW YOU
SOLVE A MULTI-STEP PROBLEM. MSP IS MORE LIKE WHAT WE DO DAILY THAN
MAP. THAT SAID, IT IS HELPFUL TO CHART STUDENT GROWTH FROM THE EARLY
GRADES TO THE UPPER ONES. IS MAP THE BEST TOOL FOR THAT?
145. It is very difficult for students to read on computer screens. This prevents many students
from being able to accurately track what they are reading, leading to students rushing
through certain passages.
146. This test is a waste of time and money.
147. It is difficult to talk about a student's specific strengths without knowing the test questions
and the student's answers. Also, the reading portion of the test seems to focus on Latin and
root words as students move into the higher levels of the test. Many of my high readers
get to this point in the test and become frustrated because we do not teach Latin and
root words in the primary grades. Also, several of my students commented that the lettering
(font and font size) makes it difficult for them to read the test items.
148. Using as another data point to help group students for small group instruction
149. It is being used for high level administrative and program based decisions, but not really at
the classroom and student level.
150. N/A
151. Using it as another assessment.
152. useful tool to share with parents
153. No thanks.
154. Too much data. Way too much overall testing for kids.
155. It is not very useful at all. It is more paperwork for our tech person. It does not show growth
(yet) as the program developers themselves state that it takes 16 entries. Parents are
troubled by our superintendent's involvement with the board of the MAP creators. I tell them
they are correct to be concerned. MAP takes away any differentiation that I might be prone
to provide, as it clearly gears you towards teaching to the lower end of your classroom. This
is a bad choice for our students and community.
156. I'm not using it because I don't teach ninth graders.

But they monopolize the computer labs for a fourth of the year, so it affects me in that way.
157. I have not had much time to look at MAP data. Why is the district using these scores for
student placement when we were told that this is just to inform instruction?
158. I cannot use data that shows no patterns, or shows students getting worse over the year in
the subject I teach. I cannot use data that comes from students playing at a computer and
not taking the test seriously. There are no stakes for this test and therefore no effort. Even if
we invented stakes for this test, students have test fatigue and have come to not care.
159. What worked when coordinating the MAP tests.

Trouble-shooting MAP administration.


160. It is not well supported at our school. Primary students cannot log themselves in and so our
LAP teacher does it. So then kids are not being served for LAP. Perhaps if the district has
this test offered as a means of helping inform instruction for teachers, and we all celebrate
its validity, then perhaps someone can offer support in getting kids logged in to expedite the
testing process. The help it offers us as teachers is NOT WORTH the hassle that it creates
in taking away our LAP instructor for several weeks a year.
161. the data collected does not align with the curricula or the state standards.
162. nothing
163. I think this is a very useful assessment tool to inform and customize instruction for students
164. I find the MAP data extremely helpful when writing IEPs for my students.
165. Not totally using MAP data.
166. I don't know how to use it yet.
167. Please give specialists the ability to access the data. I would use it to guide instruction, but
finding time to meet w/teachers is difficult. I would like to be able to access it without asking
teachers.
168. If we have a pacing calendar, where is the time for remediation?
169. Nothing at this time.
170. I created my own sheets for students, had them map their data, reflect on the rise and fall of
their grades; but working with the particular skills is where the real wealth of information lies,
and this is hard to tease out for individual students without relying on the NWEA forms that
lump kids together or waste lots of paper.
171. I need more ideas about how to change my instruction to meet the needs of MAP.
172. MAP is very useful data if teachers analyze what MAP tests students on. I did this, and the
students did well during the spring MAP assessment.
173. I look at the scores and the percentile and then analyze the strand growth.
174. So far I have only been able to use it to confirm what I already know about my students
through classroom-based assessments. It has not informed my instruction, led to any
change in practice, or helped me understand my students. It has only echoed what I've
already known--nice to have it confirmed, but I'd like to be able to use it otherwise.
175. I do not believe MAP is an accurate data point for beginning readers. MAP does not
correlate with other data from the classroom, the DRA, or observations.
176. we use the MAP data to inform classroom instruction
177. At the K level, the scores are not a good enough reflection as to how the students are doing,
since not all of them have computers at home and thus found the tasks overwhelming at
times.

The scores do correspond with how I have assessed the students as high, medium and low,
however, they do not provide the information that one-on-one assessments do.
178. The MAP is not aligned to our curriculum - the Washington state EALRs, GLEs, and PEs. I
do not understand how I am supposed to use a non-aligned, norm-referenced, highly
culturally biased assessment to inform my instruction.
179. The quick turnaround for results and the option to have data in number and graph format has
been the most helpful for me and for using with students.
180. Because there was little connection to the students directly about the test itself, and they
seldom knew really what to expect and/or what purpose was served, they got bored with it
and did not try their best.
181. MAP testing has done more harm than good. Our library has been closed (3 times) for
weeks at a time. It is impossible to differentiate Algebra 1 when the skill levels are so
discrepant.
182. It is a waste of my time.
183. MAP data is not authentic. Not authentic.
184. I would like specific training on using MAP testing to guide my instruction in the mathematics
classroom.
185. The scores are different from what I see on a day to day basis. Some of my students who
operate at a much higher level did not do as well as they do on a daily basis. Is the test a
true indication of the student's academic growth? What if a child is not feeling well and
therefore did not do as well as they might have?
186. We need to continue to use MAP to inform instruction and to communicate academic growth
to students and parents (as well as staff).
187. Well, since the majority of our kids are testing 4 years above grade level, we're trying to
figure out the appropriateness of providing high school instruction for 2/3 of our kids in order
to move them along the continuum. Not sure how our parents would respond to our moving
from 6th grade novels to adult poetry, Shakespeare, and adult reading books with sexual
content that are read in 9th grade. I'm being sarcastic, but seriously, how can we be
expected to have an impact on the growth data of kids testing 4 years above level?
They are learning plenty, but may not move on the continuum because they are not learning
high school content.
188. I have a hard time using the MAP data because I see significant discrepancies between my
observations of student learning and other assessments compared to the MAP data.
189. That we are using it as a tool to assess where our students are in their development in math
and their progress. We hope to use the data to place students appropriately in intervention
and/or enrichment classes.
190. Not certain
191. I think we need to be clear on how MAP data is going to be used. The trainings have focused
on instructional/classroom use, but it seems like every month we hear about a new way it's
being used for decision-making or accountability in some way. This is OK, as long as we're
honest about it and don't pretend MAP is all about instruction and not about making
decisions.

Frankly I haven't found MAP that useful for making day-to-day instructional decisions. Good
to have, though.
I would like to have an idea of what the PD is going to be about next year, given that we're
devoting a lot of time to it.
192. I am using it to look at overall growth, analyzing where it occurred, and then trying to figure out
what we are doing right.
193. Nothing at this time. I feel too frustrated.
194. As a special education teacher for the primary grades, I am using the MAP screener to identify
strengths, weaknesses, and the ability of my students to generalize their knowledge to a
different format (language used during the assessment and method of responding).
195. this is a waste of time
196. I think it is a very useful tool to determine student growth. I think it is useful data for all
teachers, regardless of content area.
197. MAP for kindergarten is not reliable and takes up too much time for the quality of the
information. Overall, MAP is useful but extremely time-consuming; it is an unfunded mandate
at this time.
198. I would rather spend the time teaching my students than testing them.
199. The library is closed too long during the school year for classes to attend together with our
librarian. He helps proctor the exams.
200. I believe this to be a highly useful, but fairly complicated, tool. I think we need to be careful
as a system with how we are, in fact, providing "just in time" PD for teachers and the right
communications with families. In order for teachers to truly make effective instructional
decisions based on this data, they need time and opportunity to really understand it in
smaller, more manageable pieces.
201. How this ties in with EDM.
202. If I was doing so, yes, of course!
203. I find the information highly specific to data and statistics, not user-friendly to teachers, who
are not usually statisticians.
What other feedback do you have about how MAP was administered
(logistics)?
1. MAP testing takes way too much time out of instruction.
2. Drop this test.
3. I would have liked to have the MAP spring testing window later in the year, like right now. It
would then be a true measure of growth for the year.
4. Losing Library for two weeks was not helpful for my students at all.
5. None
6. Motivation was inconsistent for the spring test.
7. Some teachers were not happy with their MAP scores and had students take the test 2 or 3
times until they achieved a score that showed gains... This seems a very disingenuous way to
gather data and should be discouraged, no? I'm sure if I had some of my students retake it 2
or 3 times they would have also had gains, but what am I doing to that student, and how is the
data ringing true if we allow so many retakes?
8. Timeline for large schools must be extended.
9. We could use more computers at our school.
10. I wonder if students who had to test 6th period for each cycle were at a disadvantage. My
students had one test interrupted by a fire drill. On another test date, one third of the class left
partway through for a track meet. I just don't think kids perform as well when having to
resume the test the next day. It would be great if there were a way for students to all test in the
a.m., like we do for MSP.
11. Need a staff person to log in students
12. Don't make it so close to the WASL.
13. It is a lot of testing, especially since these students also have to take the MSP. It also means
that there are 10 weeks each year where our library is unavailable to other students.
14. It was hard to go from a 3-week MAP window to a 2-week MSP window with only a week in
between. The same folks coordinated both, so it was like a 5-week assessment window for
coordinators.
15. too much standardized testing; MANY students lacked motivation.
16. Makes it difficult to schedule classes in the library to do research.
17. K kids struggle with the logistics of the tech; fall is TOO early to mandate K kids to be on the
computer in a high-stakes test.
18. Extra people to help administering MAP tests
19. Why THREE times a year? Why NOT fall and spring? That's NINE weeks of the lab being
used for assessments!
20. This test is very time consuming. The district should consider hiring MAP coaches as there
should always be a proctor in the room due to the use of these scores eventually being tied to
teacher performance. There will be cheating-- sad but true.
21. We need headphones that fit the small heads of 5-7 year olds, but the school budget has no
wiggle room. Using a mouse properly is not developmentally appropriate for kindergartners,
especially when we don't have any classroom computers to practice on.
22. very time consuming
23. The test does not need to be given 3 times during the school year. The middle data point is
totally worthless to the system. This year, the initial testing window was too close to the
beginning of school for many students to be truly invested, and the final testing window was
too early so it was not a fair evaluation of the entire year's progress. Additionally, considering
how much computer lab time we lost in administering this test vs. what could have been done
in actual teacher curriculum, it is completely not worth it. Our school lost 25% of its available
computer lab time to the MAP test, unnecessary and unfair to teachers.
24. I sent you an e-mail related to MAP discussing four concerns I have with the testing. Kit
McCormick, Garfield
25. I could only test one student at a time so it was very time consuming to test my class.
26. It is not useful in high school and drains resources that students need for graduation, including
access to computers and use of instructional and student work time...we either need more
computers or less testing
27. It takes up lab time needed for classes. I also would like to see the order of the test reversed
(reading then math) just to see if the data changes. Kids are test fatigued by the time they get
to the reading test which might impact our results negatively.
28. There were often times that my MAP lead didn't have answers to questions....specifically
about accommodations. The reading accommodations are questionable as well. If students are
permitted headphones, how can I really tell how well they can read for information vs.
listen for information?
29. As I commented earlier, MAP is very taxing on personnel and resources. It monopolized the
library and computer labs for a significant amount of time this year and it looks like even more
time will be committed next year, taking our technology and library resources away from our
students for testing.
30. Our infrastructure is definitely declining in its ability to deal with MAP & the volume of use over
the course of MAP windows. We have a few freeze-ups per day. MAP itself will deliver a
blank screen or only partial data. We work around this by moving students to a different
computer when one is available, then reboot & do the dual logon & test selection (3 steps) on
the errant computer. When there isn't a free computer, we do the same & the student just has
to wait for the computer to be rebooted, & logged on. It will be interesting to see if Thin-Client
technology will help us with some of the issues. # of computers to efficiently administer the
test is an ongoing challenge.
31. Kindergarten needed mice instead of fingerpads on the first test. Too difficult for several kids
who just lacked fine motor skills...
32. It takes up too much class time with no value added.
33. Is there a connection between our superintendent's MAP board membership and the district's
use of this test?
34. It wasn't clear to me how much we should discuss scores with students before or after the test.
35. Our computer labs are closed to classes for two weeks, three times a year because of this
useless test. Meanwhile our labs are unavailable for other productive educational activities like
research
36. It is difficult when all the schools computers are being used for two or three weeks for the MAP
testing.
37. It took too much time and kids did not take it seriously. The reports did not come or could not
be compiled so we had to do it by hand which gave us no time to use the data for instruction.
38. Motivation level on the test varied a lot. Generally, the first to finish the test had the lowest
grade in the class and the last the highest. While not proof of effort, it is a strong correlation.
39. n/a
40. I hope that this does not become the most significant measure of student progress because I
do not believe it is very accurate.
41. I was absent on the spring MAPS due to a family medical situation, but I do feel they would have
done better with me there.
42. Some of our older students still did not take the test seriously. Now that we understand how
to use the data, I hope the teachers will do a better job of preparing students to do their best
on the test.
43. More information on students who want to retake and how their scores are included in the
results
44. more training
45. It is very time consuming and takes up our computer lab, which is in our library for many days,
which means kids don't have access to books.
46. Administering MAP took substantial time and resource from a staff member who was not hired
with MAP administration duties on his job description. He had to drop a substantial amount of
his work that he was hired to do off of his schedule. How can such a time consuming
assessment such as MAP be put into place without funding a MAP administrator? It doesn't
seem right.
47. Given the importance the district and administrators (principals included) have placed on the
MAP, I did not like that my class had to take it at the very end of each day on the very last
week of MAP testing time. No time for makeups, among other things.
48. For some reason, some of my students' scores did not show up. Even if they were absent one day,
they did their test on the make-up day. I made sure that ALL of my students were tested, but
some scores were not available. This is frustrating.
49. timeframe -- April was just toooooo soon. Do after MSP
50. I felt there were a lot of glitches in the administration of the tests. Teachers need to be better
trained.
51. Our school is fortunate enough to have a computer lab and a lab staffer so the scheduling was
done by him. Unfortunately it meant no one could use the lab for 9 weeks out of the year as it
was tied up for about 3 weeks per administration. That was a definite downside.
52. Does the MAP score affect what math class students will go into in the 6th grade? And if so,
what MAP score do they need to be placed in an advanced class?
53. Total chaos in trying to fit so many classes in such a short period of time. Very
unmanageable!
54. I have heard conflicting things about what EXACTLY the MAP data tells us. I have heard that it
is reflective of what the student knows and also that it ISN'T. How much importance is to be
placed on this assessment? Also, I would like to know how the district will use the scores for
the school report cards/annual reports. I had many students who in the fall tested out of their
grade level and then in the winter or spring scored lower, but still well above grade level.
These students were part of the 'not enough typical growth' group. However, they were testing
out at 9th-11th grade math skills. Does/Will this affect the report negatively if the data is
simply being recorded/reported from the NWEA reports? I have big questions about how
useful/accessible this info is to parents as well. What has the district done to inform parents
and make sure that ALL parents receive the same information about these assessments?
55. I don't know how important winter scores are. I think it would make sense to give kids more
developmentally appropriate questions within two or three grade levels above/below where
they are so that they aren't given questions that make them anxious about answering. This
would give a more accurate representation of what they know, and are ready to learn.
56. I feel that students in my 6th period class definitely did not try their hardest because the room
was quite warm and they were tired. Also, I have a couple of students who are purposefully
distractful (is that a word?) because they themselves are insecure about their low levels.
57. Having the MAP test three times a year is not appropriate. The kids get tired of the same
questions. The MAP sucks up a lot of school resources for three weeks three times a year. I
think that the fact that NWEA does not track progress for the Winter test should be a notice to
the school district that something more useful could be put in its place mid winter.
58. My students wanted to do their best, but some did not adjust well to the MAP, and their scores did
not reflect their skills. I will have to find a way for them to practice sitting in front of a computer
screen reading pictureless stories, which is so different from tool kit and my other NUA means
of teaching.
59. I think the whole process needs to be moved to later in May or early June. I realize there are
reasons for scheduling the test for when it is, but combined with MSP and Memorial Day, the
result is that many students feel that the year is 'psychologically' over, making the remaining
time more difficult to manage.
60. I have yet to be provided with a clear picture of how and why MAP is effective and relevant for
my practice and my students. The PD around MAP at my school has been almost non-
existent, and because I have never been given an opportunity to see/take an actual test, it is
very difficult to convince students that the test is valid and relevant. If we are to continue to
use MAP as a measurement of student achievement, I hope that there will be a much more
organized roll-out of the test. Teachers also need to be provided with time to go through and
understand what all the data means, and I was offered no extra time to do that this year.
61. MAP testing takes away instructional time. We don't have enough computers to test everyone.
Fall window should open before the first day of school. Spring window should be aligned with
HSPE. District should fund its mandates, in this case by providing subs or release time to
administer and utilize MAP testing.
62. I would like it to be scheduled closer to the end of the year
63. Our school successfully used a substitute to manage MAP testing and to coordinate all make-
ups.
64. The MAP in winter was too close to the final exam review for my 6th period students, so a
number of them did not take it.
65. Testing right after winter break was a bad idea, especially since my students are in special ed
and lose skills over long vacations.
66. Map monkey
67. Since this is now a mandatory test along with the state mandatory tests (which are moving to
computers) every school must have a full time tech person in their budget. It should not be a
choice. School computers have a shorter than normal life due to the excessive use they
receive, and if both teachers' evaluations and students' placement in classes are guided by these
tests, then the support for the equipment must be there. In addition, there is an equity issue,
since most high schools' greatest concentration of computers is in the library. This now
means that the library is closed to all student use for 25% of the year.
68. It was done fine.
69. Administering MAP at our school took WAY too much time of our Librarian....kids lost a lot of
Library time and computer time in the library.
70. Twice a year would be enough.
71. This test should only be administered twice a year; not three times. As a computer lab proctor,
this disrupts our reading program for twelve weeks out of the school year which is frustrating.
72. Questioning validity
73. It consumed about 25% of all the available computer lab. time here in the entire building. This
seriously conflicted with other mandated requirements, such as the History Day CBA that was
delayed due to lack of available computer lab. time. Next year, with expanding this to 9th and
to 10th grades, this will be even more problematic and will consume about 50% of the total
computer lab. time available for actual school work.
74. Sped was not prepared for accommodations/adaptations for this test and I didn't know until the
2nd MAP test that I would use developmentally appropriate levels for the students
75. The testing could have been completed much sooner if we had enough computers available
for student use.
76. Three times a year is too much testing
77. The log-in process is way too cumbersome! The need for proctoring support is a drain on
personnel resources in the building.
78. time consuming, students not invested in process
79. It is an industry I don't care for
80. Takes a lot of time out of 9th grade block classes throughout the year 6 hours of instruction
lost.
81. Will we get additional headphones? The ones we initially got broke easily.

Will DesCartes be aligned to EM?


82. It was a last minute decision to use MAP this spring at our school. Testing went smoothly
in spite of the last-minute decision. However, we are just beginning to understand MAP.
83. Good to look at, but with lower grade levels the scores reflected could mean a lot of things:
attentiveness, ability to read, following directions, etc. The numbers were not that helpful, and I
couldn't connect RIT, etc. to what I do in class.
84. There were MANY technological problems at our school. Aside from that, the process itself
was smooth.
85. I think the spring test window should be moved back to June so we have the most accurate
data in terms of year-long growth. We still have two months left of teaching and learning after
the spring test as it is now.
86. Many students opted out of taking the map and parents were generally not supportive of
taking more time away for another standardized test
87. Didn't work on the Macs.
88. The district obviously gave no thought to how a school might be impacted by implementing
accommodations (space, personnel, language.) We have schools that are 45% bilingual!
89. It took our head teacher out of her regular role for 9 weeks of the year--unacceptable!
90. How in the world is this possible at schools without computer labs? It honestly doesn't affect
me much in a school where computer time is during PCP, but I'm wondering how this is done in
other schools.
91. I don't think it's a good idea to give the MAP right after a break. I also felt that the MSP and
MAP can be spaced better.
92. You know how public schools get a bad reputation from people's impressions that all we do is
test prep and more tests? This kind of thing just reinforces that.
93. It was flawless at our school.
94. It was difficult to do it on the computers and in the lab because of a lack of working computers.
I do not feel the district is supportive enough of this. Our school has had to provide and obtain
our own computers for the lab, which are usually donations, and many break down or there are
not enough.
95. Many students were unable to complete the testing because our class was scheduled for the
last week of the testing window and several students were out with strep throat during that
period.
96. Computer problems - limited resources for online testing, and very hard to do pull-out of non
9th grade / credit-wise 9th gr by interrupting their classes. Students that are always absent will
never get tested. So for the school board to ask for 95% to be tested is fantasy, as the
attendance rate is under 90% for the 9th grade...
97. Administration? When one is doing Reading Exchanges and you do not have your own
students, how does that impact your percentages? Reading scores do not really fully reflect
my work with the students. This would be the same with the math for those staff members
participating in Walk to Math. Logistics were fine at our school.
98. Because of limitations on computer access in our school (most students do not have
computers at home), taking three weeks a year, all day to administer MAP tests is a serious hit
to our school resources and access.
99. Some kids seemed to opt not to answer correctly so that the questions would not keep getting
harder.
100. Absenteeism
101. There wasn't a lot of buy-in by general ed teachers at my school. I can see it being used at
elementary school much easier.
102. The final testing window should be later in the year. First graders should always test at the
end of the window.
103. all sessions went well
104. I was not involved in the spring administration of the test.
105. At our school, the MAP test administrator (not sure if she is the data coach or not) was ill-
prepared for my class and my 1st graders spent 2.5 hours taking 4 tests back-to-back so she
could get them all finished at once. This was inappropriate and led me to discount the test for
the most part since they were tuned out after about 40 minutes.
106. Too much time is spent on testing! The data is no longer accurate because kids stop trying.
Let them have their classroom time back so they can learn and prepare for the HSPE, please.
107. This is occurring too frequently - many missed hours of technology instruction.
108. If the district wants to continue the MAP testing, additional technology (computer) resources
are needed! The impact of the computer labs being used for MAP testing was huge and
crowded out opportunities for other students to have access to computers for projects.
109. Without a computer lab it was difficult to administer and coordinate testing with other staff so
that my students were testing at an optimum time of day.
110. No suggestion at this time.
111. Of all my students, only a very few took the spring testing seriously since they knew that it
would not affect their grade in any way.
112. Not having enough computers in one room to test a whole class is a challenge.
113. It was really hard because we are a K-8 and the test window simply was not long enough.
We had to test in chunks of time, which did not create a sustained testing environment.
114. Having students value the test and instilling the desire to show us what they know was vital to
their performance. Having students report to the test with their previous score to beat on a
post-it, and discussing strategies to have students take their time on the test by using scratch
paper, helped.
115. I did not administer the MAP test so I did not get to see much about the logistics. It did not
really affect me, so I can't speak to it.
116. We are at a (poor) school that has only 15 computers to use at a time. As a result our library
was closed for weeks during MAP so that each class could go in twice (half a class at a time)
for each test. During this time our Title services were also shut down. I think it is criminal that
the very students who need access to books and Title services didn't receive them for almost
two months.
117. Having one place (like the school library) with all the computers needed and a schedule for
each class to do the reading and math tests was key.

It took 2 hours out of our teaching time for testing as opposed to testing in class over a 2 week
period.
118. I helped my MAP lead by coordinating a second site for testing. It took a lot of time and it was
difficult this first time through to fit in all students from 3-8 grades (we are a K-8) in the 3 week
time slots. Our MAP administrator, Linda Illman, was as flexible as possible and worked to
support everyone's schedules in order to fit everyone in.
119. I just don't see how this is equitable among schools and for students. My students take the test
in our room, getting training from me, and other students in the school do the same, getting
training from their teachers, and then there are schools that have a computer lead who tells all
the students the same thing and where they test under the same environment. What is the
purpose of the MAP: to inform the district of whether or not students are meeting academic
success, or is it to inform teaching? It seems to be failing at both, to me. Why not spend this much
on having mentors for teachers and evaluating them in a meaningful way?
120. I'm not seeing how MAP is related to our curricula.
121. The school data coaches were the most helpful, BUT it was new to them and they did not have
all the answers OR the time to help. To put the stress and strain of being a data coach on top
of a classroom teacher's already large workload is insensitive, thoughtless, and not
understanding of all a classroom teacher in elementary has to do!
122. We found that the Special Ed students did not demonstrate their actual growth on the MAP
test. All of our tests were given by the Data Coach, and we are considering having the
Special Ed. teachers give the test with their students, to better prepare them, provide
motivation, and encouragement to do their best. Some students, whether general or special
ed., may simply be feeling over-tested - tired of too many tests.
123. It took up lots of library computer time so classes could not do other activities.
124. Our MAP Lead at BF Day is very accessible.
125. This test is not in the format that students are used to using and I believe that brings scores
down. Also, if someone is tired or not well, it would be very easy to just fill in multiple choice
without much effort put forth. Although it has been explained to me, I am not convinced it is an
accurate picture of student performance.
126. It's too close to the MSP testing date. Too much assessing in too short a time. The students
tune out and don't care. This is a major problem.
127. Who is the data coach?

We should have known from the beginning the importance of MAP to our school and the info
on when scores are invalidated. It seemed very unfair to invalidate the scores of kids who
made great gains -- and thus spent more time on their reading tests! Don't reading better and
reading for a longer period of time go hand in hand? Get an educator on the team that
made that decision -- one who hasn't lost touch with classroom realities.
128. It would be very helpful to have some sub money to help with administration of MAP.
129. The spring window was late April to early May. More appropriately, it should be late May to
early June.
130. Is there a specific person at each school the district identifies as the proctor?
131. Technology glitches can waste valuable time that could be spent teaching. I hope we can iron
these out next year.
132. Our school moved into a new building, and we were completely overwhelmed with issues
related to that. MAP was rolled out and introduced to teachers in September in the midst of
this great flux. Abysmal prep and support to teachers. Really now. Never seen such a poorly
planned introduction. This on top of not using DRA and switching horses midstream there too.
133. We have a great Computer Lab and teacher at West Woodland.
134. none
135. Although my students knew much of the information required on the test, their expressive and
receptive language challenges resulted in them not being able to understand what was being
asked. Also, they lack the computer skills necessary to be successful on this test. Many of my
students require one-on-one assistance to ensure valid results.
136. It worked well the second time around because we had amazing support in charge of the
technology part, but if that weren't in place again, what a bust it would have been.
137. MAP added HOURS to all of our schedules, yet I didn't get compensated more for all that
extra work. Did anybody get compensated more?
138. MAP testing took too much computer lab time. Other teachers could not schedule projects and
research as there were no computers available because of testing.
139. I do not have opportunities to observe my students taking the test (except once for a minute
or so).
140. More flexible open testing windows to accommodate varied schedules for Homeschool
Resource Center
141. We've done very little in the way of incorporating MAP data into our curriculum, instruction and
assessment.
142. Seems like three times per year is more than necessary
143. We need to train the IAs who help the students with accommodations. I felt as though some
students received too much help, and some students didn't receive enough help.
144. Think it's best to give math MAP test during math time and reading MAP test during LA time.
145. I know MAP is given during teacher planning time, but I think kids do better with teacher
supervision. One of the schools in which I work used the library for MAP and MSP testing,
resulting in it being closed to all but limited student use for over 9 weeks.
146. Some of my 5th grade advanced students took 6th-and-up level tests and they were quite
frustrated with the level. I don't agree with the way we administered it. I think all students
should take their grade level tests unless there are strong reasons.
147. It is a waste of time. Please get rid of this test.
148. None. Our school had a great MAP team that coordinated the testing well.
149. do not like how it takes time out of the instructional day
150. Several (about 7) computers froze routinely during MAP testing - there are SO many steps to
log back on, so lots of time was wasted.
151. The spring testing window felt too early - I would like it to reflect more of their end-of-year
knowledge, but it was given too early to do that.
152. The idea of MAP testing is very good, but the logistics of tying up computer labs for long
periods of time impacts instruction in a negative fashion.
153. see above
154. MAP testing is done during our PCP schedule to lessen the impact on classroom studies.
However, it does cut into 9 weeks worth of library and computer lab curriculum to achieve this.
MAP should be given in the fall and spring with an option to test at risk kids during the winter.
155. Absorbs far more time than it is worth
156. None.
157. Not sure WHO will be providing teacher training for the laptop cart roll-in to class on test days.
Is this the ET's job? It also seems like we will need a much larger testing window since we will
have to set-up shop in each classroom prior to any actual testing being done.
158. We discovered several questions that were WRONG or poorly worded in the testing. Really
poor oversights throughout the test...
159. Again, it monopolized the computer labs, preventing all other teachers from using the lab for
several weeks at a time, thus hindering our ability to deliver the kind of instruction we intended
when setting our curriculum.

In the future, if the MAP will continue to be required of us, extra resources (such as district
staff to administer it AFTER school, or additional computers) should be given as well so the
other fifty teachers can also use the computer labs for instructional purposes.
160. It makes no scientific sense to give a test of this nature three times a year.
161. It's still too hard to find a teacher's name and class when "registering" a new student. If the
names were alphabetized it would save a ton of time!
162. oh, sorry - I added this to the previous page.
163. Review and edit all questions before submitting them for students. There were mistakes
and/or questions that were puzzling to the students (and sometimes the teacher). Clarity,
especially for elementary students. They should not have to wonder what the question is
trying to say, or what they are supposed to do.
164. many of my lower functioning students were unable to stay focused throughout the entire
assessment
165. Areas like the library and counseling center in the building were shut down for long periods
during testing.
166. needs to be done again for those of us who didn't catch on the first time
167. Not enough time was allocated for math MAP testing, so students missed class(es) for
make-up/finishing.
168. District admin seems good, school admin of testing less functional
169. I tried promoting good effort on these for the last one especially, but I failed to convey what we
could do with the pieces of information that resulted, which I think left some of the skeptical
kids still skeptical.
170. All teachers should take this seriously and their students will improve in their MAP
performance also. My students are very motivated now and they know that they need to learn
to prepare themselves for the HSPE next year.
171. The multiple tests seemed to take a very long time for some students to finish.
172. Our administrator tried to rush everyone through by giving two tests one right after the other. I
noticed that my students were frustrated by this because many of the questions were repeats.
Unfortunately this caused some of them to have an apathetic attitude and "throw" the test by
choosing the wrong answer on purpose. Therefore, I'm not confident that their results reflect
their true abilities. Each test needs to be administered on a separate day/time so that this
doesn't happen again and gives the best reflection of the student's abilities.
173. We did a good job in administering the MAP at our school; we just hope that next year the MAP
assessment is more closely aligned with the Everyday Math program that students use on a
daily basis.
174. Once again, I would like to see how the individuals do on each individual question.
175. It was easy and better than WASL, time-wise.
176. Our school's MAP coordinator was unorganized. She had students retake the same test (for
Primary MPG) instead of taking each of the two different tests for each subject. For our
intermediate students, she refused to leave her seat and help proctor, even though they had
been told that the math questions could be read to them. (I have some ELL students and
SPED students who would have truly benefitted from having math questions read aloud).

Students experienced test fatigue. The test was too long for one sitting, but that was the way
our school planned for the intermediate students to take the test.
177. At our school most of the scheduling/coordinating fell to the librarian who did an excellent job,
but this was a lot of additional work for her this year and impacted how she was able to do her
regular work. Other staff in our building who were supposed to help did not step up or do what
they needed to.
178. Map assessments are not authentic. NOT.
179. It's difficult in a school without a computer lab.
180. Chairs at the monitors can be too high for some students. Remind students to take their time,
and that just because they finished first does not mean that they got all the questions correct.
Emphasize reflection before the student answers the question.
181. This took 1/4 of my classes' technology instruction for the year and diverted it into testing. It
also was very cumbersome for the primary grades. I have concerns about the design of the
test: it assumes all learning is linear, with a harder question possible only after answering an
easier one, and its click-the-mouse, no-chance-to-review-answers methodology for answering
questions is exactly the opposite of how most children use computers and controllers to play
games, where going fast and being able to start over and try again are built into most game
design. This takes an entirely different set of test-taking skills to do well at, yet no training or
curriculum was given to either the students or their teachers about how to handle the switch
from test booklet / fill-in-the-bubble tests.
182. Way too much time - we went 6 weeks with no computer lab, missed 12-18 PCP slots to
administer this. Kids were burnt out and would not try on it by Spring, particularly those kids
being asked to test on incredibly difficult material, and it was way too close to the MSP. We
should just give it twice, late Oct. and early March, then perhaps once more in June for kids not
making steady progress. Use the spring results for September/Oct. thinking.
183. I am truly upset when kindergarteners have to retake the tests several times only to get
results that say "The proctor had terminated the test". This was not true. I was present when
the tests were given and even got a score each time. This is unacceptable. It is a waste of the
students' time and mine.
184. The students I work with are too young and simply want to finish the test because they are
bored with it. I tried a variety of motivational strategies, none of which were effective.
185. I lost 10 weeks of instructional time in our library. While we were able to make accommodations
for most classroom needs, the administration of the MAP test eliminated access to our library's
collection for 10 weeks. This is too much. While I am working with our administrative team to
alleviate some of this strain on our program, I am also hoping that the district will continue to
support larger buildings with more technological infrastructure.
186. I think MAP should be offered in an audio version for the intermediate students.
187. Seemed to work very well, and our Assistant principal did a great job administering for the
entire school. A very challenging task!
188. It creates too much testing.
189. We need a better proctor script. The one that comes in the binder is too dry and talks about
things kids don't actually need to know. It needs to be more motivationally focused - here's
how to do your best, etc. Without an effective proctor script, kids lose interest and don't do
their best.
190. The spring session occurred too early in the school year; it should have been late May or early
June. My students learned a lot of important material after they took the spring "final"
assessment, so their progress was not accurately shown.
191. What about the norm referencing of ELL data on the math assessment? The Math
assessment tests both reading and math ability so it is difficult, if not impossible, to know what
the students are not understanding if we do not have normative data.
192. It was administered fine; it's just that the whole idea doesn't work for my students.
193. Availability of screener year round.

Computers networked to print out, particularly for screener.


194. We need a longer testing window in order to test students during the optimum time of day. My
students had to take the test the very last hour of the day, which I don't feel yields the
strongest testing results. It is also disruptive to the PCP schedule when testing has to occur in
the afternoon.
195. Why are we not testing later for the spring window?

I think the data would be more accurate if we tested as late as possible in the year. Given the
quick turnaround time for MAP results, why isn't this possible?
196. See previous comment. MAP should be supported with staffing dollars to buildings.
197. It takes way too much time away from instruction and uses our library space for way too long
and therefore classes miss out on their regular library time for several weeks in a row. Also,
computers that could be used for instruction are taken up with testing.
198. Our data coach was hugely helpful both during and after testing windows. Her
responsiveness and knowledge about the instrument and available tools have been greatly
beneficial.
199. I don't think computerized tests are at all engaging or motivating for students. They are
passive and boring and the kids just seem to want to click on something and get it over with.
To be engaged they need to be writing.
200. Since my students do not learn on computers, or use computers to process their learning, I
found the method of computerized administration a poor fit for my students.
MAP in Seattle Schools June 2010
I/my school engaged students and families in MAP in the following ways (check all
that apply):
1. did not do any of these
2. i am not sure
3. We didn't do any of the above in the primary grades
4. I sent home an email explaining what to look for and how to take typical growth with a grain of
salt depending on the initial scores.
5. students have individual map goals for growth
6. Our teachers do not use MAP results
7. I'm not a classroom teacher but I supported the teachers!
8. I only clicked this to the next one, I did none of those things nor was I asked to
9. Set goals with students individually.
10. There was no specific family engagement. It happened based on what teachers decided.
11. n/a
12. Have talked with families of target students the highs & the lows.
13. Have done, due to lack of information, almost nothing
14. Some teachers discussed map scores with students.
15. I don't teach this subject (reading/math) - I'm pretty sure this occurred in the other classes
16. I did not engage families much
17. Don't know.
18. sent home the results
19. Students were removed from class to take the MAP
20. My school is uncomfortable with talking to parents about the MAP.
21. no engagement
22. None of the above, actually, but this survey is requiring me to check one of these boxes in
order to move on.
23. Tried to ignore MAP as much as possible as other teachers did.
24. posted explanation on the school's website
25. n/a
26. unknown
27. sent flyers home
28. I had to check something but this is NOT applicable for me
29. As the coordinator I talk to most students about their target scores and wanting to do their best.
I also included information for families in the monthly school newsletter.
30. Talked with families in SIT meetings.
31. I know other grades did more, but not in kinder
32. none of the above
33. None
34. nothing has come up
35. None - are teachers allowed to be in the room during the MAP?
36. Not applicable this year. I had no 9th graders
37. Sent home flyers about MAPs
38. School determined not to use the data from this test, thus the training was not leveraged to
improve student learning.
39. teaching kids how to do well
40. Asked students to set goals based on lexile ranges
41. Sent home letters with grade reports to the Parents to help them understand the Test Results.
42. Again, very little coordinated planning around MAP done at our school
43. I encouraged them to try their best but also told them it was an exercise designed primarily to
justify the existence of bureaucrats at the JSC (John would not be happy with this colossal
waste of time and money by the way. Nor would he be happy with the idiot control freak who
now has his job.)
44. fall p/t conferences
45. none of the above
46. None
47. I stopped talking about MAP with my students because the whole thing has turned into an
exercise in futility and I don't want to make things worse for my students. We need more time
for instruction, not less. Again, your survey insisted I click a box in order to move forward. I
had to click an answer that is in error because no other option was available.
48. winter conferences offered to discuss performance. 15-20% attended.
49. Sent letters to students after testing in English language only.
50. I did not do any of the above. And why do I have to check one if none of them apply?
51. i don't know
52. none, we only did this in the spring
53. I have NOT talked with families about the MAP because I do not feel confident enough about it
to discuss it or explain it to families.
54. I missed interpretive training and do not know what to tell students or parents.
55. not applicable - specialist
56. Please send out reports sooner. There was too much lag time between when the kids took the
test and when info went out to parents.
57. Releasing data for the first year was very disruptive for our staff and families. Guidelines were
to not release to public the first year. We had lots of questions from families about the tests
and what they mean without getting adequate training. We need adequate training to address
ways we can use the data to help our students progress and address families inquiries about
the test.
58. Not certain
59. None of the above (I had to check something)
60. I spoke with my students about the importance of trying their best on this test, and with parents
about the importance of having students rested, well fed, on time.
MAP in Seattle Schools June 2010

What other feedback do you have about MAP?


1. It is not an effective assessment for kindergartners. The questions are not clear (a student
may be asked to put a K on the plate, but is not told to put ONLY the K on the plate). The
things that the children click on make noise and animation so getting the wrong answer (by
continuing to play with the screen and adding more cats, trains, etc.) is MORE motivating than
getting the correct answer (where the student may only be able to click on 1 or 2 cats and
have them meow). It should never be more motivating for a student to get a wrong answer. It
should either be less motivating or a neutral experience.
2. This test is useless.
3. It was impossible for us to attend lead/team training during the MSP testing window.
4. I would like to have the results for all testing for the students I teach.
5. More explicit explanations to parents about typical growth and the numbers themselves.
Parents were not happy with a typical growth of 7 points or so; they wanted HUGE jumps.
6. None
7. We should spend more time teaching the best way to take the test.
8. If a student shows incredible growth, why are the scores invalidated? Is a 45 question test like
this really statistically valid?
9. Nothing at this time.
10. I just don't feel the data is useful; it tells what we should be focusing on for students based on
their level - I do not need MAP data to tell me this. Additionally, I think many of the questions
are poorly worded and irrelevant, and the use of "red herrings" doesn't help me understand my
students. I find it odd/sad that in the 21st century we have gone back to assessing our
students using a multiple choice test.
11. Map is the most useful standardized testing that we use.
12. I would have liked to take the test, myself. It would be nice for the scores to be automatically
recorded somewhere after the student takes it, so we don't run the risk of losing their score if
they close out of the test.
13. Reading seems to have different thresholds than math, which needs to be made more clear.
Do elementary students have different tests or the same as middle school? I'm being told they are
not the same, but I thought there was a grade 2-10 continuum. How do we address the kids in 6th
grade scoring 250 and then not showing much growth because that equivalency is algebra or
geometry class? What about the ELL kids? Why can't we assist with vocab so the kids can at
least access the math and not be penalized for their reading skills? When am I going to get to
see scores from other schools like ours? What is the district's plan with these scores as it
seems only the LA and math teachers are on the hook for improvement?
14. Seattle Public Schools has not had a book adoption in many years. We are relying on
Readers' and Writers' Workshop which are pieces of the puzzle but not the entire puzzle.
Students need intentional teaching of the many skills tested on the MAP. We do not have a
curriculum or materials that support this. (Though we did at one time.)
15. Let us use our financial resources more wisely than wasting them on more MAP
16. In looking over students' shoulders at the types of questions they received (particularly
students scoring at the upper end), I wonder how well Readers Workshop and this test are
aligned. Do mini-lessons provide the depth of practice/exposure that seems to be required in
order to answer questions about irony or iambic pentameter? There seem to be some very
high-level literary terms incorporated into the test, terms I didn't use until graduate-school
literature courses. I would hope that our district is looking carefully at the resources provided
to reading teachers and the professional development provided for Readers Workshop to
ensure that the MAP test is a fair assessment of what children learn in this model of reading
instruction.
17. I feel very concerned that young children are being tested via the computer. This method is
not developmentally appropriate. I felt that the scores were not a good reflection for the
academic progress for many of my students. I did not find the data useful for informing my
instruction.
18. I think the MAP results can be very useful to teachers - IF they had time to actually analyze
the results. Their time already has so many demands upon it, however, that this is a little
unrealistic. Not to mention how much instructional time it takes away from the schedule.
19. none
20. Give us access to the data, so we can do our own reports
21. Why post the scores for the students to see at the end of the test.......to see and compare!!!!?

Wait till after the test window closes before releasing scores to teachers.......then there is NO
retaking of the test to try for a better score!
22. none
23. I think it is not constructive and, instead, is detrimental to learning. the students are
discouraged and frustrated
24. It's OK but I am not wowed by it. I had a family who did not want their kid to use the internet at
school. Although he took MAP, his lack of savviness was evident in the results. In some ways the
MAP results are over-detailed for general instruction and class placement, yet yield nothing to
help me place kids for reading groups. It is not a panacea.
25. I would like more training on how to interpret the results.
26. We need more information on what questions the students are being asked.
27. It took too much time. It filled up the computer labs for too long.
28. Repetitive comments about there being too much testing, and it taking up too much of our
computer lab time.
29. The district needs to rethink the MAP schedule for the year. 3 times a year plus MSP testing is
way too much emphasis on standardized testing and lots of time out of learning and
achievement.
30. See e-mail from Kit McCormick, Garfield
31. Accomodations would be useful for special education students
32. Would be a useful report to show up online under students' assessments.
33. It should only be done with true 9th graders, if at all, in the high school. It also needs to be
aligned with what we are teaching, because right now it is not, and I am not sure that the other
goals of district alignment complement the change that would be needed for instruction to be
"informed"
34. The accommodations need to be fixed.
35. N/A
36. The assessment was a burden on our resources. There are not enough computers in our
school, so the library was not available for student use for almost an entire quarter.
37. I don't believe we should be administering this test.
38. It is insufficiently cost effective. Go back to Iowa Testing of all students, once per year.
39. Kids are figuring out how to cheat and inflate their scores. I observed this during the spring
test after suspecting it during the winter one. Students jumped 26 points without doing work in
class. Data didn't match observations.

Also, looking at the time a student took to take the test didn't correlate to success/lack of
success. Some kids jumped several points while taking less time. Some kids took more time
and tanked the test. Some kids retook the test and still failed to see a positive result- which
they really wanted.

What this might mean is that caution should be used when communicating what certain data
points might mean when analyzed.

I got more out of analyzing what I knew about students' lives as a causal factor to results
(negative).

Using this to evaluate me, at this point from what I have observed and tried to make
meaningful, would be a grievous mistake. My actions have little to do with much of the results
many of my students are showing.
40. Make this searchable by STUDENT. High schools and middle schools are severely crippled
by the way it is implemented. The Website needs a better search feature also - it is not
intuitive.
41. There is some resentment in my building about how this was just another thing the district
dumped on us. It was so difficult to access the scores, and harder still to interpret them. We
lost 4 days of instruction...for what, exactly?
42. Because the MAP is technology intensive and uses educational technology that is otherwise
used by classrooms for classroom work, it would be so helpful if all grades were tested only
twice a year (which gives teachers and counselors plenty of data to assign students to classes
and address learning issues), with only targeted students tested midyear. That would shorten the
amount of time that MAP supersedes classroom use. If we ever do online MSP, that will be 4
subjects and additional days; we'll be up to a quarter's worth of days that labs are unavailable
for classroom use. It is a bit unmanageable in 3 cycles. If this is to have long-term
sustainability, we would either have to add a substantial number of computers (another lab's worth)
or reduce MAP to a twice-yearly cycle. Students would take it more seriously if it were only twice
a year. They no sooner get done with one cycle than it seems to them they're on to the next.
They really don't do their best work in the winter cycle.
43. I think this test is receiving far too much attention from both the district and the parents. I don't
see it informing instruction for me at all, as I use a variety of in-class curriculum based
assessments that actually match what I have taught. It's another example of an assessment
that was implemented for one purpose (in this case to supposedly inform instruction), and has
already morphed into something else. I fear what the test results will eventually be used for.
44. The district talks about this as a measure of growth but teachers see it as a measure of their
own performance.
45. How is Seattle's academic curriculum aligned with the MAP assessment inventory?

If MAP is being used to assess student growth or performance, are we then going to start
teaching to a test rather than teaching curriculum that's culturally relevant to our students?
46. IT SHOULD NOT BE USED FOR TEACHER EVALUATION BUT ONLY FOR INFORMING
INSTRUCTION!!!!!!!!!!!!!!!!!!!!!!!
47. n/a
48. We were told that this was a test year for MAP but the district is using this test for placement in
math. I have been very disappointed. Many families have talked to me and feel it is a waste
of time.
49. Thanks for expanding testing windows for Secondary students. We still are not sure about the
efficacy of the mid-year tests, though.
50. Takes an awful lot of time away from instruction.
51. Students enjoyed taking the test. Some missed problems because they reverse letters or
words, not because they didn't know the material. This affected their scores.
52. I'm curious if MAPS is going to be used to evaluate teacher performance; and if so, how will
that be done, if teachers are exchanging students for Math and Reading classes? Doesn't
seem fair to evaluate the homeroom teacher. For my reading class, I had 3 of my 23 students
in my class, so I really do not think it right to attach my name to the results. Something to
really think about. I am also on the MAPS team for my school. It was not transparent at all
that these scores would be used for Spectrum/AP eligibility and placement in middle school
math classes. I would have liked to inform my students of this use, since I teach 5th grade.
53. Should MAP be used as a student achievement measure or just as a progress monitoring
measure?
54. too new.
55. Please don't use again. Use PSAT, AP, SAT etc
56. This is a very frustrating test for children to take -- the expectation is that they will get 50% of
the problems wrong -- this is not a way to encourage kids. Once they get problems wrong,
because they make no sense to them, they lose confidence and enthusiasm in trying the next
problem. Most students got presented with problems on concepts or procedures that they had
never been exposed to -- this makes them feel stupid or betrayed.

I feel we are putting a lot of time, energy and student discomfort into a test that is not
practically useful for most teachers. At the very least we should stop doing it for grades K-2.
57. Please consider removing this MAP program. It's not a good use of time, money, energy, and
resources. Let's use all those resources in ways that actually help kids learn. There are
enough tests/exams/assessments already.
58. I strongly disagree with how the student scores are marked as not valid based on the amount
of time spent on one test compared to the other test. This is unfair to me as a teacher and to
my students who have tests that are marked invalid.
59. Our MAP administrator needs to be better trained.
60. I would like to know how teachers will be evaluated in relationship to MAP test data.
61. Very concerned about our advanced learners who score so high in the fall (99th percentile);
when they score 98 or 99 in the spring it doesn't meet typical growth.
62. What MAP score do kids need to have to be put up a grade level for middle school MATH?
63. This is only a test and nothing else. Trying to use this as a way to evaluate teachers
demonstrates the ignorance and lack of any vision in the administration of our schools.
64. This assessment is crap and a waste of time, effort and expense. Please get rid of it!
65. We need focused professional development about how we can use MAP results to inform our
practice. This has not been addressed all year even though I have asked about it twice.
66. Could use a MAP Coach to disseminate scores and give useful/practical advice on how to use
the data instructionally for our elementary team.
67. How does a teacher use the MAP data to make daily and continuous instructional decisions
for students who are significantly (years) behind their high school peers academically?
68. Three times a year is too much. We still need to administer the Developmental Reading
Assessment to get real, applicable data.
69. I really worry about it being used to evaluate teachers because I know that all students do not
try their hardest.
70. I had a student retake the MAP because he rushed the first time. He improved his score by
about 20 points. Guess which score got invalidated? You guessed it! It was the higher one.
71. I think it is a great test, which gives an honest assessment of where a student is on a larger
continuum. This, especially the strand data, is more helpful than getting feedback late in the
year/early the next year for students I no longer have, feedback which doesn't really tell me what
areas need improvement.
72. I feel the test is heavily biased against our ELL population. Many of the questions are directly
related to how well the language is understood. I don't feel it is an accurate portrayal of
students' abilities. The DRA is much more informative to my instruction and gives me better
information about the student.
73. Staff members, including those who have had the training, still don't fully understand the
MAP, or even speak as though it is useful data. We need a better understanding of MAP
results at our school.
74. Please don't force our poor librarian to abandon her own job for weeks at a time in order to
make MAP testing her whole life. This wasn't fair to her or to the kids.
75. If you expect teachers to understand and use this data to support students, you need to
provide them with the time it takes to do so. I already work 60-70 hours/week, without using
MAP data at all.
76. Without time to use MAP data intelligently, the testing is a waste of time. Some of that is an
issue for schools to deal with, but the district could certainly make it easier by providing
release time to staff while we are learning to utilize this new system.
77. I would like the test to be administered on sequential days; my students had a week and a half
between days.
78. I have serious concerns about the lack of alignment the MAP has with the curriculum taught.
What I saw of the math MAP test did not align with what I was teaching in EDM. I am also very
upset that families are being given the results of the MAP test by the district. At this point, if and
when parents are given the results should be a building discussion, especially since this is
the first year we are using this assessment.
79. What a bunch of bunk.
80. I do not think the website is teacher friendly. I was never able to create my own report that
showed growth from fall to spring (only later did I receive it from my principal).

The scores were also confusing because the reports said a student made or did not make
progress, yet their RIT score was at the 50th percentile, and it is my understanding that that
means they are where they should be in the school year.

I also would have liked to see grade equivalent options.

Finally, I think it would have been more beneficial to administer the test in June, especially
since the results are available immediately following the test.
81. I am very skeptical that a computer test can be accurate for 1st graders with very little
computer/test taking practice. Several of my students got either very high or very low scores
and their classroom performance was the opposite. Also....some kids went backwards in their
test scores and yet they are flying in class work. I just don't believe this is an accurate
test....bring back DRA for reading!!!
82. Kindergarten should be exempt from this test since it is not at their level.
83. The language about the scores is not student-parent friendly. Therefore, it is difficult for
students to set goals. The RIT scores don't really show concretely what the kids know and can
do. Examples should be provided to parents about the tasks their child had to complete in
order to attain the RIT score they did. To the kids and parents, it is "just some number"---too
abstract.
84. I don't think a computer-based test is a good way to measure a student's growth in math or
reading.
85. Many of the younger kids just click through the test. One of our 5th graders finished the math
test in 13 minutes. What do the results really tell us about the students' levels?
86. None
87. Time sequence between tests was not conducive. Time between Winter and Spring was
insufficient.
88. Why did we have to do MAP so close to also doing the MSP? We also do Writer's Workshop
and Balanced Literacy at my school, and they require ongoing assessments also. Too much
assessment time in relation to time for teaching, practicing and mastering skills.
89. MAP is a very useful tool for teachers.
90. I think the ratio of useful information to the cost in terms of dollars, time and resources is
extremely low.
91. Should only be given twice yearly. Team of District experts should assess the data then work
with the schools on narrowing down focus areas.
92. The data is great; I am gravely concerned, though, that it will be used in high-stakes ways it
is not designed for, such as student placement and teacher evaluation.
93. Isn't it a conflict of interest that the superintendent is on the board of the company producing
MAP??
94. I STRONGLY feel that it is inappropriate to lump ELL students and students with IEPs into a
measure that shows "typical growth." A student who arrived a year ago from China will show
typical growth for a new language learner. That growth will look very different than the growth
of a so-called "typical" learner, aka, native English speaker. This, however, is not accounted
for in MAP data or discussions. Thus ELL students and students with IEPs are ALWAYS
presented as "less than." It is a deficit model and puts unnecessary pressure on both
students and teachers. Quite honestly I am sick of this ignorant approach.
95. It is an administrative waste of time. The questions are not appropriate to age/grade level; they
are confusing and convoluted. The test only tests the student's ability to take the test.
96. I would really like to be able to see an easy-to-read grade-level correlation for each student's
ability in reading. I would also like to see where MY students are struggling (grammar,
vocabulary, comprehension?)
97. Please allow access for related service personnel. Also, I am not sure that everyone has the
level of training in psychometrics to fully understand the information provided.
98. It takes too much instructional time.
99. In the long run it seems like a good idea but it has no connection to what is happening in the
classroom.
100. Get rid of it, please. It is not useful in any way.
101. For many of our ELL students it was not so much a measure of what they knew as of how
familiar they are with computers. It assumed a baseline knowledge of computer use that
many of our students did not have and still need a lot of practice on.
102. This survey did not allow the "other" choice for logistics. It kept coming back as "This question
requires an answer" until I was finally forced to choose one that was not my actual response.
103. I love the data it gives us.
104. I don't like it
105. For some students and teachers, a lot of instructional time is taken up in testing this year (i.e.
MAP, PSAT, WLPT, HSPE).
106. I would like to see more of what is on the test, so I have a better understanding of what it is
evaluating.
107. The previous question required me to lie.
108. I wonder how well the MAP is aligned with Readers Workshop and the CMP. It seems that
both curricula are geared toward student discovery and exploration, and the MAP is geared
toward more rote information.
109. The MAP test is harmful to students.
110. I wish the report making tools could be more customizable. I also wish the data could be
imported to Excel and to GradePro and to the Source.
111. I have concerns that the spring test was much more difficult than the prior two tests and was
not read to my students, so it is hard to compare and figure out the growth. I also have
apprehension over the fact that our superintendent sits on the board of the test.
112. MAP is a large improvement over the quarterly math tests, but it is just another test in the
school year that takes teaching time away from the classrooms. We already do DRA, MAP,
MSP, Running Records, and EDM assessments. Each assessment takes a minimum of 2 days
to administer for a class of 28 kids; that is 6-8 non-teaching days minimally a year!
113. For the time invested it has a small return for the students in my school.
114. N/A
115. Again- I would like access to the questions asked (for example, it would help to know exactly
what is being asked in the area of, say, "Concepts of Print")
116. None
117. MAP data is overwhelming. The sheer amount of data on each student is prohibitive to
teachers who teach 150 students, all tested three times a year. In order to make data useful
to secondary teachers, we would need two extra days with substitutes per testing period, one
to interpret data and one to conference with students about their results while the rest of the
class is supervised. If we are taking these results seriously, we also need to give students
meaningful incentives to try their best.
118. The whole idea of MAP testing is a bit confusing as we as teachers are told one thing and
then something totally opposite happens. Like using the MAP scores to determine if a 5th
grader gets into Honors Math in 6th grade. I never knew the MAP scores would be used that
way. How can we teach the curriculum that relates to the MAP questions so that the kids have
the best shot?
119. It is a very long assessment. The students get a glazed-over look after they have been working
on the computer for a while. I have seen many students just start clicking any answer because
they want to get it done. I had to constantly remind them that it wasn't a race and they could stop
and take a break. The computer would wait as long as they needed to refresh themselves and try
their best. It is a long time for the student to be fully engaged in the work. Also, much of the math
is also a reading assessment. They have a hard time with the reading parts of the math test.
120. I have answered a survey like this recently. I would prefer that all my students take one test
that is identical, rather than having different questions on each test. I cannot see the
questions they answered incorrectly as I am not given a test summary at any point after the
testing. I would like to see a test that I can hold in my hands and see how a student answered
a question incorrectly so that I can see what their thought process was and how I can help
them. On the MAP, I have no way of seeing each child's test. Rather, I am given a summary
of each strand. I would prefer if each student in a grade level in the district was given the
exact same test, like other national tests, so that I can see what they are answering
incorrectly.

Also, some of my K students were trying to finish quickly and I could see that they were not
trying their best on the test.

I feel that a lot of the concepts on the test are not in our district-provided curriculum and that I
have to watch my class as they are taking the test and take notes so that I know what is on the
test and what I need to cover in my teaching.

So, my opinion is that I am not in favor of the MAP and would rather have a test that is the
same for all students. I have never had students take a test where I cannot see the individual
results of how they answered on the test. Also, I feel that I now have to, on my own, learn
what is being asked on the test and incorporate that into my teaching so that my students do
well on the test. A lot of the material is not in our current curriculum.

I am not against students taking tests on a district-wide level, but do not favor this sort of test.
In California where I was a teacher, all students in a grade level were given an identical test in
my previous district and then we could see the individual, school, and district results. I could
also look at a hard copy of the students' tests so that I could see what questions were
answered incorrectly. To me, this is MUCH more helpful than the MAP results.
121. Too much testing! Please stop!!!
122. Would like to know if MAP can/should replace classroom-based and curriculum-based
assessment. I administer CBAs as well as the Everyday Math assessments; not sure if that
is necessary. How can they work together?
123. Again, the amount of time required in the computer lab to complete the MAP testing was huge
and adversely affected opportunities for students in other classes and grade levels to
complete projects. How will this problem be addressed?
124. There is way too much testing. This interferes with instructional time, not only for the classroom
teacher but also the tech lab, library, and displaced teachers. One test in the fall and another in
the spring would be enough (for K-2 only, as 3-5 also have the WASL in the spring). Some
students could be tested if the teacher feels that the information would be helpful (and not
replicating classroom assessments).
125. none
126. It took days away from teaching to give math tests when I teach History. It also made the
school library unavailable for class use while the testing was going on. Since what is tested is
Language Arts and Math, those classes should be used for the testing.
127. As I suggested last year the student strand data should be linked to specific sections within
the discovery text to allow teachers and students to be more prescriptive with respect to the
current curriculum.
128. nothing for now
129. I think it is a useful tool. It is too bad students don't have the opportunity to change their
answers.

It will be interesting to see how it correlates with the MSP.


130. We were not informed that this data would be used for EVERYTHING! I do not feel this one
test gave a well-rounded view of my students.
131. Testing after a holiday or break was not good for middle school students; they were not at a
level of focus that would accurately measure their success. We should not test after a
break at all. Also, the testing window needs to be longer to allow for a better schedule for K-8
schools that have a limited number of computers. Also, more frequent training on MAP
would be great.
132. The MAP test does not accommodate learning-challenged students in the upper
grades (anything past 3rd). I was told to use the lower-level test that had the voice prompts
for reading-disabled students, but found that it never accelerated enough to know the true
level of the students.
133. It is very useful and measurable.
134. While MAP may be useful, compared to losing two months of Title services and two months of
library services, it is certainly not worth it. Unless our school can get rolling laptop carts or
figure out another non-invasive, non-service-shut-down method to give MAP, we should abort
it.
135. I would love to attend professional development about how to use the results, prepare my
students to test, converse with families about the results, and better administer the test to an
entire K-8.
136. Kindergarten MAP is very confusing!
137. Just because it is electronic does not make it a great way to check children's progress. I love
computers and tech, but there has to be a better program out there. Maybe if we had up-to-date
hardware and enough to test more than 4 kids at a time it would not seem so....
138. As a teacher, I tried the MAP test, and it was very difficult to read. It might help to change the
font and also structure it in a paragraph instead of all the way across the screen.
139. none
140. I was disappointed that my special education students were required to take their grade-level
MAP tests, when they are working on reading, writing, and math at least one to two grades
below their grade levels.
141. Transitional kindergarten students were not proficient in mouse usage; the results are barely
valid for them because of a lack of exposure to computers.
142. Timing is the complaint I am hearing from teachers. However, we need the MAP data for our
all-school sit...and we need to have it before it's time to make recommendations about
retention...So I am not sure how to resolve it.
143. There should be a stipend attached to the proctor role.
144. The MAP strategy of labeling students and teachers as below average, average, or above
average is extremely detrimental. It shows a lack of respect for students, parents, and
teachers. In this system, too many are labeled "below average." Our focus should be on
helping all students meet or exceed standards. It should be on creating a collaborative
professional environment in which all educators can demonstrate success in helping students
learn.
145. I think it is helpful to remember that this is a snapshot assessment. I noticed a few of my
students' scores dropped significantly in winter or spring. When asked if they knew why, they
reported that they were tired or feeling sick that day.

A few questions students had trouble with appeared more content specific rather than reading
skills/strategy specific.
146. Too much time is spent taking the tests. The tests did not mesh with our curriculum. I know of at
least one teacher who felt compelled to teach outside our standard curriculum in order for
his/her students to perform well on MAP. It's questionable to use students' scores to
determine a teacher's effectiveness when so many factors are involved in a child's academic
success.
147. I liked the tests, and feel they are pretty accurate in the spring.
148. We seemed to have many more technical problems with MAP during the spring window (i.e.,
skipping questions, computers "freezing", and requiring re-booting). Please consider finding a
way to allocate some of the funding set aside for MAP so that district tech folks can proactively
maintain our computers (by doing things such as re-imaging them) PRIOR to each testing
window. By ignoring the tech issues, the district is not acknowledging a true cost of this
project. And the longer our computers are tied up with MAP testing, the less time there is for
the authentic and engaging learning activities that "best practices" tell us this technology
should be used for.
149. Students are very unskilled in taking a math test on a computer. The scores do not measure
just their math ability but also their techie ability. Students don't take the time to write things
down on the scratch paper and therefore lose their visual connectedness and ability to
manipulate their thinking. Teaching students how to use their scratch paper well will help
toward improving scores, but having the motivation to do well, especially when sitting through 52
problems, is asking an awful lot of them.
150. Unless my students have repeated exposure to using the laptop computers, much of the time
is spent with them trying out various things on the computer even though they know that this is
a test.
151. It is a very frustrating test for students who are not accustomed to working independently and
who require a great deal of scaffolding for understanding; therefore, the results are not an
accurate picture of their knowledge.
152. I strongly question whether the results are consistently accurate. I would like to know the
reliability of this test. I had a hard time believing/interpreting students' results. Some had
extreme drops or very high peaks from fall to winter to spring, while many just appeared
to make no progress.
153. Map is a very expensive undertaking. I assess my students individually three times a year. My
testing provides valid evidence. I work with students in small groups, 2-4 children in math and
reading daily, and assess throughout my instruction. I would prefer to see the money the
district is spending on MAP go toward resources that directly support students: reduced class
size, Reading Recovery, better support for behaviorally challenged students, and time for
teacher collaboration...
154. can't think of anything in particular
155. I hope the spring MAP can be administered later in the school year so we'll have time to finish
teaching as much of the grade-level curriculum as we can.
156. It's taking way too much time away from instruction.
157. Until MAP is better integrated into school's overall academic plan, it's just an add-on test
students take.
158. As I looked at some math problems on MAP, EDM does not seem to align with the standard
too well.
159. Please get rid of this test. It consumes valuable teaching time and does not inform instruction.
Good teachers have good ways of analyzing student work. Bad teachers don't. The MAP test
will not help them and it wastes my time as it does for every other good math teacher.
160. The testing window should be revised so that it does not include the two weeks right after the
students return from a school break, such as Winter Break. Students need about a month
after a break to settle back into school routines and to perform their best.
161. We need to find a way to overcome the logistical problems, because the data is very
worthwhile. As we get year to year data for students, it will help us even more. However,
overcoming student apathy towards the test is going to be an ongoing concern that we are
going to have to figure out at our sites. It has to be more than "goal setting" to work for high
school students.
162. Teachers had NO idea how important this test was in the fall: that it was replacing the DRA,
for one, and that it was such a large part of assessment from downtown, sending our school
in a very new direction concerning assessment. Performance assessment seems once again
on the very back burner for SPS. This is an ITBS online.
163. I feel we have been very accommodating with a test that we had virtually no input on. For the
past year we have given feedback to data coaches and others at the district about our
concerns with the test, i.e. the number of times we take it, machines available, content of the
questions, and validity of the test. Yet the district mandates a testing schedule for the
upcoming year that has no changes, only a week after our last MAP training. To say we
are frustrated with the communication from the district is an understatement. Feel free to
contact me if you would like to discuss this further.

Tom Brown

Laurelhurst Elementary

tjbrown@seattleschools.org
164. 3 X per year is too much. The better you do, the more it becomes a content knowledge test
and not a "reading" test.
165. None.
166. I want to reflect that many teachers are disappointed with MAP and its implications toward
merit-pay. I do find the results useful, but not compared to the amount of work (and $$) that
goes into making it all happen. If we are told to triangulate our scores with our own
observations and our own tests and the new WASL, why are we even spending the time doing
the testing? It seems like technology is being used in its most basic form when we close labs,
or bring computers into classrooms, only for testing! Why isn't there as much of a push toward
getting students onto computers to be used creatively, or at least in real-world applications!?
167. Get rid of it.
168. This test has caused a great many questions from parents that I have a difficult time answering.
169. I recommend we stop doing it because it saps resources.
170. I hope SSD stays with this for more than a year or 2 so the results have meaning over time.
171. Please consider using the funds now used on these tests for technology, as it was originally
intended. We need working computers to provide a world-class education.
172. It's a useful tool.
173. I think that the computerized format is extremely difficult for students with ADHD/attention
challenges and for those with Autism Spectrum Disorder. The computer provides a significant
distraction to these students, and their academic abilities are not always accurately reflected
in their final MAP scores.
174. Education should be a journey, not a race.
175. please repeat initial training
classroom teachers should be required to attend this type of training and to administer
student testing so that they are clearly familiar with what students are tested on
177. I am concerned about the accuracy of some students' data when they have a lack of desire to
do well on the test or have testing anxiety. How can we truly measure students' ability with
one test, especially when students are young (kindergarten) and don't understand the need to
do their best?
PLEASE get the NWEA site to make the data easily transferable to an Excel spreadsheet.
179. I am very glad our school district purchased this program.
180. Some of the questions seemed too advanced for some of the younger grades. Also, of course
kindergarteners would do better after the fall, since they haven't even been around computers
too often, or a mouse for that matter, at the start of the year.
181. I think I might be able to like it if I was able to use the results in a useful manner. Also, it
would be helpful if teachers were able to login and set it up in their classrooms so students
who are intimidated by the computer lab, absent, or need more time are able to work on it in
the classroom. This would also give more flexibility in how the computer lab is
used/monopolized by the MAP during the testing window.
please align more Everyday Math items with the math problems on the MAP assessment
so students can feel even more empowered in taking the MAP assessments three times a
year
183. Blocking access to our computer resources 3 times a year seriously impacted classroom
instruction and work on various assignments in a negative way. There needs to be more
access to technology resources throughout all buildings in the District if this much time is
going to be dedicated to testing.
184. It should NOT be a basis for performance assessments for students or teachers
185. teachers are concerned that the MAP will be used in evaluations; teachers are concerned that
the MAP does not align with standards and curriculums; teachers are concerned about testing
in a format that does not mirror instructional practice
don't put so much weight on it; there is too much reliance on the computer, and it allows so
much guessing since there's no writing involved.
187. It does not reflect, nor does it actually test, what students are learning in math, science, or
language arts.
188. I do not believe that MAP should be used in the manner that it is. Teachers were told
repeatedly that it was NOT going to become a high-stakes test, but in fact, it is becoming just
that internally within the district. This is NOT RIGHT (for students or for teachers) because:

1) MAP is not based on state standards;

2) it is norm-referenced;

3) the questions are highly biased and culturally insensitive;

4) the questions do not change with each test (the students see the same questions fall,
winter, and spring - and thus they just keep answering them the same way, rather than truly
thinking through the answers);

5) young children are not ready to take a computer-based high-stakes assessment yet - it's
still too new, novel, and fun to just point and click, click, click;

6) students that are already above grade level do not have the opportunity to show the same
level of growth as students below grade level because of the way questions are written;

7) the formatting of the MPG questions and ways to answer is not user-friendly or easily
understood by K-1 students;

8) the questions are not based on situations familiar to children and thus lead to immediate
test anxiety (questions about life insurance premiums? state income tax? perceived status?
awareness of limited liability? - These are not questions that should be asked on a well-written
elementary assessment);

9) math questions are written for a much higher reading level than the math grade level the
question is testing.

I am not against testing or assessments. I believe that assessments can be powerful tools to
guide instruction and track student growth. I also believe, as educators, we have a
tremendous responsibility to advocate for fair, unbiased testing materials. Our students, who
are children of color and/or children of poverty, do not need one more thing to demoralize
them or their academic efforts.
189. It seems to be a waste of time, money, and resources. I know the needs of my students and I
meet them. Having the library closed was difficult for the school.
190. I am shocked at the waste this involves for so little return.
191. See other comment regarding having the reports come out sooner.
192. MAP assessments are NOT authentic. You've got to be kidding. MAP assessment is about
$$$$.
193. Keep it! We need MAP, especially at the high school level.
194. I don't think it is appropriate to focus on any standardized test results or pressure students
about it in elementary school. We did encourage them to try their best so we could plan the
best teaching possible for them, but I don't want it to be anxiety-producing. Fact is, the margin
of error is so high that a kid who has in fact exceeded typical growth could show negative
growth at the 5th grade level, if the fall score was 3 points high and the spring score was 3
points low.
195. There are many problems with this test. These problems should have been eliminated before
asking students to take it. In the spring, one math problem at the kindergarten level was so
confusing that two teachers could not figure out what the question was asking. Meanwhile, the
student was in tears after trying many, many different ways to solve the problem. The student's
solutions were very logical, too. This should not be happening.
196. A number of the questions were not what fifth-grade students would ever learn in our school.
We would not want to gear instruction around those concepts. An example would be the use
of the term "red herring" when asking about an advertising technique. We don't even do
propaganda techniques any more, and certainly not with reading. Another example was one
of the poetics terms that was asked about. These technical terms are not something that our
students would be conversant with, nor should they be.
197. I think it is an excellent tool, but again we need training to be able to inform our instruction.
198. Not certain
199. All in all, a good first year - an excellent rollout. I think a lot of people are bothered by how
much power MAP is taking on, though - it's being used for so many purposes that weren't
discussed ahead of time. I don't think it's a bad idea to use this data for any appropriate
purpose now that we have it, but I do want to acknowledge that some people are on edge
about this.
200. We need guidance from the district about what to give up in order to create time for an
administrator to give the MAP test. For example, just one thing I didn't do this year was safety
committee meetings and minutes / notes, safety meetings with staff, all things about safety.
There was a loss of time (to do the regular office work) because of the new time demands of
MAP. Next year we will go back to safety training but give up something else. Suggestions?
201. There are questions about the validity of this instrument for ELL students. It is important that
we either ensure validity or exempt ELs. It is no use giving it to these students if I can't use it to
inform my instruction.
202. As a Middle School SPED teacher, I used the primary assessments. I only had the
opportunity to attend the Middle School MAP trainings at the building and none of the
information applied to my students or my classroom situation. I would like to have the chance
to train and learn how to explain the data coming from the primary assessments to kids and
parents. One issue that I observed with almost all of my students (mentally challenged) was a
lack of test-to-test and day-to-day data that was 'accurate' or reliable. I had difficulties in
signing in a student AFTER having to choose a test, and this meant that several kids repeated
the same assessment within a day of the prior test. Scores varied widely, one day 33% and
the next 70% on the same subtest, such as word families. I liked and did use the website
printouts for individual kids in subtests, especially the word families and blends, as both
targeted decoding skills. It would be helpful to have a printout of what all the tests and
subtests are on, and thus allow me to be more informed and targeted in what I want students
to attempt. The grade level and class reports weren't helpful to me, as all of my kids are "BR"
beginning readers. I knew that already!! The lexile information is somewhat useful, but there
isn't any correlation between lexile, Teacher's College/ Fountas & Pinnell reading levels, etc.
Also, working at a middle school, the library materials begin at level J and don't support
the instructional needs of my kids who are reading in the ranges A-C and E-G. I do appreciate the
chance to provide some feedback, as I think the MAP assessment could be developed into a
useful tool, especially to track some of the incremental growth that is particularly important to
notice in our populations of kids in ELL and SPED. Thanks,
203. I am wondering how much the content of what is tested is related to the content taught and
expected of students. Also, the way the test is set up makes it difficult to assess students who
are not technologically skilled. My classroom data showed different growth than my students'
MAP scores.
204. I tried my best to explain the MAP results to many parents, but many of those who don't
speak English well, and/or aren't well-educated, didn't have a clue what I was talking about,
even though I explained it in Spanish to my many Hispanic parents.
205. A sample/training test would be beneficial to help my students with methods of responding.
206. The amount of time it takes to schedule and manage MAP testing is much more than the 24
hours allotted to each school. Please be more realistic regarding the extra time this
assessment actually takes to implement effectively and compensate staff for their hard work.
207. I am not convinced this is a useful test. My biggest concern is that students do not try their
best across the board. The computer aspect of the test I think contributes to this. There is a
lot of mindless clicking going on, and I have not found a way to get some of my students to
slow down and actually think. I also resent the computer lab being used only for testing 9
weeks out of the year - that's a lot of computer lab time that could be used for other
learning/uses.
208. The winter scores did not seem to fit with the fall and spring scores. Is there a reason for the
discrepancy?
209. Our biggest need right now is a way to "digest" all that is in the DesCartes. Teachers are still
wanting a narrowed-down, specific "diagnostic" for each student, and we are struggling with
finding a process to narrow down what a child actually needs. The ladders strategy did not
feel as effective as it could have, mainly because of how cumbersome the DesCartes was,
and the fact that after using it, teachers still felt like they were "guessing" a bit.
210. Is this just lining the pockets of the Superintendent of Seattle Schools, who has a vested
interest in the testing company?
211. I felt strongly that the tests did not reflect the basic curriculum we are presenting in our
classes. The students encountered questions that were unlike anything they had ever seen,
and were therefore demoralized early on in the process. The test format was totally linear, and
students had to complete each test question before they could move on. They were unable to
go back to address questions. The text of many questions in the reading MAP required
students to track all across the screen with very small print, something they don't encounter in
their classrooms. Also, the timing for testing was very late in the day, and pressed up against
recess time, with bells going off while students were testing. Those students who were still
testing were aware they were missing recess, (and they hadn't been outside all day due to
rainy-day indoor recess.) I feel many students hurried up and answered anything just to finish
up and get outside. Also, students were very crowded into an unfamiliar setting to test, with
almost no room for figuring on scratch paper for the math MAP.

I felt this testing took an inordinate amount of time out of our class instruction, and feedback
via reports was difficult for parents and me to understand. Many students who do very well in
the classroom did not show growth on these tests, and this was upsetting to many parents. I
felt the tests did not reflect the learning that is going on in the classroom.
