What’s the story with the data?

student data

As I proctor my students taking the NWEA Science MAP test, I notice blank stares at the screens. Students slowly absorb the random test questions, trying to put the words into context, thinking back through classroom experiences, attempting to find the correct answer. We have been studying severe weather events in class. Now the students are being bombarded by questions ranging from physics to biology to geology. Some of the questions hit directly on units covered in our classroom, others cover material studied in previous years, and some of the material has not been taught to my students yet (it is in the curriculum for 8th grade). I wonder: Will the results show that students grew their science knowledge? Are my teaching methods preparing my students for this type of assessment? Is this assessment meaningful for my students?

As the results come in, I find that 72% of my students hit their NWEA Projected Growth Targets. Well, that doesn’t seem bad. Seven out of ten students in my class hit their expected growth. But did my instruction garner these results? And why do I feel the 72% should be higher? These are the hard questions to answer. Many of the questions I observe over the shoulders of students taking the test feel disconnected from my teaching. From the random sample I observed, I did notice a few questions that align directly with our instruction, but many were totally disconnected from any of the units taught this year in seventh grade. If my students correctly answered those questions, the credit should go to them, not my teaching, since the material was never covered in my class. And 72% seems low because it is a C on most grading scales; I feel I am better than a C teacher. How can any test measure the effectiveness of teaching when it is not directly connected to the content taught in class? MAP does seem to measure a student’s ability in a subject and their growth over time. That could be tied to teaching, but it mainly reflects the student’s own learning ability.

I teach the way I have been trained. I teach units. Ideas are introduced with global experiences. Lessons are organized so students can learn related ideas together. This method allows students to make connections in their learning. Ideas seem to flow together naturally. Light and sound are taught together in a unit on wave energy. Cells are studied at the same time as genetics and plants. Students like the flow and can dive deeply into content with the connections. Standardized tests seem to forget this concept. Questions jump all over the place, with no context or connections for students to anchor their understanding. That’s the simple reason Jay Leno’s “Jaywalking” segments are so funny and popular: questions fired at people with no context produce blank stares.

Should I change my teaching to be more random? It might help students think before jumping at the first thought that enters their mind.

My students hate the MAP test. They feel it is a waste of time. Many commented that they had no clue how to answer certain questions. Some felt frustrated during the test and gave up when they encountered questions about content that was never taught to them. Are these tests necessary? Aren’t there better ways to show students are learning?

When we really look at the data what story does it tell?

Feeling like a failure…is it valid?

My district uses NWEA MAP scores to measure student growth. Our students take the test in the fall and are given a target to reach when they take the spring test. We have been using the Math and Reading tests for the past three years, and this year we added the general science test. This week my classes took the science test. We missed the growth target! One of my student growth data points will not be rated as effective. To be rated effective, 60% or more of my students needed to hit their growth targets as projected by NWEA. We missed. I feel I have failed my class.

Or at least I did until:

A student took the 45-question test in 10 minutes and saw their test score jump 13 points!! Wait, what? I can’t even read 45 questions in 10 minutes. That is answering a question about every 13 seconds; is that possible with any accuracy? Yes, I know this student met their target, but it makes me question the validity of the test for every student. If someone can score higher by chance, can’t they also score lower? Should there be a way to make sure students actually read the test? Or is that on me, monitoring 30+ students? (In fact, this student tested one-on-one with another teacher because they were absent when the test was given.) The fact that this student’s growth is reflected in MY teacher evaluation leaves me with a few questions.


1. Where do the growth targets come from? Not all students grow at the same rate, so how in the world can NWEA project these targets? I have been told they are calculated as the average growth for everyone who scores the same RIT score. If so, then roughly half of all students will fall above the target and half will fall below, by the very definition of an average.
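To see why an average-based target would guarantee that roughly half of students miss it, here is a quick back-of-the-envelope simulation. The growth numbers are made up for illustration, and this is a sketch of the averaging logic only, not NWEA’s actual methodology:

```python
import random

random.seed(0)

# Hypothetical sketch: suppose every student with the same fall RIT score
# draws their spring growth from the same distribution, and the "projected
# growth target" is simply the average growth of that peer group.
n_students = 10_000
mean_growth, spread = 6.0, 4.0  # invented RIT-point values, for illustration

growths = [random.gauss(mean_growth, spread) for _ in range(n_students)]
target = sum(growths) / n_students  # target = peer-group average growth

hit_rate = sum(g >= target for g in growths) / n_students
print(f"Students meeting the average-based target: {hit_rate:.0%}")
```

No matter how much every student actually learned, a target defined as the group average splits the group near the middle, so a class hitting 60% or 72% says as much about the target-setting as about the teaching.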

2. What standards is the NWEA test based upon? I assume Common Core for ELA and Math; is it the Next Generation Science Standards for science? Surely not the Michigan 7th grade science standards that I am required to teach.

3. If students are above grade level, is it reasonable to expect them to grow? Teachers teach grade-level content standards, so how can students grow in areas the curriculum says are not taught? I know teachers need to offer enrichment opportunities in class, but those dig deeper into the curriculum, not up into the higher-level content that the NWEA test measures.

4. Do multiple choice tests really measure knowledge? I often call them multiple-guess tests. Most of my students love multiple choice because they can take a guess. They hate fill-in-the-blank and short answer questions because those require them to have the knowledge. I find it funny that a student who takes 30 minutes trying, unsuccessfully, to complete a short answer test is done in 30 seconds with a similar multiple choice one! The new assessments for the Common Core are placing an emphasis on more open-ended questions, so why not NWEA?

5. Do these test scores correlate to content mastery? Is there evidence that doing well on MAP tests means students DO know the content?

I know these are changing times. Teachers are responsible for making sure our students grow. I KNOW every single student in my class grew in many different ways this year. I have their classwork to prove it. I hope the laws will be fixed so teachers like me don’t feel like FAILURES.

I will continue to strive to be the best teacher I can. I don’t want to resort to teaching to the NWEA (or any test) just to keep my job; I personally feel that would be educational malpractice.