Using NWEA Norms to Establish Goals on State Tests

Question:  How do we use NWEA norms to identify student performance and growth goals around meeting state proficiency standards?

Answer:  Simply put, we can’t.

NWEA performance and growth norms are nationally representative, and the students who made up those norming samples came from all 50 states.  Consequently, NWEA norms tell us how a student's observed MAP performance or growth compares to that of similar students across the nation.  For example, a fifth grader who completes her fall MAP reading assessment with a score of 191 performed at the 13th percentile according to NWEA status norms for the fall.  This means that about 87% of fifth graders across the nation would be expected to produce a higher fall MAP reading score than she did.  Similarly, the NWEA growth norms tell us that this same fifth grader (with the 191 in the fall) would show about six points of growth between fall and spring, on average.  Therefore, if she actually did make six points of growth between fall and spring that year, her growth would be "typical" (or at the 50th percentile, with half of similar students showing greater growth and half showing less).

However, NWEA performance and growth norms tell us little about state proficiency standards.  Some states set high proficiency standards that relatively small percentages of students can meet.  Other states set lower standards that more students can meet.  In Massachusetts, for example, the fifth grade reading proficiency standard is set at about the 53rd percentile on NWEA's norms.  Just under half (47%) of fifth graders across the nation would be expected to meet that standard.  In Ohio, however, the fifth grade reading proficiency standard is set at the 26th percentile.  About 74% of U.S. fifth graders would be expected to meet that easier reading standard.
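The arithmetic behind those percentages is simple: students scoring at or above the cut percentile are expected to meet the standard.  A minimal sketch, using the Massachusetts and Ohio percentile cuts cited above (the function name is illustrative, not an NWEA API):

```python
def percent_expected_to_meet(cut_percentile: float) -> float:
    """Percent of students nationally who score at or above a state's
    proficiency cut, expressed as a percentile rank on NWEA norms."""
    return 100.0 - cut_percentile

# Massachusetts: cut at the 53rd percentile -> about 47% expected to meet it
print(percent_expected_to_meet(53))  # 47.0
# Ohio: cut at the 26th percentile -> about 74% expected to meet it
print(percent_expected_to_meet(26))  # 74.0
```

The same content standard can therefore produce very different proficiency rates depending solely on where each state places its passing score.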

The fact that states set different proficiency standards means that we cannot use NWEA performance and growth norms by themselves for setting performance and growth goals, if the goal involves meeting the state proficiency standard.  Our hypothetical fifth grader, with her fall reading score of 191, would need to make 12 points of growth between fall and spring to meet the Ohio proficiency standard, whereas she would need to demonstrate 22 points of growth to meet the Massachusetts proficiency standard.  For someone of her grade and initial achievement level, the 12-point goal represents about 84th percentile growth, attainable by about 16% of students like her (based on a mean of 6 points and a standard deviation of about 6 points, as published in NWEA norms), whereas the 22-point goal represents greater than 99th percentile growth.  Both of these goals would be challenging, but the second would be virtually impossible.  In neither case would typical (50th percentile) growth be sufficient to meet the state proficiency standards.
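The growth-percentile figures above can be reproduced with a normal approximation, assuming fall-to-spring growth for students like her is roughly normal with the mean of 6 points and standard deviation of about 6 points cited from the NWEA norms (actual NWEA growth norms are tabled by grade and initial score, so this is a sketch, not the published method):

```python
from statistics import NormalDist

# Assumed normal model of fall-to-spring growth for students of her grade
# and initial achievement level: mean 6 points, SD 6 points (from the text).
growth_dist = NormalDist(mu=6.0, sigma=6.0)

for goal, state in [(12, "Ohio"), (22, "Massachusetts")]:
    pct = growth_dist.cdf(goal) * 100   # percentile rank of that much growth
    attainable = 100 - pct              # share of similar students reaching it
    print(f"{state}: {goal}-point goal is about the {pct:.1f}th percentile "
          f"of growth ({attainable:.1f}% of similar students would reach it)")
```

A 12-point goal is one standard deviation above the mean (about the 84th percentile, reached by roughly 16% of similar students), while the 22-point goal is nearly 2.7 standard deviations above it (beyond the 99th percentile).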

Information about the MAP performance levels that correspond to the various state test proficiency standards is available for all states with MAP/State Test linking studies.  These reports are published online and are essential for setting performance and growth goals tied to meeting state standards.

Now that most states have adopted versions of the Common Core standards for reading and math, some have argued that these cross-state differences in performance will decrease.  This is possible, but it doesn't necessarily follow.  Cross-state differences in proficiency rates are a consequence of proficiency standards, not content standards.  So differences in proficiency rates across states will decrease only if states establish common proficiency standards (i.e., passing scores) on the new assessments.  Time will tell.