Phantom Test Results and the Need for Assessment and Data Literacy

Our own John Cronin and Nate Jensen recently penned an article that appeared in Phi Delta Kappan titled The Phantom Collapse of Student Achievement in New York (PDF download). In it they addressed the steep decline in New York State's proficiency rates following the adoption of a new assessment designed to measure performance against the Common Core standards: proficiency rates dropped from 55 percent to 31 percent in reading and from 65 percent to 31 percent in math. Needless to say, these results sparked a flurry of conversation calling into question both the new tests and the Common Core State Standards themselves.

Back in September 2013, one of our bloggers, John Wood, wrote a piece that highlighted New York's reaction and likened it to what Massachusetts went through in 1998, when it implemented the Massachusetts Comprehensive Assessment System (MCAS) tests to measure student achievement against a more rigorous set of standards adopted that same year.

Here’s what the Boston Globe reported back then:

In all three grades tested — fourth, eighth, and tenth — a majority of students scored in the “failing” or “needs improvement” categories in English, math, and science, with only one exception: most eighth-graders did well in English.

The results mirror what New York went through last year, but rather than change course, Massachusetts weathered the criticism and forged ahead. As we begin to see the implementation of assessments that support or measure instruction based on the CCSS, we need to make sure that all voices of criticism are considered and judged as to whether they amount to the kind of hand-wringing we saw in Massachusetts or whether they are genuine critiques of test quality. This is particularly true because the tests built to measure the CCSS have been developed quickly and without the extensive field testing that accompanied the previous generation of assessments.

So what did our researchers find? Be sure to read the full article, but the bottom line is that by digging into MAP assessment data they were able to use student results to estimate what a school system's 2013 proficiency rates would have been if the state had not raised the proficiency cut scores. They found that student proficiency rates actually increased, in some cases significantly. The perception that student achievement had declined under the new, higher standards did not match reality.
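To make that distinction concrete, here is a small, purely illustrative Python sketch. It is not the researchers' actual methodology, and every score and cut point in it is made up rather than drawn from New York or MAP data; it simply shows how a reported proficiency rate can plunge under a new, higher cut score even while the counterfactual rate under the old cut score rises.

```python
# Illustrative sketch only: hypothetical scale scores and cut scores,
# not actual New York or MAP values.

def proficiency_rate(scores, cut_score):
    """Percentage of students scoring at or above the cut score."""
    return 100.0 * sum(s >= cut_score for s in scores) / len(scores)

# Hypothetical scale scores for a cohort in two years; achievement rises slightly.
scores_2012 = [205, 212, 218, 221, 225, 230, 234, 240, 246, 252]
scores_2013 = [207, 214, 219, 226, 227, 232, 236, 242, 248, 254]

OLD_CUT = 225  # illustrative pre-2013 proficiency cut score
NEW_CUT = 240  # illustrative, more rigorous 2013 cut score

print(f"2012 rate, old cut: {proficiency_rate(scores_2012, OLD_CUT):.0f}%")  # 60%
print(f"2013 rate, old cut: {proficiency_rate(scores_2013, OLD_CUT):.0f}%")  # 70%, the counterfactual
print(f"2013 rate, new cut: {proficiency_rate(scores_2013, NEW_CUT):.0f}%")  # 30%, the reported "collapse"
```

In this toy example the students score higher in 2013 than in 2012, and the proficiency rate under the old cut score goes up, yet the rate reported against the new, tougher cut score still falls sharply, which is the pattern the article describes.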

As the researchers summarize in the article:

If student achievement declines, educators should take appropriate steps to rectify the reason for it. However, if student proficiency goes down, this does not necessarily mean student achievement has declined, and the potential reasons behind these drops in proficiency — such as the implementation of a higher proficiency standard — should be clearly and accurately articulated to parents, teachers, and the public as a whole.

The conversations around these test results do suggest the need for educators to become more data and assessment literate. Understanding assessment data, especially in light of curriculum changes, higher standards, and new tests, is paramount to accurately benchmarking student achievement.