Massachusetts today has some of the best test scores in the country. When Massachusetts is considered by itself in PISA and TIMSS comparisons, its results in math, science, and reading are on par with those of high-performing countries. What's overlooked is what the state went through to get to this point, and how that experience parallels what other states, like New York, which implemented a CCSS-aligned test this year, will need to go through as well.
Back in 1998, the Boston Globe ran stories almost daily on reaction to the MCAS (Massachusetts Comprehensive Assessment System) tests, designed to measure student achievement against standards that were updated in Massachusetts.
This op-ed abstract from Judy Borocheck, Alan Oliff, and Diane Tabor in March of 1998 describes the MCAS test and the standards it seeks to measure:
The initiative behind the reform effort is to set a common, high standard for excellence in what is taught, learned, and tested. The tests promise to assess learning in a more comprehensive and accurate way than typical standardized tests. They aim to assess the complex range of what is learned in school, not merely what can be rehearsed for a multiple choice test. They promise to assess content knowledge and students’ capacity to apply that knowledge in problem-solving situations — the skills that indicate proficiency for future independent learning and contributions as workers and citizens.
Excerpts from Boston Globe articles in 1998 sound an awful lot like what we’re hearing now from parents, teachers, and even some students:
Pisani and other parents are concerned that as the tests near, stress among teachers and school administrators will reach a fever pitch and frighten the children. Some say that teachers, intent on preparing for the tests, will discourage creativity and risk-taking among students.
In all three grades tested — fourth, eighth, and tenth — a majority of students scored in the “failing” or “needs improvement” categories in English, math, and science, with only one exception: most eighth-graders did well in English.
The lesson from Massachusetts is that the move to strong standards will not be easy, and it will be met with vocal opposition. The greater challenge going into the implementation and assessment phase of the CCSS, however, will be separating the criticism that is valid and will lead to better outcomes from the part that is just teeth-gnashing and hand-wringing about higher expectations.
Clearly, in 1998, when Massachusetts adopted standards and then chose to create tests that assessed those standards, it was a vast improvement over simply using off-the-shelf, norm-referenced standardized tests not aligned to anything in particular. Over the years, the MCAS has evolved and has been an important component of the state's success.
As we begin to see the implementation of assessments to support or measure instruction based on the CCSS, we need to make sure that all voices of criticism are considered and judged as to whether they are simply the kind of hand-wringing we saw in Massachusetts or whether they are genuine critiques of test quality. This is particularly true because the tests to measure the CCSS have been developed quickly and without the extensive field testing that accompanied the previous generation of assessments. Also, the previous generation of tests, as the articles remind us, was implemented incrementally in three grades and then expanded. The current generation is being rushed forward in all grades at once. Much of the recent criticism I have read of the New York tests given this spring seems to me to fall on the side of genuine concern about test quality.
However, let's not mistake genuine concerns about test quality for critiques of the CCSS themselves. We need a challenging and rigorous education system, and the Common Core State Standards are a good start toward getting us there.