Good and bad, I define these terms
Quite clear, no doubt somehow
Ahh, but I was so much older then
I’m younger than that now
— Bob Dylan, “My Back Pages”
States and districts are all wrestling with implementation questions around the Common Core State Standards (CCSS). Even states that chose not to adopt the CCSS are addressing some of the central issues the CCSS raise about how best to prepare students to be college and career ready. Chief among these questions: What instructional materials will help us reach these goals, and which curricular approaches will be most effective? Many vendors had CCSS-aligned materials almost before the standards were finalized. In the same vein, many districts have taken their existing instructional materials and realigned them to address the new standards. Is this adequate? Or are materials created after the CCSS were finalized better and more effective?
One place districts can turn for help in answering these questions is the Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards, developed jointly by Achieve, the Council of Chief State School Officers, and Student Achievement Partners. Applying these tools and rubrics can be daunting and time consuming. Some districts may choose to devote the resources to implement the full review process with fidelity. Others may prefer to review the tools and extract general principles they can apply as they choose instructional materials and craft curriculum documents locally. I would like to offer some thoughts for these districts, and even for districts that choose another review process.
Recently, I was trained as a peer reviewer for secondary ELA in applying one part of the toolkit, the EQuIP Quality Review Rubric. Though I can’t summarize that training in one blog post, I plan over several posts to offer insights from the training and from the review work I will be doing over the next several months. This article in EdWeek, by a writer who sat in on our ELA training session, gives a good sense of the intensity of the experience.
In this post, I will share two big impressions from the two days of training. The first takeaway for me was that we are all “young” in this process of defining effective materials, and determinations of good and bad are not yet clear. Even our two trainers—a primary author of the CCSS and a curriculum writer who helped create the rubric—were tentative in their judgments about how best to apply the framework and criteria the rubric defines. As we worked through materials with them, we had rich discussions about what would work and what meets the criteria the rubric sets forth. The EdWeek article mentioned above gives a nice sense of those discussions.
The second big takeaway for me was the common-sense approach the trainers modeled: they recognized that no single lesson or set of materials can do everything in the rubric. Not every lesson need address all the instructional shifts. For example, the rubric calls for units to offer increasingly complex texts, a key CCSS shift. Yet a social studies unit may more logically take a chronological approach, in which older texts may prove more complex for students who are less familiar with their background contexts. This led us to discuss whether the first texts of the unit had sufficient supports for students, rather than applying the criteria by rote. The rubric becomes a framework or lens for evaluating quality instructional materials rather than a cudgel used to enforce a strict set of rules. In future posts I will unpack some of the rubric’s criteria in more detail.
Photo credit to Francisco Antunes.