This is the third in a series of posts by Christina Schneider and Robert Johnson about building trajectory-based performance tasks. Stay tuned for follow-up posts as well as a webinar on this topic.
We hope you’re intrigued by the idea of creating trajectory-based performance tasks for measuring student growth. We introduced these types of tasks in our first post in this series and have also explored step one, defining the big idea.
Now we’d like to focus on creating the trajectory. This post will lay out the basics, and we will discuss the creation of performance tasks in our October 24 webinar. We elaborate on this process in greater detail and provide more tools in our book, Using Formative Assessment to Support Student Learning Objectives.
Developing a trajectory can be a complex part of the process, so we invite you to take your time with this post. Grab a cup of coffee, find a cozy place to sit, and let’s get started.
Step 1: Define the big idea
We defined the big idea for the set of tasks we’ll be creating as an example in our last blog:
Grade 4 students will observe patterns through inquiry and analysis of data and use these patterns to test cause-and-effect relationships, which pervade all disciplines of science at all scales.
You’ll want to define your own big idea for your own set of tasks.
Step 2: Study the grade-level standards that align with the big idea
What are all the standards that underlie and support your big idea? Carefully review standards related to your big idea and document them. For our example, we are using the Next Generation Science Standards (NGSS). Performance expectations found in the NGSS are meant to provide examples of ways the three-dimensional standards could be integrated on an assessment.
As NWEA colleagues Kevin McCarthy and Paul Nichols will discuss in a future post, the disciplinary core ideas of the NGSS are often embedded into phenomena-based tasks. They are the springboard for the inquiry that activates students’ thinking like scientists. Therefore, we are going to build our trajectory by focusing on the integration of the cross-cutting concepts and scientific and engineering practices. To begin, we need to unpack and examine standards related to the following:
- The cross-cutting concept of patterns
- The cross-cutting concept of cause and effect
- The scientific and engineering practices that intersect with patterns and cause and effect
Step 3: Analyze which standards are easier and harder for students
How are standards sequenced in your curriculum? Think about which standards are precursors to others, and how standards become fused to show more sophisticated levels of thinking in a content area.
For science, it may also be necessary to consider both the science standards and the Common Core Standards in math. Students begin to plot pairs of values on the coordinate plane to show relationships and unit rates in grade 6. When grade 4 students plot data for patterns in science, they are expected to primarily use bar graphs or pictographs to reveal patterns. For young students, analyzing simple rates of change is likely to be a more difficult skill. Using such considerations can help you sort the standards from easier to more difficult.
Using our teaching and assessment experience, we hypothesized that standards related to cause and effect and building arguments are likely to be the most difficult for grade 4 students. Interpreting the meaning of the pattern in the context of the phenomenon was also likely to be difficult, but less difficult than cause and effect.
We also investigated the research literature. One researcher found that few grade 7 students could conjecture whether graphically presented data showed a noncausal relationship. Another research team reported that multiple research studies show that literal reading of graphs is an easier skill; however, they found that grade 5 students had difficulty connecting what they learned from observations to the quantitative data they collected.
We included all of these as considerations as we sorted and sequenced standards into easier and more difficult skills. Your experience and knowledge will be a useful guide as you work on your own trajectory building. Try to take time to consult the latest research as well, since it may help you sort those more-difficult-to-sequence skills.
Step 4: Study how adjacent above-grade standards intersect with on-grade standards
Completing this step will help you identify what advanced students are likely to know and be able to do.
We noticed two patterns elements for grades 6–8:
- Graphs, charts, and images can be used to identify patterns in data
- Patterns can be used to identify cause-and-effect relationships
Noticing both of these elements also supported our conjecture that elements related to cause and effect are appropriately placed at the advanced level for grade 4 students. We moved standards related to cause and effect under a header for advanced skills.
Step 5: Study how adjacent below-grade standards intersect with on-grade standards
This step will help you identify what beginning students are likely to know and be able to do.
We noticed the following:
- Grades K–2 pattern element: Patterns in the natural and human designed world can be observed, used to describe phenomena, and used as evidence
- Grades K–2 SEP analyzing element: Observations (firsthand or from media) to describe patterns and/or relationships in the natural and designed world can be used to answer scientific questions and solve problems
These two elements connected with the grade 3–5 standard we placed as easier: Construct an explanation of observed relationships (e.g., the distribution of plants in the backyard).
We moved these three standards under a header for beginning skills. We then sequenced the remaining standards as approaching skills and on-track skills.
Step 6: Parse the standards for the input to student thinking and the expected output from the student
Annotate each standard for the part that denotes the input to student thinking and the part that denotes the output expected from the student. Some standards have one or the other, but many have both.
The input is the source material for student thinking. Across content areas, the input—what specifically a child is asked to think about—can influence multiple things, including the:
- Difficulty of a task used to measure student thinking based on the content difficulty
- Number of concepts or processes required for students to integrate within a single input or across multiple inputs
- Cognitive complexity of the task that will elicit the observable behaviors from the student
For example, the following grade 3–5 element has both an input and an output: events that occur together with regularity might or might not be a cause-and-effect relationship. The input is “events that occur together with regularity,” or patterns. The student is asked to analyze the events and provide a response (the output) stating whether these events are likely or unlikely to be a cause-and-effect relationship. One component of the input is how complex the pattern is to detect: is it overt or more subtle?
Step 7: Merge the standards into the trajectory
Trajectories are based on related standards or indicators in lower and upper grades, and on the increasingly sophisticated reasoning required to integrate standards or indicators within a grade. In this step, you’ll merge the standards you placed under each header into a trajectory that describes the content of the input and the content of the output teachers should see as students grow in their skills over time.
Content difficulty can be based on the following:
- Complexity of the intended tasks given to students
- What students must produce to show their thinking
- The use of familiar versus unfamiliar scenarios
Here’s an example:
We’ll walk through general procedures for creating trajectory-based performance tasks in our next blog post. NWEA experts in science and performance task development will join our blog series and be available to support you in our live webinar on creating trajectory-based performance tasks October 24. We hope to see you there!
Robert Johnson, a professor in educational research and measurement at the University of South Carolina, coauthored this post. His research related to assessment and evaluation has been published in journals including Applied Measurement in Education, Assessing Writing, and Teaching and Teacher Education. He holds a PhD in educational research, measurement, and evaluation from the University of North Carolina-Greensboro.
Drs. Schneider and Johnson coauthored Using Formative Assessment to Support Student Learning Objectives.