How can principals use student, school and district data to create a positive, growth-oriented climate for teachers and learners? In a new article for the National Association of Elementary School Principals’ (NAESP) Principal magazine, Raymond Yeagley, NWEA Chief Academic Officer, takes a closer look at the promise and challenges of academic growth models.
Academic growth models describe “how much growth is occurring among and within groups of students.” Yeagley examines the five models recognized by the U.S. Department of Education for measuring the growth of schools and districts:
1. Value added model (VAM)
VAM attempts to isolate the impact of an individual teacher from “other variables that have an impact on student learning.” If a student performs better than predicted by the model after accounting for other variables, a positive influence is attributed to his or her teacher (and vice versa for students performing worse than predicted). VAM also aggregates individual student data to calculate the effect of a particular school on student performance.
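The core VAM idea of attributing above- or below-prediction performance to a teacher can be sketched as the average residual per teacher. This is only an illustration: operational value-added models use mixed-effects regression over many covariates, and the names and numbers below are hypothetical.

```python
from statistics import mean

def value_added(records):
    """Toy value-added estimate: the average residual (actual minus
    predicted score) for each teacher's students. A positive average
    suggests a positive influence, a negative one the reverse."""
    by_teacher = {}
    for r in records:  # each record: teacher, predicted score, actual score
        by_teacher.setdefault(r["teacher"], []).append(r["actual"] - r["predicted"])
    return {t: mean(residuals) for t, residuals in by_teacher.items()}

# Hypothetical scores on an arbitrary scale.
records = [
    {"teacher": "A", "predicted": 200, "actual": 207},
    {"teacher": "A", "predicted": 195, "actual": 199},
    {"teacher": "B", "predicted": 210, "actual": 206},
]
print(value_added(records))  # teacher A averages above prediction, B below
```

Aggregating the same residuals by school rather than by teacher gives the school-level effect the model also reports.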
2. Value table model
In this model, points are assigned to each school for students who move from lower levels of proficiency to higher levels. Schools also earn points for proficient and advanced students who maintain proficiency.
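The point-assignment logic can be sketched with a small lookup table keyed by a student's year-over-year move between proficiency levels. The level names and point values here are hypothetical; actual state value tables differ.

```python
# Hypothetical point values for (last year's level, this year's level).
POINTS = {
    ("below_basic", "basic"):      50,   # partial credit for moving up
    ("below_basic", "proficient"): 100,
    ("basic", "proficient"):       100,
    ("basic", "advanced"):         100,
    ("proficient", "proficient"):  100,  # maintaining proficiency earns credit
    ("proficient", "advanced"):    100,
    ("advanced", "advanced"):      100,
}

def school_value_table_score(transitions):
    """Average the points a school earns across its students'
    proficiency-level transitions; unlisted moves earn nothing."""
    earned = [POINTS.get(t, 0) for t in transitions]
    return sum(earned) / len(earned)

print(school_value_table_score([("below_basic", "basic"),
                                ("proficient", "proficient")]))  # 75.0
```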
3. Trajectory model
The trajectory model uses a student’s most recently measured performance to project a three- to four-year progression of scores leading to proficiency. Schools can make adequate yearly progress (AYP) by helping students either achieve proficiency or demonstrate progress along or above the trajectory.
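One simple reading of the trajectory idea is equal annual increments from a student's baseline score to the proficiency cut score, against which later scores are checked. The cut score, window, and scale below are hypothetical; states set the actual values.

```python
def trajectory(start_score, start_year, target_cut, years_to_target):
    """Expected score for each year along a straight-line path from the
    baseline score to the proficiency cut score."""
    step = (target_cut - start_score) / years_to_target
    return {start_year + i: start_score + step * i
            for i in range(years_to_target + 1)}

def on_track(score, year, path):
    """A student demonstrates progress if the observed score is at or
    above the trajectory for that year."""
    return score >= path[year]

path = trajectory(start_score=180, start_year=2024, target_cut=220, years_to_target=4)
print(path[2026])                 # expected score two years in: 200.0
print(on_track(205, 2026, path))  # True: ahead of the trajectory
```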
4. Projection model
Like the trajectory model, the projection model plots a trend line predicting future performance. This model takes a student’s past performance into account to calculate the likelihood that the student will or will not reach proficiency by a particular target date.
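The trend-line idea can be sketched as an ordinary least-squares fit through a student's past scores, evaluated at the target year and compared with a cut score. This is a simplification: operational projection models regress on statewide cohort data rather than a single student's history, and the scores and cut below are hypothetical.

```python
def projected_score(years, scores, target_year):
    """Least-squares trend line through past (year, score) pairs,
    evaluated at the target year."""
    n = len(years)
    mx, my = sum(years) / n, sum(scores) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(years, scores)) \
            / sum((x - mx) ** 2 for x in years)
    return my + slope * (target_year - mx)

# Will this student likely reach a hypothetical cut score of 220 by 2027?
pred = projected_score([2023, 2024, 2025], [190, 198, 205], 2027)
print(round(pred, 1), pred >= 220)
```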
5. Student growth percentile model (SGP)
Under the SGP or Colorado Growth Model, students are grouped with those who have had similar performance scores in the past. The model uses data from these academic peers to predict future performance, and each student’s result is reported as a percentile rank within that peer group.
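The percentile framing can be sketched as a rank of the student's current score among the current scores of academic peers. Operational SGPs are computed with quantile regression; this rank-based version, with invented scores, only illustrates the idea.

```python
from bisect import bisect_left

def growth_percentile(student_score, peer_scores):
    """Percentile rank of the student's current score among the current
    scores of 'academic peers' (students with similar prior scores)."""
    ranked = sorted(peer_scores)
    below = bisect_left(ranked, student_score)  # peers scoring lower
    return round(100 * below / len(ranked))

# Hypothetical current-year scores of a student's academic peer group.
peers = [188, 192, 195, 199, 203, 207, 210, 214, 218, 221]
print(growth_percentile(209, peers))  # 60: grew more than 60% of peers
```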
Each of these models has the potential to offer school and district leaders new windows into student performance and areas for teacher development, even as they are limited by the caveats that accompany any attempt to predict and interpret human behavior. What’s most interesting is how this information can be used to improve outcomes for individual students, a question that often gets short shrift when these models are discussed.
For example, Yeagley suggests that VAM data can be used to understand whether teachers are serving low-, average- and high-achieving students differently and with different results. Might they be focusing instruction solely on students at the lowest end of the performance range at the expense of challenging advanced students? Does one teacher’s instructional strategy produce particularly strong results for students in a particular demographic category and thus warrant replication across a school or district?
Of course, no single data point can capture the full picture of a student, classroom, school or district. It’s up to principals to consider multiple measures of student and teacher performance (including peer observation and self-reflection) to best identify how teachers can improve their performance and to create the climate, supports and opportunities to help them do so.
Read “Understanding Academic Growth Models” in Principal magazine online.