3 Key Takeaways from School Effectiveness Research

The Every Student Succeeds Act (ESSA) has significantly changed the nation's school accountability landscape. Increasingly, schools are being held accountable for their contributions to student academic growth, with many states weighting growth as much as, or more than, single point-in-time achievement measures. The stakes for estimating student growth in a reliable and defensible way are therefore higher than ever.

Research shows that estimates of school effectiveness are sensitive to seasonal patterns in student achievement data, particularly summer learning loss, and to whether those estimates account for the time students spend out of school during the summer. Studies have shown that when accountability models do not account for summer loss, the rank ordering of schools based on their contributions to growth can shift, changing which schools are deemed effective or ineffective (Gershenson & Hayes, 2018; McEachin & Atteberry, 2017).

Despite these findings, accountability plans and program evaluation methods typically ignore summer learning loss. In many cases, the omission stems from a lack of data: if a state tests students only once per year, within-year growth from fall to spring simply cannot be estimated. Another reason is methodological: accounting for both within-year (fall-to-spring) and between-year (spring-to-spring) growth in the same model is complicated.
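To make that modeling challenge concrete, here is a minimal sketch of one common seasonal approach: give time in school and time in summer separate slopes, so that in-school growth is not confounded with out-of-school change. This is a simplified stand-in, not the CP model discussed below and not any state's actual accountability model, and every number in it is simulated for illustration.

```python
import numpy as np

# A minimal seasonal-growth sketch (simulated data; NOT the CP model or any
# state's accountability model): give time in school and time in summer
# separate slopes so in-school growth is not confounded with summer change.
rng = np.random.default_rng(7)
n = 2000

school_months = rng.uniform(0, 27, n)  # cumulative months of schooling
summer_months = rng.uniform(0, 6, n)   # cumulative months of summer

# Hypothetical data-generating process: students gain ~1.5 points per month
# in school and lose ~0.6 points per month over the summer.
scores = 200 + 1.5 * school_months - 0.6 * summer_months + rng.normal(0, 5, n)

# Ordinary least squares with an intercept and the two time "clocks."
X = np.column_stack([np.ones(n), school_months, summer_months])
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
print(f"intercept={beta[0]:.1f}  in-school slope={beta[1]:.2f}  "
      f"summer slope={beta[2]:.2f}")  # recovers roughly 200, 1.5, -0.6
```

A single spring-to-spring growth coefficient would blend these two slopes together; keeping them separate is what a seasonal model has to do, and doing that well across many grades and schools is where the complexity comes from.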

Dr. Jim Soland, Senior Research Scientist at NWEA, and Dr. Yeow Meng Thum, Senior Research Fellow at NWEA, recently conducted a study where they applied the Compound Polynomial, or "CP," model in a school evaluation context to address the seasonality of student achievement data. Here are three key findings from the study:

  1. School effectiveness measures are sensitive to summer loss. Schools that would be held accountable based on students’ fall-to-spring growth were often not the same as those that would be held accountable using spring-to-spring growth, a common practice under ESSA.
  2. More student growth is attributable to schools when summer loss is considered. More of the gains in student test scores were attributable to schools when growth was measured from fall to spring rather than from spring to spring, because spring-to-spring scores fold summer losses into the school's estimate.
  3. Ignoring summer loss can impact which schools are identified as low-performing under ESSA. Under ESSA, schools are likely being identified as low-performing based in part on score declines that occur during the summer, when students are not in school. Coupled with previous research finding greater summer losses for students from lower-income backgrounds (Alexander, Entwisle, & Olson, 2007), this suggests we may be unfairly penalizing schools serving the most marginalized students, as the sketch below illustrates.
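To illustrate the first and third takeaways, here is a hedged simulation: each school gets a hypothetical within-year gain and a hypothetical summer loss, and the spring-to-spring measure is simply their difference. The numbers are invented for illustration and are not the study's data; the point is only that when summer loss varies across schools, the two growth windows flag different schools as low-growth.

```python
import numpy as np

# Hypothetical illustration (simulated numbers, not the study's data):
# which schools land in the bottom 10% under fall-to-spring growth
# versus spring-to-spring growth when summer loss varies across schools?
rng = np.random.default_rng(42)
n_schools = 500

fall_to_spring = rng.normal(10.0, 2.0, n_schools)  # in-school gain (points)
summer_loss = rng.normal(3.0, 2.5, n_schools)      # out-of-school decline

# A spring-to-spring window folds the prior summer into the school's measure.
spring_to_spring = fall_to_spring - summer_loss

flag_fs = fall_to_spring < np.quantile(fall_to_spring, 0.10)
flag_ss = spring_to_spring < np.quantile(spring_to_spring, 0.10)

print(f"flagged by both windows: {(flag_fs & flag_ss).sum()} of {flag_fs.sum()}")
print(f"flagged only spring-to-spring: {(flag_ss & ~flag_fs).sum()}")
```

Schools flagged only under the spring-to-spring window are being penalized largely for what happened while their students were out of school, which is exactly the pattern the third takeaway describes.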

To see our recommendations, visit our website and download the research brief today.