South Carolina has its share of challenges. Poverty is pervasive, and the state tends to rank low in educational achievement relative to other states. On the positive side, elected officials and education leaders in the Palmetto State recognize the value of assessment—even during a tough period when it might be easier to waive or postpone it.
We saw that commitment to assessment in June 2020 when South Carolina passed Act 142, which called for schools to test all K–8 students within the first two weeks of the 2020–21 school year and again before the end of 2020. This followed the publication of NWEA’s research brief, “The COVID-19 slide: What summer learning loss can tell us about the potential impact of school closures on student academic achievement.” In December, the NWEA research team followed up on these projections with data on student performance drawn from 4.4 million students from more than 8,000 schools who took MAP® Growth™ assessments.
Using this national research as a starting point, NWEA built on its strong partnership with education leaders in South Carolina to produce a customized report for the state. I sat down via Zoom with a few of the key players behind this report to talk about how the study came about, lessons learned, and how the study is informing next steps in the state.
Matthew Ferguson is the executive director of the Education Oversight Committee (EOC), South Carolina’s education watchdog agency. Before joining the EOC, Matthew worked in a South Carolina school district in various capacities, including district- and school-level administration, curriculum instruction, and personnel management. Dana Yow is the deputy director of the EOC and has been with the agency for most of the last 15 years. Dana handles communications for the EOC as well as much of the agency’s overall operations. Greg King is a research scientist at NWEA. By partnering directly with states and districts, Greg is able to leverage specific demographic data to figure out how students fare in the wake of major events like COVID-related school shutdowns and the pivot to remote learning. My conversations with them have been edited for length and clarity.
How did the partnership with NWEA on this South Carolina report come about?
Matthew: NWEA has been well-known in South Carolina for years and is currently in 62 of our 82 districts. In fact, I worked directly with NWEA when I worked at the district level, using NWEA data to coach teachers to make better-informed decisions. People in our state know NWEA and trust their test results. When NWEA released its projections of the COVID slide last spring, it spurred many education leaders to consider increased and innovative efforts to reach students. South Carolina policymakers were intent on measuring and reporting the actual impacts of school closures on student learning. We knew that once our kids were back in school, we would need NWEA’s help in digging into our state data.
Going into this project, what were your specific concerns about South Carolina students?
Matthew: The national projections gave us valuable information, but we needed our own data so we could try to predict student performance on our state end-of-year assessments. In addition, we’ve been especially concerned about the small, rural districts in our state that don’t have the same access to high-speed internet as our urban districts. We wanted to know: Are those students experiencing worse learning loss than their urban counterparts? Are our pupils in poverty disproportionately affected by remote learning because of the reliance on computers?
What was the main finding of your report?
Matthew: We learned that we have a lot of work to do. We found that our elementary students were the most impacted by learning loss, and that math saw a sharper decline than reading. Overall, based on preliminary fall tests in both reading and math, the study projects that 7 out of 10 students will not meet state standards. And when we compare that to prior years, we see declines relative to where students would be in a typical year.
Dana: It makes sense that our state-level data on learning loss in math would mirror what NWEA found in its national report. When you think about the foundational practices that are used in a math classroom, so much of it depends on direct instruction. So if students are losing that, then it makes sense that the declines would be more pronounced.
What did you learn about how those underserved students from rural districts are doing?
Greg: One of the important things we learned is that among students who are still participating in testing, we’re not seeing widening gaps between the state’s underserved kids and other student groups. Of course, the big caveat that we need to be mindful of going forward is that our overall testing numbers are down. Fewer students are testing, and we know from the data that a disproportionate number of these missing students come from schools with higher percentages of minority students and students in poverty. Since many of these students already lagged their peers, any learning loss is particularly bad news for this group. That said, we don’t have a complete picture of the current state of that gap. That’s why testing remains so important.
What kind of data were you able to include that went beyond the scope of the national report?
Greg: We got a variety of demographic data that helped us produce a more targeted and relevant report for South Carolina. That data included numbers on pupils in poverty, which is a more accurate estimator of socioeconomic status than free and reduced-price lunch. The state also gave us numbers for ESL students, racial and ethnic subgroups, and students with individualized education programs.
Dana: Our study also included interviews with leaders and educators in 15 districts, so that we could capture the perspectives, and the lessons learned, of those who have faced so many challenges this year. We’ve also surveyed parents.
Matthew: Districts had an incentive to take part in these interviews because we let them know that in addition to the statewide data the report would offer, NWEA would break down the data for those specific districts. So we were able to provide these districts with their own reports.
Has this report led to any specific actions in South Carolina?
Matthew: One of the outcomes so far is that the report has prompted our state’s Department of Education to require a mitigation plan for our students. The districts are going to be asked to look at their student data to identify strengths and weaknesses, and then to plan for instructional interventions to help make up for COVID-related learning loss.
This might have happened anyway, but the report allowed us to point to specific, significant learning loss, and to the students who are still missing from our sample. Without this data, I’m not sure that the state and the districts would have been able to be quite so intentional and focused on that planning process.
I understand that other states have commissioned similar reports. Given that NWEA’s national report was so comprehensive and that all states are facing similar challenges, what motivates states to produce their own studies?
Greg: The national report starts the conversation and provides a great picture of what’s happening in the country, but a state report is what can really help drive policy and create action at the local level. That’s why states hunger for information about what’s happening to their own students. From the research perspective, the ability to examine student learning progress at the state level and talk to state leaders about how this pandemic has impacted their students provides a great deal of additional context that you won’t find in a national report.
What can other states learn from your experience in South Carolina?
Matthew: Every state is different, but we did find that our state findings basically aligned with NWEA’s national sample, showing that math was more negatively affected after the COVID shutdown and that elementary grades, in particular, were the hardest hit. This was valuable data for us, and I imagine other states would find it useful as well. The data allows policymakers to make informed decisions and move purposefully toward improvement, rather than just hoping for the best—or not wanting to know what the data says.