What’s happening with data and assessment in teacher prep programs? We asked an expert

As the director of policy and advocacy at NWEA, and a former teacher, I love engaging in meaningful conversations about how to improve public education, particularly around the use of data and assessments to help transform teaching and learning.

I was recently fortunate enough to have such a conversation—this time with my old friend and fellow North Carolinian, Dr. Anthony Graham. He’s the provost and vice chancellor for academic affairs at Winston-Salem State University, which was recognized for having one of the nation’s best teacher prep programs.

Dr. Graham was previously the dean of the College of Education at North Carolina Agricultural and Technical State University and is a former high school English teacher. He had so much wisdom to share during our conversation. Here are some of the topics we discussed:

  • Trends and progress in educator preparation programs
  • Challenges HBCUs are facing and the vital role they play in teacher preparation
  • Inequities and innovation amid COVID-19 recovery efforts
  • The need to check for bias in assessment and data analysis

You can find our conversation below. It’s real talk, between real friends and educators, lightly edited for clarity and length.

On preparing our teachers to use assessment data

LaTanya Pattillo (LP): Using data and assessments in the classroom to inform instruction and be responsive to student needs is so important. But I don’t recall getting a lot of training on that when I was preparing to become a teacher 12 years ago. What differences do you see emerging in how educators are prepared today when it comes to using assessment and data compared to when you were a teacher?

Anthony Graham (AG): When I was in the classroom, it almost seemed as if we were in the infancy stage when it came to language around data and assessment. Now, the language is so commonplace. You talk about the disaggregation of data, and everyone immediately knows what you mean. Back then, I don’t know if that would have been the case.

Unfortunately, though, we also fell into a rut where we got away from what I think is the true essence of assessment, which is guiding your teaching practice to ensure students are learning. Instead, we moved to using assessments to separate and segregate learners, not necessarily to change instructional practices to better support the learner. I would like to believe that is now starting to get better. But I don’t know, quite honestly. I hear teachers talk about what it means to differentiate instruction to better meet the needs of all learners, but I don’t know if I see it consistently with a sense of fidelity across classrooms. We’ve gotten more comfortable with the language of assessment. I don’t know if we’ve gotten better with the actual practice of responding to the learner’s needs.

LP: The discussion around assessment is a nuanced one. There are different types of assessments and different uses for assessments. I’m wondering what you think is behind this lack of good assessment practices you describe. Is it a lack of understanding around the different types of assessments and practices? What else might be behind this disconnect between what we hear being communicated and what is occurring?

AG: I don’t know if we do a very effective job across the board helping people understand the different ways to assess learners. You’ve got to know the learners and the different assessment models.

Then, you’ve got to be willing to try those things and not abandon them when they don’t appear to work. I see that a lot. We try something, and it doesn’t go well the first time, and we automatically conclude, “Well, that didn’t work, so I’m going to go back to what I’m comfortable doing.” You’ve got to stick with it and keep trying that thing you’re uncomfortable with to see if there might be a different outcome. Doing it the first time only establishes a baseline. You can’t abandon it when all you’ve done is establish a baseline.

LP: I remember being in the classroom. Students would take an assessment, and I’d get back the results. I remember thinking I needed to learn how to do it better, how to analyze the data better. I didn’t come out of my teacher training program with that skill. And then, as teachers, we would sit in these 30-minute, maybe 60-minute, professional development meetings and simply be told to look at the data and then go use it in our classrooms. I share all this with you to ask, how can we think about better preparing our candidates and teachers to analyze data for instruction? Where are we, and where do we need to go?

AG: We’ve gotten better in our educator preparation programs. You see more of our candidates in classrooms with learners engaging in the practice of teaching, assessing, and sitting down and analyzing that information. And then the most important part of this process is asking, “What do I do differently?” I see that occurring much more now than I did 10 years ago. It’s that closing of the loop: what do I do differently based on the analyses of these data?

I’ve seen that for some of our classroom teachers, the struggle is, “The data is telling me I have to do something different, but I don’t know what to do. I’ve been teaching this way, and now the data is saying to me there is a group or perhaps subgroups of learners who aren’t quite getting what I’m trying to teach, which suggests to me I need to do something differently, but what?” It’s that coaching aspect that I think is needed if we are to make a difference in how we engage all learners in the classroom. The only way you will know what to do differently is if someone is there to help you think differently about what you’ve been doing. It’s the closing of the loop and understanding what you can do differently as a result of what the data is telling you.

LP: What have you heard from your professors about their needs around assessment and how to support teacher candidates? Are they saying that there are specific things that would be helpful to them as they prepare our teacher candidates?

AG: Often, the professors feel we’ve prepared the candidate to do certain things, assessment-oriented and otherwise, and then the candidate leaves the ed prep program, and what’s happening in the school district is totally different from how we prepared them.

There has to be a way to address that gulf, so the candidate doesn’t feel as if he or she was ill-prepared. And the speed at which technology is changing underneath our feet also makes it difficult, particularly for educator preparation programs where resources might be tight. We might not be able to emulate the types of technology that school districts are purchasing and adopting.

How to support HBCUs

LP: You’re mentioning access and exposure to tools. That has to do with the resources schools are provided. We know that HBCUs (historically Black colleges and universities) struggle with resources. There is a difference in resource allocation between our HBCUs and some of our other institutions. When we talk about access and opportunity, it’s important that we not only talk about access and opportunity for the candidates but also for the professionals who teach and train the candidates. We know the value of trying to stay current. It’s hard when you don’t have the resources to make your program viable and attractive to candidates.

AG: Part of the challenge we face at HBCUs specifically is a perception challenge. If our candidates go into a school district not exposed to a technological resource the district has, you tend to hear, “That student is from X University and they’re always behind.” It becomes a larger referendum on the quality of person we produce. It’s a perception that ends up getting connected with reputation unfairly when the truth of the matter is we’re not funded at the level at which we should be funded.

The other thing is we could put a hundred teachers out. If one doesn’t meet an expectation, then the entire lot is perceived as not meeting the expectation. That doesn’t happen to some of the majority institutions.

LP: As we think about what we need to do on a national level to support all our institutions and provide equity for the institutions who need it the most, we have to think about how we support our HBCUs using the same lens we use when we talk about supporting our students who need the resources the most. We continue to see that imbalance and those inequities. Thank you for calling that out. It’s important.

What’s next, amid the COVID-19 recovery?

LP: We know the challenges we’re seeing in education today, more than two years after the start of the pandemic, existed long before COVID. How do we acknowledge that and use this moment to address those systemic issues, particularly how we support educator preparation programs? We know if we support how our candidates are trained, it will make its way into schools.

AG: COVID put a spotlight on the inequities, as you said, that existed long before the pandemic. I cringe when I hear people say, “I can’t wait until we get back to normal.” Normal for some people was significantly abnormal. Going back to that wouldn’t make sense. It would continue to disadvantage and disenfranchise people. COVID did force us to attempt to innovate and be problem solvers in how we went about educating students and grappling with accessibility. I hope it opened our eyes to what the possibilities can be moving forward.

LP: One thing I want to continue to emphasize is the need for quality assessments that are used effectively, where the data is used effectively, and to encourage folks not to make rash decisions without really paying attention to and recognizing what the data is saying. We have to balance the urge and need to do something—because this is a historic time and historic amount of funding—with the need to be strategic. That comes with thinking about data, thinking about instructional practices, and thinking about models in place that allow us to be innovative and capitalize on the things that exist and are effective.

AG: One more point: when people use assessments to analyze data, we tend not to ask, or perhaps even require, the person looking at the results of that data to take the time to check their own potential biases and subjectivities. You can look at any quantitative set of information and make it say what you want it to say. If you’re coming to that data set with your own biases, you’re going to impose them on the data set, and you’ll come to a rushed conclusion that takes you where you wanted to go before you ever sat down to look at the data. I don’t know if we question the data analyzer enough: What are your subjectivities? What are your biases? If you don’t check that, you end up drawing conclusions that can end up harming people.

LP: That is critical. The assessment instrument must be unbiased and inclusive, and it must provide an opportunity to get real and equitable results. And then, to your point, we have to think about how you support folks who analyze and read that data. There needs to be an equity lens to data analysis.

I just want to thank you for talking today. I loved the discussion.

AG: Anytime. I did too!

Learn more

For more on the impact of COVID-19 on student learning, check out NWEA research on the topic. You can read more about Dr. Graham and his work by visiting Winston-Salem State University online. We’re @NWEAPolicy on Twitter and would love to hear from you.
