School & test engagement
Educators need accurate assessment data to help students learn. But when students rapid-guess or otherwise disengage during testing, the validity of their scores can suffer. Our research examines the causes of test disengagement, how it relates to students’ overall academic engagement, and its impact on individual test scores. We also look at its effects on aggregated metrics used for school and teacher evaluations, achievement gap studies, and more. This research explores better ways to measure and improve engagement, helping to ensure that test scores more accurately reflect what students know and can do.
Researchers can detect when students aren’t trying on computerized tests
A testing company spots disengaged students who are guessing answers too quickly.
The Hechinger Report
Mentions: Steven Wise
Topics: Equity, Innovations in reporting & assessment, School & test engagement
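Detection of this kind is typically based on item response times: an answer submitted faster than any plausible reading-and-solving time is flagged as a rapid guess. A minimal sketch, assuming a simple per-item threshold rule; the function, the 3-second cutoffs, and the data are illustrative only, not NWEA’s actual method or values:

```python
# Illustrative sketch: flag rapid guesses with a per-item response-time
# threshold and compute response-time effort (RTE), the proportion of
# items answered with "solution behavior" (i.e., not rapid-guessed).
# Thresholds and data below are hypothetical examples.

def response_time_effort(response_times, thresholds):
    """Return RTE: the share of items whose response time (seconds)
    meets or exceeds that item's rapid-guessing threshold."""
    if len(response_times) != len(thresholds):
        raise ValueError("one threshold per item required")
    solution_count = sum(rt >= th for rt, th in zip(response_times, thresholds))
    return solution_count / len(response_times)

# Example: five items, each with an illustrative 3-second threshold.
times = [12.4, 1.1, 8.0, 2.0, 15.3]   # seconds spent on each item
cutoffs = [3.0] * 5
rte = response_time_effort(times, cutoffs)  # 3 of 5 items at or above threshold -> 0.6
```

A low RTE on a test record suggests the score may understate what the student knows, which is why flagged tests are often treated differently in reporting.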
Do students rapidly guess repeatedly over time? A longitudinal analysis of student test disengagement, background, and attitudes
This study investigates whether rapid guessing is a stable, trait-like behavior or is determined mostly by situational variables, and whether rapid guessing over the course of several tests is associated with certain psychological and background measures. We find that rapid guessing tends to be more state-like than academic achievement scores, which are fairly stable, and that repeated rapid guessing is strongly associated with students’ academic self-efficacy and self-management scores.
By: James Soland, Megan Kuhfeld
Topics: Measurement & scaling, School & test engagement, Social-emotional learning
Why we’re all-in on student test engagement
A lack of test engagement can negatively impact scores. Learn about NWEA’s work to prevent and mitigate the impacts of rapid guessing.
By: Steven Wise
Topics: Innovations in reporting & assessment, School & test engagement
Computer-based testing offers glimpse into ‘rapid guessing’ habits
When students speed through a computer-based test, their responses are far less likely to be accurate than if they took longer to find the solution, according to new research.
Mentions: Steven Wise
Topics: Equity, School & test engagement
Can test metadata help schools measure social-emotional learning?
Social-emotional learning (SEL) competencies like self-efficacy and conscientiousness can be predictive of long-term academic achievement. But they can also be difficult to measure. In a new study led by NWEA’s James Soland, researchers investigated whether assessment metadata – the way students approach tests and surveys – can provide useful SEL data to schools and educators. Soland joins CPRE research specialist Tesla DuBois to discuss his findings, their implications, and the promise and limitations of student metadata in general.
Consortium for Policy Research in Education Knowledge Hub podcast
Mentions: James Soland
Topics: Innovations in reporting & assessment, School & test engagement, Social-emotional learning
Are achievement gap estimates biased by differential student test effort?
New research shows that test effort differs substantially across student gender and racial subgroups. What does this mean for achievement gap estimates?
By: James Soland
Topics: Equity, School & test engagement, Social-emotional learning
The relationship between test-taking disengagement and performance on MAP Growth retests
Educators sometimes ask: do students rapidly guess because they don’t know the answer to a question, or do rapid guesses reflect a lack of engagement with the test? Would a student’s score improve if that student engaged more with the assessment and rapidly guessed on fewer items? This study examined MAP® Growth™ test scores and levels of student test engagement for over 100,000 tests in which students retested within one day, and found that students’ test-taking engagement often differed between the initial test and the retest.
By: Steven Wise
Topics: School & test engagement