School & test engagement
Educators need accurate assessment data to help students learn. But when students rapid-guess or otherwise disengage on tests, the validity of their scores can be affected. Our research examines the causes of test disengagement, how it relates to students’ overall academic engagement, and its impact on individual test scores. We also look at its effects on aggregated metrics used for school and teacher evaluations, achievement gap studies, and more. This research further explores better ways to measure and improve engagement, helping ensure that test scores more accurately reflect what students know and can do.


The impact of technology-enhanced items on test-taker disengagement
Can technology-enhanced items increase student engagement on assessments? A new study provides insight.
By: Steven Wise, James Soland, Laurence Dupray
Topics: School & test engagement, Innovations in reporting & assessment


What happens when test takers disengage? Understanding and addressing rapid guessing
How does rapid guessing differ from solution behavior? Research provides insight into test disengagement and how it should be managed in scoring (a simplified scoring sketch follows this entry).
By: Steven Wise, Megan Kuhfeld
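
One common approach in this literature is to exclude rapid guesses from scoring rather than treat them as ordinary wrong answers. The sketch below is only an illustration of that general idea, not the authors’ actual model: the data, field names, and per-item time thresholds are all invented, and real work derives thresholds empirically.

```python
import pandas as pd

# Hypothetical response log: one row per item response, with response time in seconds.
responses = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2", "s2", "s2"],
    "item":    ["i1", "i2", "i3", "i1", "i2", "i3"],
    "correct": [1, 0, 1, 1, 1, 0],
    "rt_sec":  [24.0, 1.2, 31.5, 18.9, 22.3, 27.0],
})

# Illustrative per-item rapid-guess thresholds (seconds).
thresholds = {"i1": 3.0, "i2": 3.0, "i3": 5.0}

# Flag responses faster than the item's threshold as rapid guesses.
responses["rapid_guess"] = responses.apply(
    lambda r: r["rt_sec"] < thresholds[r["item"]], axis=1
)

# Score only the engaged (non-rapid-guess) responses, in the spirit of
# effort-moderated scoring; here the "score" is simply proportion correct.
engaged = responses[~responses["rapid_guess"]]
print(engaged.groupby("student")["correct"].mean())
```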


Can item response times provide insight into students’ motivation and self-efficacy in math?
What can we glean about students’ social-emotional learning from how long they spend on math test questions? New research shows promise and limitations of using response time metadata to measure SEL.
By: James Soland
Topics: School & test engagement, Math & STEM, Social-emotional learning


“No fun games”: Engagement effects of two gameful assessment prototypes
This study examines the impact of two “gameful assessment” prototypes on student engagement and teacher perceptions among 391 students in Grades 3–7 and 14 teachers at one Midwestern and one Northwestern school.
By: Meg Guerreiro, Chase Nordengren
Topics: School & test engagement, Innovations in reporting & assessment


Identifying disengaged survey responses: New evidence using response time metadata
In this study, we take results from a variety of methods for detecting disengaged survey responses and condition them on response times. We then show how this conditional approach can help identify where to set response-time thresholds for survey items, as well as avoid misclassification when other detection methods are used (a toy illustration of the conditioning idea follows this entry).
By: James Soland, Steven Wise, Lingyun Gao
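
As a loose sketch of the conditioning idea, with entirely invented data, column names, and thresholds, one could cross-tabulate a simple detection flag such as straight-lining against response-time bins and look for where the flag concentrates. This is not the study’s procedure, just a minimal illustration of conditioning a detection method on response time.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200

# Hypothetical survey data: total response time (seconds) and whether the respondent
# gave the same answer to every item ("straight-lining"), a common disengagement signal.
survey = pd.DataFrame({
    "rt_total_sec": rng.gamma(shape=4.0, scale=20.0, size=n),
    "straight_lined": rng.random(n) < 0.15,
})

# Condition the straight-lining flag on response time by binning times into quartiles.
survey["rt_bin"] = pd.qcut(
    survey["rt_total_sec"], q=4, labels=["fastest", "fast", "slow", "slowest"]
)
print(survey.groupby("rt_bin", observed=True)["straight_lined"].mean())

# If straight-lining concentrates in the fastest bin, that bin's upper time boundary
# is a candidate response-time threshold for flagging disengaged responses.
```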


When computer-based tests are used, disengagement can be detected through occurrences of rapid-guessing behavior. This empirical study investigated the impact of a new effort monitoring feature that can detect rapid guessing as it occurs and notify proctors that a test taker has become disengaged (a schematic sketch of such monitoring follows this entry).
By: Steven Wise, Megan Kuhfeld, James Soland
Topics: Measurement & scaling, Innovations in reporting & assessment, School & test engagement
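
The effort monitoring feature described above belongs to the study’s test platform; the following is only a schematic sketch of real-time monitoring, with invented class names, thresholds, and a print-based “notification” standing in for whatever the real system does.

```python
class EffortMonitor:
    """Schematic real-time rapid-guess monitor (illustrative only)."""

    def __init__(self, rapid_threshold_sec=3.0, alert_after=3, notify=print):
        self.rapid_threshold_sec = rapid_threshold_sec  # times below this count as rapid guesses
        self.alert_after = alert_after                  # consecutive rapid guesses before alerting
        self.notify = notify                            # callback standing in for proctor notification
        self._streak = 0

    def record_response(self, student_id, item_id, rt_sec):
        # Track consecutive rapid guesses; reset the streak on an engaged response.
        if rt_sec < self.rapid_threshold_sec:
            self._streak += 1
        else:
            self._streak = 0
        if self._streak >= self.alert_after:
            self.notify(f"Proctor alert: {student_id} appears disengaged (item {item_id}).")
            self._streak = 0


# Usage: the third consecutive fast response triggers the notification callback.
monitor = EffortMonitor()
for item, rt in [("i1", 25.0), ("i2", 1.1), ("i3", 0.9), ("i4", 1.4)]:
    monitor.record_response("s1", item, rt)
```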


This paper briefly discusses the trade-offs involved in transitioning to computer-based testing, and then focuses on a relatively unexplored benefit of computer-based tests: the control of construct-irrelevant factors that can threaten test score validity.
By: Steven Wise
Topics: Measurement & scaling, Innovations in reporting & assessment, School & test engagement