Journal article

Using retest data to evaluate and improve effort-moderated scoring

2020

Published in:

Journal of Educational Measurement, https://doi.org/10.1111/jedm.12275

By: Steven Wise, Megan Kuhfeld

Abstract

There has been a growing research interest in the identification and management of disengaged test taking, which poses a validity threat that is particularly prevalent with low-stakes tests. This study investigated effort-moderated (E-M) scoring, in which item responses classified as rapid guesses are identified and excluded from scoring. Using achievement test data composed of test takers who were quickly retested and showed differential degrees of disengagement, three basic findings emerged. First, standard E-M scoring accounted for roughly one-third of the score distortion due to differential disengagement. Second, a modified E-M scoring method that used more liberal time thresholds performed better, accounting for two-thirds or more of the distortion. Finally, the inability of E-M scoring to account for all of the score distortion suggests the additional presence of nonrapid item responses that reflect less-than-full engagement by some test takers.
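To illustrate the E-M scoring idea described in the abstract, the sketch below computes a proportion-correct score that excludes responses flagged as rapid guesses by per-item response-time thresholds. This is a minimal sketch, not the authors' implementation: the function name, the threshold values, and the proportion-correct scoring rule (the paper's analyses may rest on IRT-based E-M scoring) are illustrative assumptions.

    def effort_moderated_score(scores, times, thresholds):
        # scores:     0/1 correctness of each item response
        # times:      response time in seconds for each item
        # thresholds: per-item rapid-guess time thresholds; responses
        #             faster than the threshold are classified as rapid
        #             guesses and excluded (more liberal, i.e. longer,
        #             thresholds flag more responses as disengaged)
        engaged = [(s, t) for s, t, th in zip(scores, times, thresholds)
                   if t >= th]
        if not engaged:
            return None  # every response was a rapid guess; no E-M score
        return sum(s for s, _ in engaged) / len(engaged)

    # Hypothetical example: the third response (1.2 s) falls below its
    # 3-second threshold, so it is dropped before scoring.
    print(effort_moderated_score([1, 0, 1, 1],
                                 [14.0, 9.5, 1.2, 20.3],
                                 [3.0, 3.0, 3.0, 3.0]))  # -> 2/3, about 0.667

In this toy case the conventional score would be 3/4, while the E-M score of 2/3 is based only on the three engaged responses, which is the mechanism by which E-M scoring reduces the distortion caused by rapid guessing.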


This article was published outside of NWEA. The full text can be found at the link above.

