Modeling Covarying Responses In Complex Tasks

Document Type

Conference Proceeding

Publication Date

2022

Published In

Quantitative Psychology: The 86th Annual Meeting of the Psychometric Society, Virtual, 2021

Abstract

In testing situations, participants are often asked for supplementary responses in addition to the primary response of interest, such as their confidence or the difficulty they perceived. These additional responses can be incorporated into a psychometric model either as a predictor of the main response or as a secondary response. In this paper we explore both approaches for incorporating participants' reported difficulty into a psychometric model, using an error rate study of fingerprint examiners. Participants were asked to analyze print pairs and make source determinations, which can be scored as correct or incorrect decisions. Participants were also asked to report the difficulty of each print pair on a five-point scale. We model (a) the responses of individual examiners without incorporating reported difficulty, using a Rasch model; (b) the responses with reported difficulty as a predictor; and (c) the responses and reported difficulty jointly as a multivariate response variable. We find that approach (c) results in more balanced classification errors, but incorporating reported difficulty under either approach does not lead to substantive changes in proficiency or difficulty estimates. These results suggest that, while there are individual differences in reported difficulty, these differences appear to be unrelated to examiners' proficiency in correctly distinguishing matched from non-matched fingerprints.
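As a rough illustration of the three approaches described above (a minimal sketch, not the authors' exact specification; priors, link functions, and the parameterization of the joint model are assumptions here), the models could take forms along these lines, with theta_j the proficiency of examiner j, b_i the difficulty of print pair i, d_{ij} the reported difficulty, gamma a hypothetical regression coefficient, and kappa_k hypothetical ordinal thresholds:

% (a) Rasch model for the scored response y_{ij} in {0, 1}
\Pr(y_{ij} = 1) = \operatorname{logit}^{-1}(\theta_j - b_i)

% (b) reported difficulty entered as a person-by-item predictor
%     (gamma is a hypothetical slope, not taken from the paper)
\Pr(y_{ij} = 1) = \operatorname{logit}^{-1}(\theta_j - b_i + \gamma\, d_{ij})

% (c) joint (multivariate) model: the scored response as in (a), paired with
%     an ordinal model for the five-point reported difficulty that shares the
%     item difficulty parameter (this sharing is an illustrative assumption)
\Pr(y_{ij} = 1) = \operatorname{logit}^{-1}(\theta_j - b_i)
\Pr(d_{ij} \le k) = \operatorname{logit}^{-1}(\kappa_k - b_i - \eta_j)

In (c), eta_j is a hypothetical person-level tendency to report items as difficult; the abstract specifies only that the scored response and reported difficulty are modeled jointly, not this particular form.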

Keywords

Item response theory, Forensic science, Bayesian statistics
