A Regression Discontinuity Design Framework for Controlling Selection Bias in Evaluations of Differential Item Functioning

Natalie A. Koziol, J. Marc Goodrich, Hyeon Jin Yoon

Research output: Contribution to journal › Article › peer-review

Abstract

Differential item functioning (DIF) is often used to examine validity evidence of alternate form test accommodations. Unfortunately, traditional approaches for evaluating DIF are prone to selection bias. This article proposes a novel DIF framework that capitalizes on regression discontinuity design analysis to control for selection bias. A simulation study was performed to compare the new framework with traditional logistic regression, with respect to Type I error and power rates of the uniform DIF test statistics and bias and root mean square error of the corresponding effect size estimators. The new framework better controlled the Type I error rate and demonstrated minimal bias but suffered from low power and lack of precision. Implications for practice are discussed.
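To make the baseline concrete, the following sketch illustrates the traditional logistic-regression test for uniform DIF that the abstract uses as its comparison method (not the article's proposed RDD framework). Item responses are regressed on a matching variable and a group indicator, and the group coefficient is tested with a likelihood-ratio statistic. All data here are simulated, and the variable names (`theta`, `group`, `total`) are hypothetical illustrations, not the article's design.

```python
import numpy as np
from math import erfc, sqrt

def logistic_fit(X, y, n_iter=50):
    """Fit logistic regression by Newton-Raphson; return coefficients and deviance."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1 - p)
        # Newton step: beta += (X' W X)^{-1} X'(y - p)
        beta += np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (y - p))
    p = np.clip(1.0 / (1.0 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
    deviance = -2.0 * np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return beta, deviance

rng = np.random.default_rng(0)
n = 2000
theta = rng.normal(size=n)                    # latent ability (simulated)
group = rng.integers(0, 2, size=n).astype(float)  # focal (1) vs. reference (0)
# Studied item with uniform DIF: focal-group membership shifts difficulty by 0.5 logits
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(theta - 0.5 * group)))).astype(float)
total = theta + rng.normal(scale=0.5, size=n)     # noisy stand-in for the matching score

X0 = np.column_stack([np.ones(n), total])         # matching variable only
X1 = np.column_stack([np.ones(n), total, group])  # + group term (uniform DIF)
_, d0 = logistic_fit(X0, y)
beta1, d1 = logistic_fit(X1, y)
lr = d0 - d1                                      # LR chi-square statistic, 1 df
p_value = erfc(sqrt(lr / 2.0))                    # chi-square(1) survival function
print(f"group effect = {beta1[2]:.3f}, LR = {lr:.2f}, p = {p_value:.4f}")
```

The group coefficient `beta1[2]` serves as the uniform DIF effect size; the selection-bias problem the article targets arises when group membership (e.g., receiving an accommodation) is itself determined by unobserved characteristics, which this naive model cannot adjust for.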

Original language: English (US)
Journal: Educational and Psychological Measurement
State: Accepted/In press - 2022
Externally published: Yes

Keywords

  • differential item functioning (DIF)
  • logistic regression
  • regression discontinuity design
  • selection bias

ASJC Scopus subject areas

  • Education
  • Developmental and Educational Psychology
  • Applied Psychology
  • Applied Mathematics

