A novel nonparametric framework for DIF detection using kernel-smoothed item response curves
By: Adéla Hladká, Patrícia Martinková
Potential Business Impact:
Identifies test questions that function unfairly across different groups of test-takers.
This study introduces a novel nonparametric approach for detecting Differential Item Functioning (DIF) in binary items through direct comparison of Item Response Curves (IRCs). Building on prior work on nonparametric comparison of regression curves, we extend the methodology to accommodate binary response data, which is typical of psychometric applications. The proposed approach includes a new estimator of the asymptotic variance of the test statistic, and optimal weight functions that maximise local power are derived. Because the asymptotic distribution of the resulting test statistic is unknown, a wild bootstrap procedure is applied for inference. A Monte Carlo simulation study demonstrates that the nonparametric approach effectively controls Type I error and achieves power comparable to the traditional logistic regression method, outperforming it in cases with multiple intersections of the underlying IRCs. The impact of bandwidth and weight specification is also explored. Application to a verbal aggression dataset further illustrates the method's ability to detect subtle DIF patterns missed by parametric models. Overall, the proposed nonparametric framework provides a flexible and powerful alternative for detecting DIF, particularly in complex scenarios where traditional model-based assumptions may not hold.
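For intuition, the sketch below (Python, not the authors' implementation) illustrates the general idea under simplifying assumptions: Nadaraya-Watson kernel smoothing of the binary responses against a matching score gives an IRC estimate per group, the two curves are compared with a weighted squared-distance statistic, and a wild bootstrap with Rademacher multipliers approximates the p-value. The names kernel_irc, dif_statistic, and dif_wild_bootstrap, the Gaussian kernel, the uniform weight function, and the fixed bandwidth are illustrative choices, not those of the paper.

```python
# Hedged sketch of nonparametric DIF detection for one binary item.
# Kernel, weights, bandwidth, and bootstrap details are assumptions.
import numpy as np

def kernel_irc(score_grid, scores, responses, h):
    """Nadaraya-Watson estimate of P(Y = 1 | matching score) on a grid."""
    u = (score_grid[:, None] - scores[None, :]) / h      # grid x persons
    k = np.exp(-0.5 * u**2)                               # Gaussian kernel
    return (k @ responses) / np.clip(k.sum(axis=1), 1e-12, None)

def dif_statistic(grid, s_ref, y_ref, s_foc, y_foc, h, weights):
    """Weighted squared distance between the two smoothed IRCs."""
    diff = kernel_irc(grid, s_ref, y_ref, h) - kernel_irc(grid, s_foc, y_foc, h)
    return np.sum(weights * diff**2)

def dif_wild_bootstrap(s_ref, y_ref, s_foc, y_foc, h=1.0, n_boot=500, seed=0):
    """Wild-bootstrap p-value for the hypothesis of identical IRCs."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(min(s_ref.min(), s_foc.min()),
                       max(s_ref.max(), s_foc.max()), 50)
    w = np.ones_like(grid)                                 # uniform weights (assumption)
    t_obs = dif_statistic(grid, s_ref, y_ref, s_foc, y_foc, h, w)

    # Pooled fit and residuals under the null hypothesis of no DIF.
    scores = np.concatenate([s_ref, s_foc])
    y = np.concatenate([y_ref, y_foc])
    p0 = kernel_irc(scores, scores, y, h)
    resid = y - p0
    n_ref = len(s_ref)

    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        # Rademacher multipliers keep the conditional variance of residuals.
        v = rng.choice([-1.0, 1.0], size=len(y))
        y_star = p0 + v * resid
        t_boot[b] = dif_statistic(grid, scores[:n_ref], y_star[:n_ref],
                                  scores[n_ref:], y_star[n_ref:], h, w)
    return t_obs, (t_boot >= t_obs).mean()

# Toy usage: the focal group's item is uniformly harder (simulated DIF).
rng = np.random.default_rng(1)
s_r, s_f = rng.normal(0, 1, 300), rng.normal(0, 1, 300)
y_r = (rng.random(300) < 1 / (1 + np.exp(-s_r))).astype(float)
y_f = (rng.random(300) < 1 / (1 + np.exp(-(s_f - 0.7)))).astype(float)
stat, p_val = dif_wild_bootstrap(s_r, y_r, s_f, y_f, h=0.5)
print(f"statistic = {stat:.4f}, bootstrap p-value = {p_val:.3f}")
```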
Similar Papers
Reducing Differential Item Functioning via Process Data
Applications
Makes tests fairer for everyone by examining how test-takers solve problems.
Robust Estimation of Item Parameters via Divergence Measures in Item Response Theory
Methodology
Makes test scores more trustworthy, even when test-takers guess.
Reliability-Targeted Simulation of Item Response Data: Solving the Inverse Design Problem
Methodology
Makes tests fairer by controlling how much information they provide.