Article

Detecting Differential Item Functioning and Differential Step Functioning Due to Differences that Should Matter

Authors
  • Tess Miller
  • Saad Chahine
  • Ruth A. Childs

Abstract

This study illustrates the use of differential item functioning (DIF) and differential step functioning (DSF) analyses to detect differences in item difficulty that are related to experiences of examinees, such as their teachers’ instructional practices, that are relevant to the knowledge, skill, or ability the test is intended to measure. This analysis is in contrast to the typical use of DIF or DSF to detect differences related to characteristics of examinees, such as gender, language, or cultural knowledge, that should be irrelevant. Using data from two forms of Ontario’s Grade 9 Assessment of Mathematics, analyses were performed comparing groups of students defined by their teachers’ instructional practices. All constructed-response items were tested for DIF using the Mantel chi-square, the standardized Liu-Agresti cumulative common log-odds ratio, and the standardized Cox’s noncentrality parameter. Items exhibiting moderate to large DIF were subsequently tested for DSF. In contrast to typical DIF or DSF analyses, which inform item development, these analyses have the potential to inform instructional practice.
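As a rough illustration of the first of the statistics named above (not drawn from the article itself), the sketch below computes the Mantel chi-square for a single polytomous item, with examinees matched on a stratifying variable such as total test score. The arrays item_scores, group, and strata and the label "focal" are hypothetical placeholders; the published analyses may have been carried out with different software.

```python
# Minimal sketch of the Mantel (1963) chi-square for polytomous DIF,
# following the formulation popularized by Zwick, Donoghue, and Grima (1993).
# Data layout and variable names are assumptions, not taken from the study.
import numpy as np
from scipy.stats import chi2


def mantel_chi_square(item_scores, group, strata, focal_label="focal"):
    """Mantel chi-square (1 df) for one polytomous item.

    item_scores : per-examinee scores on the studied item (e.g., 0-3)
    group       : per-examinee group labels (reference vs. focal)
    strata      : per-examinee matching variable (e.g., total test score)
    """
    item_scores = np.asarray(item_scores, dtype=float)
    group = np.asarray(group)
    strata = np.asarray(strata)

    obs, exp, var = 0.0, 0.0, 0.0
    for k in np.unique(strata):
        y = item_scores[strata == k]
        g = group[strata == k]
        n = y.size
        n_focal = np.sum(g == focal_label)
        n_ref = n - n_focal
        if n < 2 or n_focal == 0 or n_ref == 0:
            continue  # a stratum with only one group contributes nothing

        score_sum = y.sum()
        score_sq_sum = (y ** 2).sum()
        obs += y[g == focal_label].sum()        # F_k: observed focal score sum
        exp += n_focal * score_sum / n          # E(F_k) under no DIF
        var += (n_ref * n_focal) / (n ** 2 * (n - 1)) * (
            n * score_sq_sum - score_sum ** 2   # Var(F_k)
        )

    stat = (obs - exp) ** 2 / var
    return stat, chi2.sf(stat, df=1)
```

A large chi-square (small p-value) signals that, after matching on the stratifying score, the two groups still differ in their expected item scores; effect-size measures such as the Liu-Agresti log-odds ratio would then be used, as in the study, to gauge whether the DIF is moderate or large before moving on to DSF.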

Keywords: Test Construction, Standardized Tests

How to Cite:

Miller, T., Chahine, S., & Childs, R. A. (2010). “Detecting Differential Item Functioning and Differential Step Functioning Due to Differences that Should Matter”, Practical Assessment, Research, and Evaluation 15(1): 10. doi: https://doi.org/10.7275/dzm4-q558
