Abstract:
With rising concerns over the fairness of psychometric tests, Differential Item Functioning (DIF) analysis is increasingly applied in test development and use. Despite its widespread application, many questions remain unresolved about the impact of DIF on test takers' overall performance and on the fairness of the test. The Rasch model and its invariance principle provide versatile tools for investigating this question, and the present study addressed it within that framework. The participants were 1,651 examinees who had taken a high-stakes test in 2010. A DIF analysis of the data showed that a large number of items were in fact functioning differentially for examinees from different academic backgrounds. Despite the presence of so many DIF items, ability estimates obtained from the original test and those obtained from a composite containing only DIF-free items were highly invariant. The implications of these results for test validity and fairness are discussed.
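For reference, the framework assumed here is the standard dichotomous Rasch model, under which the probability that person n answers item i correctly depends only on the difference between person ability and item difficulty (the symbols \theta_n and b_i below are conventional notation, not taken from the original):

\[ P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}} \]

Under the model's invariance principle, the difficulty estimate b_i should not depend on which subgroup of examinees is used to calibrate it; an item is flagged for DIF when its difficulty estimates differ significantly across groups matched on ability.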