%0 Journal Article
%J Journal of Educational Measurement
%D 2017
%T Dual-Objective Item Selection Criteria in Cognitive Diagnostic Computerized Adaptive Testing
%A Kang, Hyeon-Ah
%A Zhang, Susu
%A Chang, Hua-Hua
%X The development of cognitive diagnostic computerized adaptive testing (CD-CAT) has provided a new perspective for gaining information about examinees' mastery of a set of cognitive attributes. This study proposes a new item selection method within the framework of dual-objective CD-CAT that simultaneously addresses examinees' attribute mastery status and overall test performance. The new procedure is based on the Jensen-Shannon (JS) divergence, a symmetrized version of the Kullback-Leibler divergence. We show that the JS divergence resolves the noncomparability problem of the dual information index and has close relationships with Shannon entropy, mutual information, and Fisher information. The performance of the JS divergence is evaluated in simulation studies in comparison with the methods available in the literature. Results suggest that the JS divergence achieves comparable or more precise recovery of the latent trait variables than the existing methods and maintains practical advantages in computation and item pool usage.
%B Journal of Educational Measurement
%V 54
%P 165–183
%U http://dx.doi.org/10.1111/jedm.12139
%R 10.1111/jedm.12139

%0 Journal Article
%J Applied Psychological Measurement
%D 2016
%T Parameter Drift Detection in Multidimensional Computerized Adaptive Testing Based on Informational Distance/Divergence Measures
%A Kang, Hyeon-Ah
%A Chang, Hua-Hua
%X An informational distance/divergence-based approach is proposed to detect the presence of parameter drift in multidimensional computerized adaptive testing (MCAT). The study presents significance-testing procedures for identifying changes in multidimensional item response functions (MIRFs) over time, based on informational distance/divergence measures that capture the discrepancy between two probability functions. To approximate the MIRFs from the observed response data, the k-nearest neighbors algorithm is used together with the random search method. A simulation study suggests that the distance/divergence-based drift measures identify instances of parameter drift in MCAT effectively, showing moderate power with samples as small as 500 examinees and excellent power when the sample size reached 1,000. The proposed drift measures also adequately controlled the Type I error rate at the nominal level under the null hypothesis.
%B Applied Psychological Measurement
%V 40
%P 534–550
%U http://apm.sagepub.com/content/40/7/534.abstract
%R 10.1177/0146621616663676
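
The first record above characterizes the Jensen-Shannon (JS) divergence as a symmetrized version of the Kullback-Leibler (KL) divergence. For reference, the standard two-distribution form is reproduced here; this is the textbook definition, not necessarily the exact dual-objective item selection index used in the article:

\[
\mathrm{JS}(P \,\|\, Q) \;=\; \tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, M) \;+\; \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \,\|\, M),
\qquad M = \tfrac{1}{2}(P + Q),
\]
where \(D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)}\). Unlike the KL divergence, the JS divergence is symmetric in its arguments and bounded.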
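
The second record mentions approximating multidimensional item response functions (MIRFs) from observed response data with the k-nearest neighbors algorithm. The snippet below is a minimal illustrative sketch of k-nearest-neighbor estimation of a response probability in a multidimensional latent trait space, not the authors' procedure (which also involves a random search step); the function knn_irf_estimate and all variable names are placeholders introduced here.

import numpy as np

def knn_irf_estimate(query_theta, thetas, responses, k=50):
    # Illustrative sketch only: estimate P(correct | theta = query_theta) for one item
    # by averaging the 0/1 responses of the k examinees closest to query_theta.
    dists = np.linalg.norm(thetas - query_theta, axis=1)  # Euclidean distances in theta space
    nearest = np.argsort(dists)[:k]                       # indices of the k closest examinees
    return responses[nearest].mean()                      # proportion correct among the neighbors

# Toy usage with simulated two-dimensional latent traits (illustrative only)
rng = np.random.default_rng(0)
thetas = rng.normal(size=(1000, 2))
true_p = 1.0 / (1.0 + np.exp(-thetas.sum(axis=1)))        # a simple logistic surface standing in for a "true" MIRF
responses = rng.binomial(1, true_p)
print(knn_irf_estimate(np.array([0.5, 0.5]), thetas, responses, k=50))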