ITEM ANALYSIS OF MULTIPLE-CHOICE READING LITERACY INSTRUMENTS USING ITEM RESPONSE THEORY

Yanika Lunrasri
Kamonwan Tangdhanakanond
Shotiga Pasiphol

Abstract

Reading literacy instruments were designed and validated to assess students’ reading literacy performance. The purposes of this study were 1) to validate the overall model fit and item fit of the reading literacy instruments, 2) to analyze the item discrimination and item difficulty parameters of the instruments, and 3) to analyze the reliability coefficients of the instruments. A total of 277 Grade 9 students participated in the study. The instruments consisted of 20 multiple-choice pretest items and 20 multiple-choice posttest items. Five item response theory (IRT) measurement models were fitted and compared: 1) the one-parameter logistic (1PL) model, 2) the two-parameter logistic (2PL) model, 3) the three-parameter logistic (3PL) model, 4) the multidimensional item response theory (MIRT) model, and 5) the 2PL bifactor model. The 2PL bifactor model was found to be the most appropriate model for the data, with only three misfitting items. A majority of the items had good discrimination values, except for three items that needed to be modified. The difficulty estimates were within the acceptable range. Moreover, the instruments yielded high internal consistency reliability.
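The two-parameter logistic (2PL) model compared above also serves as the item-level model in the winning bifactor specification. As a minimal sketch (not the authors' analysis code; the function name and the item parameters below are illustrative), the 2PL gives the probability of a correct response for an examinee with ability theta on an item with discrimination a and difficulty b:

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL model:
    P(theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item with discrimination a = 1.2 and difficulty b = 0.5
a, b = 1.2, 0.5
print(p_2pl(0.5, a, b))  # at theta == b the probability is exactly 0.5
print(p_2pl(2.0, a, b))  # examinees above the item's difficulty respond correctly more often
```

In practice such models are typically estimated with dedicated software, e.g., the mirt package for R (Chalmers, 2012) cited in the reference list.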

Article Details

How to Cite
Lunrasri, Y., Tangdhanakanond, K., & Pasiphol, S. (2022). ITEM ANALYSIS OF MULTIPLE-CHOICE READING LITERACY INSTRUMENTS USING ITEM RESPONSE THEORY. Journal of Education and Innovation, 24(4), 61–72. Retrieved from https://so06.tci-thaijo.org/index.php/edujournal_nu/article/view/248937
Section
Research Articles

References

Baghaei, P., & Ravand, H. (2016). Modeling local item dependence in cloze and reading comprehension tests using testlet response theory. Psicologica, 37, 85-104.

Baker, F. B. (2001). The basics of item response theory (2nd ed.). College Park, MD: ERIC Clearinghouse on Assessment and Evaluation.

Bock, R. D., & Aitkin, M. (1981). Marginal maximum likelihood estimation of item parameters: Application of an EM algorithm. Psychometrika, 46(4), 443-459.

Byun, J. H., & Lee, Y. W. (2016). The latent trait modeling of passage-based reading comprehension test: Testlet-based MIRT approach. English Language Assessment, 11, 25-45.

Cai, Y., & Kunnan, A. J. (2018). Examining the inseparability of content knowledge from LSP reading ability: An approach combining bifactor-multidimensional item response theory and structural equation modeling. Language Assessment Quarterly, 15(2), 109-129.

Chalmers, R. P. (2012). mirt: A multidimensional item response theory package for the R environment. Journal of Statistical Software, 48(6), 1–29. https://doi.org/10.18637/jss.v048.i06

Chandai, S. (2016). Development of an instructional model based on scaffolded reading experiences approach and self-regulated learning for enhancing reading literacy of lower secondary school students (Doctoral dissertation). Bangkok: Chulalongkorn University.

Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). Boston, MA: Pearson.

Debelak, R., & Koller, I. (2020). Testing the local independence assumption of the Rasch model with Q3-based nonparametric model tests. Applied Psychological Measurement, 44(2), 103-117.

DeMars, C. E. (2006). Application of the bi-factor multidimensional item response theory model to testlet-based tests. Journal of Educational Measurement, 43(2), 145-168.

DeMars, C. E. (2012). Confirming testlet effects. Applied Psychological Measurement, 36(2), 104-121.

Desjardins, C. D., & Bulut, O. (2018). Handbook of educational measurement and psychometrics using R. Boca Raton, FL: CRC Press.

Diowvilai, D., Samranjai, J., Sommano, B., Janwong, V., & Mookham, T. (2012). Development of elementary children Grade 4-6 with reading literacy through Lampang learning enrichment network (Research report). Lampang: Lampang Rajabhat University.

Fox, J.-P., Wenzel, J., & Klotzke, K. (2020). The Bayesian covariance structure model for testlets. Journal of Educational and Behavioral Statistics, 46(2), 219-243.

Gibbons, R. D., & Hedeker, D. R. (1992). Full-information item bi-factor analysis. Psychometrika, 57, 423-436.

Jumnaksarn, S. (2013). The development of causal relationship model of factors influencing on reading literacy of 15-year-old students in Thailand. Journal of Education Research, Faculty of Education, Srinakharinwirot University, 8(2), 213-230.

Kanjanawasee, S. (2012). Modern test theory (4th ed.). Bangkok: Chulalongkorn University Printery.

Kim, W. H. (2017). Application of the IRT and TRT models to a reading comprehension test (Doctoral dissertation). Tennessee: Middle Tennessee State University.

Min, S., & He, L. (2014). Applying unidimensional and multidimensional item response theory models in testlet-based reading assessment. Language Testing, 31(4), 453-477.

Nilsawang, P. (2011). Factors related to reading literacy of grade 9 students under primary education service area office in Sisaket province (Master's thesis). Mahasarakham: Mahasarakham University.

OECD. (2019a). PISA 2018 assessment and analytical framework. Paris: OECD Publishing.

OECD. (2019b). Thailand - Country note - PISA 2018 results. Paris: OECD Publishing.

Praputtakun, P., Dahsah, C., Tambunchong, C., & Mateapinikul, P. (2013). The case study of Prathomsuksa 6 students’ scientific literacy and reading ability. Journal of Education Thaksin University, 13, 127-140.

Quaigrain, K., & Arhin, A. K. (2017). Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Cogent Education, 4(1). https://doi.org/10.1080/2331186X.2017.1301013

Sabbag, A. G., & Zieffler, A. (2015). Assessing learning outcomes: An analysis of the goals-2 instrument. Statistics Education Research Journal, 14(2), 93-116.

The Institute for the Promotion of Teaching Science and Technology. (2018). PISA 2015 results in science, reading, and mathematics: Excellence and equity in education. Bangkok: Success Publication.

The Institute for the Promotion of Teaching Science and Technology. (2020). PISA 2018 results: What students know and can do. Retrieved from https://pisathailand.ipst.ac.th/issue-2019-48/