Year: 2018 Volume: 5 Issue: 4 Page Range: 682 - 700 Text Language: English DOI: 10.21449/ijate.472185 Index Date: 14-11-2020

Automating Simulation Research for Item Response Theory using R

Abstract:
A simulation study is a useful tool for examining how validly item response theory (IRT) models can be applied in various settings. Typically, a large number of replications is required to obtain the desired precision. However, many standard software packages in IRT, such as MULTILOG and BILOG, are not well suited for a simulation study requiring a large number of replications because they were developed as stand-alone software packages best suited for a single run. This article demonstrates how built-in R functions can be used to automate a simulation study built around stand-alone IRT software packages. For demonstration purposes, MULTILOG is used in the example code in the appendices, but the overall framework of the simulation study and the built-in R functions described in this article can be applied to simulation studies using other stand-alone software packages as well.
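The general framework described in the abstract can be illustrated with base R alone: simulate response data, write them to disk, invoke the stand-alone program from R, and harvest the estimates from its text output with regular expressions, repeating the cycle for each replication. The sketch below is not the authors' appendix code; the executable name mlg.exe, the command file run.mlg, the output file run.out, and the "ITEM" line pattern used for parsing are placeholder assumptions standing in for whatever the chosen stand-alone package actually expects.

```r
## Minimal sketch of an automated IRT simulation loop in base R.
## Assumptions (not MULTILOG's documented interface): the program is
## launched as "mlg.exe run.mlg", reads "resp.dat", and writes item
## parameter estimates on lines beginning with "ITEM" in "run.out".

set.seed(2018)

n_items   <- 20
n_persons <- 1000
n_reps    <- 100                      # number of replications

a_true <- rlnorm(n_items, 0, 0.3)     # true discriminations
b_true <- rnorm(n_items)              # true difficulties

simulate_2pl <- function(a, b, n_persons) {
  theta <- rnorm(n_persons)
  # P(X = 1) = logistic(a_j * (theta_i - b_j)) under the 2PL model
  p <- plogis(sweep(outer(theta, b, "-"), 2, a, "*"))
  matrix(rbinom(length(p), 1, p), nrow = n_persons)
}

estimates <- vector("list", n_reps)

for (r in seq_len(n_reps)) {
  resp <- simulate_2pl(a_true, b_true, n_persons)

  # 1. Write the response strings the external program will read
  writeLines(apply(resp, 1, paste0, collapse = ""), "resp.dat")

  # 2. Run the stand-alone calibration program (hypothetical call)
  system("mlg.exe run.mlg", wait = TRUE)

  # 3. Scrape the numeric estimates from the text output with regex
  out        <- readLines("run.out")
  item_lines <- grep("^\\s*ITEM\\s+\\d+", out, value = TRUE)
  estimates[[r]] <- lapply(regmatches(item_lines,
                           gregexpr("-?\\d+\\.\\d+", item_lines)),
                           as.numeric)
}

## Bias and RMSE summaries would then be computed by comparing the
## recovered estimates in `estimates` against a_true and b_true.
```

The key design point is that R drives the whole cycle: each replication's data file and command file are written programmatically, the external program is called with system(), and its text output is parsed with regular expressions, so hundreds of runs require no manual interaction.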
Keywords:

Document Type: Article Article Type: Research Article Access Type: Open Access
  • Bandalos, D. L. (2006). The use of Monte Carlo studies in structural equation modeling research. In Structural equation modeling: A second course (pp. 385–426). Greenwich, CT: Information Age.
  • De Ayala, R. J. (2009). Theory and practice of item response theory. New York, NY: The Guilford Press.
  • Finch, H. (2008). Estimation of item response theory parameters in the presence of missing data. Journal of Educational Measurement, 45, 225–245.
  • Friedl, J. (2006). Mastering regular expressions. Sebastopol, CA: O’Reilly Media, Inc.
  • Harwell, M., Stone, C. A., Hsu, T.-C., & Kirisci, L. (1996). Monte Carlo studies in item response theory. Applied Psychological Measurement, 20, 101–125.
  • Kim, H. J., Brennan, R. L., & Lee, W. C. (2017). Structural Zeros and Their Implications With Log‐Linear Bivariate Presmoothing Under the Internal‐Anchor Design. Journal of Educational Measurement, 54, 145-164.
  • Kim, K. Y., & Lee, W. C. (2017). The Impact of Three Factors on the Recovery of Item Parameters for the Three-Parameter Logistic Model. Applied Measurement in Education, 30, 228-242.
  • Kim, S., & Lee, W. C. (2006). An Extension of Four IRT Linking Methods for Mixed‐Format Tests. Journal of Educational Measurement, 43, 53-76.
  • Nader, I. W., Tran, U. S., & Voracek, M. (2015). Effects of Initial Values and Convergence Criterion in the Two-Parameter Logistic Model When Estimating the Latent Distribution in BILOG-MG 3. PloS one, 10, e0140163.
  • Partchev, I. (2009). irtoys: Simple interface to the estimation and plotting of IRT models. R package version 0.1, 2.
  • R Core Team. (2015). R: A language and environment for statistical computing [Computer software manual]. Vienna, Austria. Retrieved from http://www.R-project.org/ (ISBN 3-900051-07-0)
  • Reckase, M. D. (1979). Unifactor latent trait models applied to multifactor tests: Results and implications. Journal of Educational and Behavioral Statistics, 4, 207–230.
  • Spector, P. (2008). Data manipulation with R. New York, NY: Springer.
  • Stone, C. A. (2000). Monte Carlo based null distribution for an alternative goodness‐of‐fit test statistic in IRT models. Journal of Educational Measurement, 37, 58-75.
  • Thissen, D., Chen, W.-H., & Bock, R. D. (2003). MULTILOG 7 for Windows: Multiple-category item analysis and test scoring using item response theory [Computer software]. Lincolnwood, IL: Scientific Software International.
  • Zimowski, M. F., Muraki, E., Mislevy, R. J., & Bock, R. D. (1996). BILOG-MG: Multiple-group IRT analysis and test maintenance for binary items [Computer software]. Chicago, IL: Scientific Software International.
APA Lee, S., Choi, Y., & Cohen, A. (2018). Automating Simulation Research for Item Response Theory using R. International Journal of Assessment Tools in Education, 5(4), 682-700. 10.21449/ijate.472185
Chicago Lee, Sunbok, Choi, Youn-Jeng, and Cohen, Allan. "Automating Simulation Research for Item Response Theory using R." International Journal of Assessment Tools in Education 5, no. 4 (2018): 682-700. 10.21449/ijate.472185
MLA Lee, Sunbok, Choi, Youn-Jeng, and Cohen, Allan. "Automating Simulation Research for Item Response Theory using R." International Journal of Assessment Tools in Education, vol. 5, no. 4, 2018, pp. 682-700. 10.21449/ijate.472185
AMA Lee S, Choi Y, Cohen A. Automating Simulation Research for Item Response Theory using R. International Journal of Assessment Tools in Education. 2018; 5(4): 682-700. 10.21449/ijate.472185
Vancouver Lee S, Choi Y, Cohen A. Automating Simulation Research for Item Response Theory using R. International Journal of Assessment Tools in Education. 2018; 5(4): 682-700. 10.21449/ijate.472185
IEEE Lee S, Choi Y, Cohen A, "Automating Simulation Research for Item Response Theory using R," International Journal of Assessment Tools in Education, vol. 5, no. 4, pp. 682-700, 2018. 10.21449/ijate.472185
ISNAD Lee, Sunbok et al. "Automating Simulation Research for Item Response Theory using R". International Journal of Assessment Tools in Education 5/4 (2018), 682-700. https://doi.org/10.21449/ijate.472185