References
- Ministry of Education. (2022). Mathematics curriculum. Notification of Ministry of Education No. 2022-33 [Vol. 8].
- Kwon, J. R. (2024). Analysis of the trend of mathematical achievement of students according to school grade change in TIMSS. Communications of Mathematical Education, 38(2), 121-144. https://doi.org/10.7468/jksmee.2024.38.2.121
- Park, J. H., & Kim, S. (2015). The analysis of characteristic achievement of TIMSS 2011 G8 high-performing countries according to the mathematics cognitive attributes. Journal of Educational Research in Mathematics, 25(3), 303-321.
- Sang, K. A., Kim, K. H., Park, S. W., Jeon, S. K., Park, M. M., & Lee, J. W. (2020). An international comparative study on the trend of mathematical and scientific achievement: TIMSS 2019 (RRE 2020-10). Korea Institute of Curriculum and Evaluation.
- Song, M. Y., & Kim, S. H. (2007). Investigating the hierarchical nature of content and cognitive domains in the mathematics curriculum for Korean middle school students via assessment items. School Mathematics, 9(2), 223-240.
- Lee, K. H., Yoo, Y. J., & Tak, B. (2021). Towards data-driven statistics education: An exploration of restructuring the mathematics curriculum. School Mathematics, 23(3), 361-386. https://doi.org/10.29275/sm.2021.09.23.3.361
- Rim, H., Kim, S. K., & Park, J. H. (2018). Development of assessment framework and items of NAEA considering the math competencies of the 2015 revised mathematics curriculum. School Mathematics, 20(1), 65-82. https://doi.org/10.29275/sm.2018.03.20.1.65
- Tak, B. (2018). An analysis on classifying and representing data as statistical literacy: Focusing on elementary mathematics curriculum for 1st and 2nd grades. Journal of Elementary Mathematics Education in Korea, 22(3), 221-240.
- Han, C., & Park, M. (2015). A comparison study on mathematics assessment frameworks: Focusing on NAEP 2015, TIMSS 2015 and PISA 2015. The Mathematics Education, 54(3), 261-282. https://doi.org/10.7468/mathedu.2015.54.3.261
- Ackerman, T. A. (1994). Using multidimensional item response theory to understand what items and tests are measuring. Applied Measurement in Education, 7(4), 255-278. https://doi.org/10.1207/s15324818ame0704_1
- Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. Longman.
- Arikan, S. (2015). Construct validity of TIMSS 2011 mathematics cognitive domains for Turkish students. International Online Journal of Educational Sciences, 7(1), 29-44.
- Balfaqeeh, A., Mansour, N., & Forawi, S. (2022). Factors influencing students' achievements in the content and cognitive domains in TIMSS 4th grade science and mathematics in the United Arab Emirates. Education Sciences, 12(9), 618. https://doi.org/10.3390/educsci12090618
- Buck, G. (1994). The appropriacy of psychometric measurement models for testing second language listening comprehension. Language Testing, 11(2), 145-170. https://doi.org/10.1177/026553229401100204
- da Silva, M. A., Liu, R., Huggins-Manley, A. C., & Bazan, J. L. (2019). Incorporating the Q-matrix into multidimensional item response theory models. Educational and Psychological Measurement, 79(4), 665-687. https://doi.org/10.1177/0013164418814898
- Dancey, C. P., & Reidy, J. (2017). Statistics without maths for psychology. Pearson.
- Delil, A. (2019). How fifth graders are assessed through central exams in Turkey: A comparison with TIMSS 2019 Assessment Framework. International Online Journal of Educational Sciences, 11(3), 222-234.
- Embretson, S. E., & Reise, S. (2000). Item response theory as model-based measurement. In Embretson, S. E., & Reise, S. (Eds.), Item response theory for psychologists (pp. 158-186). Lawrence Erlbaum Associates.
- Fishbein, B., Foy, P., & Tyack, L. (2020). Reviewing the TIMSS 2019 achievement item statistics. In Martin, M. O., von Davier, M., & Mullis, I. V. S. (Eds.), Methods and procedures: TIMSS 2019 technical report (pp. 10.1-10.70). TIMSS & PIRLS International Study Center, Boston College. https://timssandpirls.bc.edu/timss2019/methods/chapter-10.html
- Foy, P., Fishbein, B., von Davier, M., & Yin, L. (2020). Implementing the TIMSS 2019 scaling methodology. In Martin, M. O., von Davier, M., & Mullis, I. V. S. (Eds.), Methods and procedures: TIMSS 2019 technical report (pp. 12.1-12.146). TIMSS & PIRLS International Study Center, Boston College. https://timssandpirls.bc.edu/timss2019/methods/chapter-12.html
- George, A. C., & Robitzsch, A. (2018). Focusing on interactions between content and cognition: a new perspective on gender differences in mathematical sub-competencies. Applied Measurement in Education, 31(1), 79-97. https://doi.org/10.1080/08957347.2017.1391260
- Gierl, M. J., Bisanz, J., Bisanz, G. L., & Boughton, K. A. (2003). Identifying content and cognitive skills that produce gender differences in mathematics: A demonstration of the multidimensionality-based DIF analysis paradigm. Journal of Educational Measurement, 40(4), 281-306.
- Harks, B., Klieme, E., Hartig, J., & Leiss, D. (2014). Separating cognitive and content domains in mathematical competence. Educational Assessment, 19(4), 243-266. https://doi.org/10.1080/10627197.2014.964114
- Jang, Y. J. (2022). Reliability and validity evidence of diagnostic methods: Comparison of diagnostic classification models and item response theory-based methods [Unpublished doctoral dissertation, University of Minnesota].
- Moore, D. (1992). Teaching statistics as a respectable subject. In F. Gordon & S. Gordon (Eds.), Statistics for the twenty-first century (pp. 14-25). The Mathematical Association of America.
- Mullis, I. V. S., & Martin, M. O. (2017). TIMSS 2019 assessment frameworks. Retrieved from Boston College, TIMSS & PIRLS International Study Center.
- Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D. L., & Fishbein, B. (2020). TIMSS 2019 international results in mathematics and science. Retrieved from Boston College, TIMSS & PIRLS International Study Center.
- Natesan, P., Nandakumar, R., Minka, T., & Rubright, J. D. (2016). Bayesian prior choice in IRT estimation using MCMC and variational Bayes. Frontiers in Psychology, 7, 1422. https://doi.org/10.3389/fpsyg.2016.01422
- Niss, M. (2003). Mathematical competencies and the learning of mathematics: The Danish KOM project. In A. Gagatsis & S. Papastavridis (Eds.), Mediterranean Conference on Mathematical Education (pp. 115-124). Athens, Greece: Hellenic Mathematical Society and Cyprus Mathematical Society.
- Novikasari, I. (2016). The improvement of mathematics content knowledge on elementary school teacher candidates in problem-based learning models. International Journal of Education and Research, 4(17), 153-162.
- OECD. (2023). PISA 2022 assessment and analytical framework. OECD Publishing. https://doi.org/10.1787/19963777
- Plummer, M. (2017). JAGS Version 4.3.0 user manual. Retrieved from https://sourceforge.net/projects/mcmc-jags/files/Manuals/4.x/
- Shu, T., Luo, G., Luo, Z., Yu, X., Guo, X., & Li, Y. (2023). An explicit form with continuous attribute profile of the partial mastery DINA model. Journal of Educational and Behavioral Statistics, 48(5), 573-602.
- Su, Y. S., & Yajima, M. (2020). R2jags: Using R to run 'JAGS'. R package version 0.6-1. Retrieved from https://CRAN.R-project.org/package=R2jags
- Tatsuoka, K. K. (1983). Rule-space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20, 345-354.
- von Davier, M. (2020). TIMSS 2019 scaling methodology: Item response theory, population models, and linking across modes. In M. O. Martin, M. von Davier, & I. V. S. Mullis (Eds.), Methods and procedures: TIMSS 2019 technical report (pp. 11.1-11.25). TIMSS & PIRLS International Study Center, Boston College. https://timssandpirls.bc.edu/timss2019/methods/chapter-11.html
- Wu, M., & Adams, R. (2006). Modelling mathematics problem solving item responses using a multidimensional IRT model. Mathematics Education Research Journal, 18(2), 93-113.
- Young, J. W., Cho, Y., Ling, G., Cline, F., Steinberg, J., & Stone, E. (2008). Validity and fairness of state standards-based assessments for English language learners. Educational Assessment, 13, 170-192.
- Zhang, J. (2004). Comparison of unidimensional and multidimensional approaches to IRT parameter estimation (ETS Research Report 04-44). Educational Testing Service.