The Study of the Mode Effect between Computer-based and Paper-based Science Tests in TIMSS 2019

  • Kim, Hyun-Kyung (Division of Science Education (Chemistry) and Institute of Science Education, Jeonbuk National University)
  • Received : 2020.11.02
  • Accepted : 2020.12.01
  • Published : 2021.02.20

Abstract

This study analyzed the science assessment items administered in both paper-based and computer-based modes in the TIMSS 2019 pre-test, an international academic achievement assessment, to examine whether the test mode affected the percentage of correct answers at each grade level. Overall, there was no significant difference in the correct-answer rates of science items between the two test modes for either 4th or 8th graders. By grade, the difference in correct-answer rates was relatively larger for 4th graders than for 8th graders; by item type, it was relatively larger for constructed-response items than for multiple-choice items. A content analysis of the items affected by test mode showed that multiple-choice items differed little between the paper-based and computer-based assessments, whereas constructed-response items tended to show lower correct-answer rates on the computer-based assessment than on the paper-based assessment. In addition, the decline in correct-answer rates on constructed-response items was larger for 4th graders than for 8th graders. Based on these results, this study provides implications for the development and classroom introduction of computer-based assessment in Korea and discusses educational implications for helping computer-based assessment become well established as a form of school assessment.
