Multifaceted validity analysis of clinical skills test in the educational field setting

  • Han Chae (School of Korean Medicine, Pusan National University) ;
  • Min-jung Lee (Department of Medical Education, Seoul National University College of Medicine) ;
  • Myung-Ho Kim (Department of Internal Korean Medicine, Woosuk University Medical Center) ;
  • Kyuseok Kim (Department of Ophthalmology, Otorhinolaryngology, and Dermatology of Korean Medicine, College of Korean Medicine, Kyung Hee University) ;
  • Eunbyul Cho (KM Science Division, Korea Institute of Oriental Medicine)
  • Received : 2023.09.01
  • Accepted : 2024.02.16
  • Published : 2024.03.01

Abstract

Introduction: The importance of clinical skills training in traditional Korean medicine education is increasingly emphasized. Because clinical skills tests are high-stakes assessments that determine success in the national licensing examination, reliable multifaceted methods for analyzing such tests in actual educational settings are essential. In this study, we applied multifaceted validity evaluation methods to the assessment results of a cardiopulmonary resuscitation (CPR) module to confirm their applicability and effectiveness. Methods: We analyzed the multidimensional validity of a CPR skills test administered in clinical education settings over the past three years using internal consistency, factor analysis, generalizability theory (G-study and D-study), ANOVA, Kendall's tau, and descriptive statistics. Results: Factor analysis and internal consistency analysis showed that the evaluation rubric had an unstable structure and low concordance. The G-study showed that measurement error in the clinical skills assessment was large, attributable to the rater facet and to unexplained error. The D-study showed that the rater variance component must be substantially reduced for the assessment to be valid. ANOVA and Kendall's tau confirmed that rater heterogeneity was a problem. Discussion and Conclusion: The validity of clinical skills tests should be continuously evaluated and managed in two stages: development before administration and the actual implementation. This study presents specific methods for analyzing the validity of clinical skills training and testing in actual educational settings, and it lays a foundation for competency-based, evidence-based education in practical clinical training.
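The abstract names the statistical procedures but not how they are computed. The two reliability measures most central to the findings (internal consistency of the rubric and rater concordance) can be sketched as follows; this is a minimal illustration assuming NumPy and SciPy are available, and the score matrix, rater totals, and 0-2 rubric scale are hypothetical values invented for illustration, not data from the study.

```python
import numpy as np
from scipy.stats import kendalltau

def cronbach_alpha(scores) -> float:
    """Internal consistency (Cronbach's alpha) for an examinee-by-item score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of rubric items
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinee total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical rubric scores: 6 examinees x 4 checklist items, each rated 0-2
scores = np.array([
    [2, 2, 1, 2],
    [1, 1, 1, 1],
    [2, 1, 2, 2],
    [0, 1, 0, 1],
    [2, 2, 2, 1],
    [1, 0, 1, 0],
])
alpha = cronbach_alpha(scores)

# Rater concordance: Kendall's tau between two raters' total scores
# for the same six examinees (hypothetical totals)
rater_a = [10, 7, 9, 4, 8, 3]
rater_b = [9, 8, 9, 5, 7, 4]
tau, p_value = kendalltau(rater_a, rater_b)
```

A G-study would go further by partitioning total score variance into examinee, rater, and item facets (e.g., from ANOVA expected mean squares), and a D-study would then project how many raters or items are needed to reach an acceptable generalizability coefficient.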

Acknowledgement

This study was supported by a research grant from Pusan National University.
