

Peer Review of Teleradiology at a Teleradiology Clinic: Comparison of Unacceptable Diagnosis and Clinically Significant Discrepancy between Radiology Sections and Imaging Modalities

  • Hyung Suk Seo (Department of Radiology, Eulji University Uijeongbu Hospital) ;
  • Jai Soung Park (Department of Radiology, Soonchunhyang University Bucheon Hospital) ;
  • Yu-Whan Oh (Department of Radiology, Korea University Anam Hospital) ;
  • Dongwook Sung (Department of Radiology, Kyung Hee University Hospital) ;
  • A Leum Lee (Department of Radiology, Soonchunhyang University Bucheon Hospital)
  • Received: 2020.11.03
  • Revised: 2021.02.03
  • Published: 2021.11.01

Abstract


Purpose The purpose of this study was to evaluate the rates of unacceptable diagnosis and clinically significant diagnostic discrepancy across radiology sections and imaging modalities through a peer review of teleradiology.

Materials and Methods Teleradiology peer reviews performed at a Korean teleradiology clinic in 2018 and 2019 were included. Peer review scores were classified as acceptable or unacceptable diagnoses, and diagnostic discrepancies were classified as clinically insignificant or clinically significant. Diagnostic discrepancy rates and clinical significance were compared among radiology sections and imaging modalities using the chi-square test.

Results Of 1312 peer reviews, 117 (8.9%) cases had unacceptable diagnoses. Of 462 diagnostic discrepancies, 104 (21.6%) were clinically significant. Among radiology sections, the unacceptable diagnosis rate was highest in the musculoskeletal section (21.4%) (each p < 0.05) and was higher in the abdominal section (7.3%) than in the neuro section (1.3%) (p < 0.05). The proportion of significant discrepancy was higher in the chest section (32.7%) than in the musculoskeletal (19.5%) and abdominal (17.1%) sections (each p < 0.05). Among modalities, the unacceptable diagnosis rate was higher for MRI (16.2%) than for plain radiography (7.8%) (p < 0.05); significant discrepancy did not differ among modalities.

Conclusion Peer review provides the rates of unacceptable diagnosis and clinically significant discrepancy in teleradiology; these rates differ by subspecialty and imaging modality.

