Kidney Tumor Segmentation Using a Hybrid CNN-Transformer Network for Partial Nephrectomy Planning

  • Goun Kim (Dept. of Software Convergence, Seoul Women's University) ;
  • Jinseo An (Dept. of Software Convergence, Seoul Women's University) ;
  • Yubeen Lee (Dept. of Software Convergence, Seoul Women's University) ;
  • Helen Hong (Dept. of Software Convergence, Seoul Women's University)
  • Received : 2024.06.05
  • Accepted : 2024.07.30
  • Published : 2024.09.01

Abstract

In partial nephrectomy for kidney cancer treatment, accurate segmentation of the kidney tumor is crucial for surgical planning, as it provides essential information on the precise size and location of the tumor. However, segmentation is challenging because the tumor's intensity is similar to that of surrounding organs and its location and size vary widely across patients. In this study, we propose a hybrid network that integrates a convolutional neural network and a transformer to capture both local and global features, aiming to improve kidney tumor segmentation performance. We validated our method through comparative experiments with UNETR++, outperforming it with a Dice Similarity Coefficient (DSC) of 78.54% and a precision of 85.07%. Moreover, in the analysis by tumor size, our method reduced the over-segmentation and outlier cases observed with UNETR++.
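
For reference, the Dice Similarity Coefficient (DSC) and precision reported above are standard overlap metrics for binary segmentation masks. The snippet below is a minimal illustrative sketch (not the authors' evaluation code) of how the two metrics are typically computed with NumPy; the function and array names are assumptions introduced here for illustration.

    import numpy as np

    def dice_and_precision(pred, gt):
        # pred, gt: boolean NumPy arrays of identical shape (e.g. a CT volume),
        # where True marks voxels labeled as kidney tumor.
        # Assumes at least one positive voxel in each mask.
        pred = pred.astype(bool)
        gt = gt.astype(bool)
        tp = np.logical_and(pred, gt).sum()    # true positives (overlap volume)
        fp = np.logical_and(pred, ~gt).sum()   # false positives (over-segmentation)
        dsc = 2.0 * tp / (pred.sum() + gt.sum())   # DSC = 2|A∩B| / (|A| + |B|)
        precision = tp / (tp + fp)                 # precision = TP / (TP + FP)
        return dsc, precision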

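This page does not include implementation details of the proposed hybrid network, but the general idea of pairing convolution (local features) with self-attention (global context) can be sketched as follows. This is a purely hypothetical PyTorch block with assumed layer sizes and fusion strategy, not the proposed architecture.

    import torch
    import torch.nn as nn

    class HybridConvAttnBlock3D(nn.Module):
        # Illustrative hybrid block: a 3D convolution branch captures local
        # features, a multi-head self-attention branch captures global context,
        # and a 1x1x1 convolution fuses the two. Full self-attention over all
        # voxels is memory-heavy, so blocks like this are normally applied at
        # low-resolution encoder stages.
        def __init__(self, channels, num_heads=4):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv3d(channels, channels, kernel_size=3, padding=1),
                nn.InstanceNorm3d(channels),
                nn.GELU(),
            )
            self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
            self.norm = nn.LayerNorm(channels)
            self.fuse = nn.Conv3d(2 * channels, channels, kernel_size=1)

        def forward(self, x):                                 # x: (B, C, D, H, W)
            local = self.conv(x)
            b, c, d, h, w = x.shape
            tokens = self.norm(x.flatten(2).transpose(1, 2))  # (B, D*H*W, C)
            attn_out, _ = self.attn(tokens, tokens, tokens)   # global self-attention
            global_feat = attn_out.transpose(1, 2).reshape(b, c, d, h, w)
            return self.fuse(torch.cat([local, global_feat], dim=1))

Networks such as UNETR++ embed attention of this kind within a U-shaped encoder-decoder rather than using it as a standalone block.
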
Keywords

Acknowledgement

This research was supported by the Korea Health Technology R&D Project through the Korea Health Industry Development Institute, funded by the Ministry of Health and Welfare (HI22C1496), and by a research grant from Seoul Women's University (2024-0218), for which we are grateful.

References

  1. M.M. Picken, W. Lu, and N.G. Gopal, "Positive Surgical Margins in Renal Cell Carcinoma: Translating Tumor Biology Into Clinical Outcomes," American Journal of Clinical Pathology, 143(5), pp. 620-622, 2015.
  2. O. Ronneberger, P. Fischer, and T. Brox, "U-Net: Convolutional Networks for Biomedical Image Segmentation," Medical Image Computing and Computer Assisted Intervention (MICCAI) 2015, pp. 234-241, 2015. 
  3. P. Sun, Z. Mo, F. Hu, F. Liu, T. Mo, Y. Zhang, et al., "Kidney Tumor Segmentation Based on FR2PAttU-Net Model," Frontiers in Oncology, 12, pp. 1-13, 2022. 
  4. J. Guo, W. Zeng, S. Yu, and J. Xiao, "RAU-Net: U-Net Model Based on Residual and Attention for Kidney and Kidney Tumor Segmentation," IEEE International Conference on Consumer Electronics and Computer Engineering (ICCECE), pp. 353-356, 2021. 
  5. A. Myronenko and A. Hatamizadeh, "3D Kidneys and Kidney Tumor Semantic Segmentation using Boundary-Aware Networks," arXiv:1909.06684, 2019.
  6. X. Xie, L. Li, S. Lian, S. Chen, and Z. Luo, "SERU: A cascaded SE-ResNeXT U-Net for kidney and tumor segmentation," Concurrency and Computation: Practice and Experience, 32(14), pp. 1-11, 2020. 
  7. L. Kang, Z. Zhou, J. Huang, and W. Han, "Renal tumors segmentation in abdomen CT Images using 3D-CNN and ConvLSTM," Biomedical Signal Processing and Control, 72, pp. 1-16, 2022. 
  8. W. Zhao, D. Jiang, J.-P. Queralta, and T. Westerlund, "MSS U-Net: 3D segmentation of kidneys and tumors from CT images with a multi-scale supervised U-Net," Informatics in Medicine Unlocked, 19, pp. 1-11, 2020. 
  9. H. Cao, Y. Wang, J. Chen, D. Jiang, X. Zhang, Q. Tian, et al., "Swin-Unet: Unet-Like Pure Transformer for Medical Image Segmentation," European Conference on Computer Vision (ECCV) 2022, pp. 205-218, 2022.
  10. R. Azad, A. Kazerouni, B. Azad, E. Khodapanah Aghdam, Y. Velichko, U. Bagci, et al., "Laplacian-Former: Overcoming the Limitations of Vision Transformers in Local Texture Detection," Medical Image Computing and Computer Assisted Intervention (MICCAI) 2023, pp. 736-746, 2023. 
  11. Z. He, M. Unberath, J. Ke, and Y. Shen, "TransNuSeg: A Lightweight Multi-task Transformer for Nuclei Segmentation," Medical Image Computing and Computer Assisted Intervention (MICCAI) 2023, pp. 206-215, 2023. 
  12. Q. Guan, Y. Xie, B. Yang, J. Zhang, Z. Liao, Q. Wu, et al., "Unpaired Cross-Modal Interaction Learning for COVID-19 Segmentation on Limited CT Images," Medical Image Computing and Computer Assisted Intervention (MICCAI) 2023, pp. 603-613, 2023. 
  13. A.M. Shaker, M. Maaz, H. Rasheed, S. Khan, M.H. Yang, and F.S. Khan, "UNETR++: Delving into Efficient and Accurate 3D Medical Image Segmentation," IEEE Transactions on Medical Imaging, pp. 1-14, 2024. 
  14. UNETR++: Delving into Efficient and Accurate 3D Medical Image Segmentation. [Online] Available: https://github.com/Amshaker/unetr_plus_plus 
  15. F. Isensee and K.H. Maier-Hein, "An attempt at beating the 3D U-Net," arXiv:1908.02182, 2019.
  16. N. Heller, F. Isensee, K.H. Maier-Hein, X. Hou, C. Xie, F. Li, et al., "The state of the art in kidney and kidney tumor segmentation in contrast-enhanced CT imaging: Results of the KiTS19 challenge," Medical Image Analysis, 67, pp. 1-16, 2021.
  17. N. Heller, N. Sathianathen, A. Kalapara, E. Walczak, K. Moore, H. Kaluzniak, et al., "The KiTS19 Challenge Data: 300 Kidney Tumor Cases with Clinical Context, CT Semantic Segmentations, and Surgical Outcomes," arXiv:1904.00445, 2019. 
  18. KiTS19. [Online]. Available: https://github.com/neheller/kits19 
  19. E. Yang, C.K. Kim, Y. Guan, B.-B. Koo, and J.-H. Kim, "3D multi-scale residual fully convolutional neural network for segmentation of extremely large-sized kidney tumor," Computer Methods and Programs in Biomedicine, 215, pp. 1-12, 2022.