Comparison on the Deep Learning Performance of a Field of View Variable Color Images of Uterine Cervix

A Comparative Study of Deep Learning Performance According to Image-Region Processing Methods on Color Cervical Images

  • Seol, Yu Jin (Dept. of Biomedical Eng., School of Health Science, Gachon University) ;
  • Kim, Young Jae (Dept. of Biomedical Eng., School of Medicine, Gachon University) ;
  • Nam, Kye Hyun (Dept. of Gynecology & Obstetrics, Soonchunhyang University Bucheon Hospital) ;
  • Kim, Kwang Gi (Dept. of Biomedical Eng., Graduate School GAIHST, Gachon University)
  • Received : 2020.03.16
  • Accepted : 2020.07.08
  • Published : 2020.07.31


Cervical cancer is the second most common cancer among women worldwide. In Korea, it accounts for 13 percent of female cancers, with about 4,200 new cases annually [1]. The purpose of this study is to use a deep learning model to identify possible lesions in the cervix and to evaluate efficient image preprocessing for diagnosing cervices of diverse shapes. The study used 4,107 normal and 6,285 abnormal photographs of the uterine cervix. Two types of preprocessing were applied to make the images square: cropping based on the image height, and filling the space above and below with black pixels. All images were then resampled to 256×256. The average accuracy for the cropped images was 94.15%, while that for the black-filled images was 93.41%. The model trained on cropped data thus performed slightly better, but several images were still misclassified. Therefore, an additional experiment with a cropping-based preprocessing pipeline is needed to cover cervical images in more detail.
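The two square-conversion strategies described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' code: the paper does not specify the crop anchor or the resampling filter, so center-cropping and nearest-neighbour resampling are assumptions here.

```python
import numpy as np

def crop_to_square(img):
    """Crop the longer dimension so the image becomes square.

    For a landscape cervical photograph this trims the width to equal the
    height, matching the paper's height-based cropping. A center crop is
    assumed.
    """
    h, w, _ = img.shape
    side = min(h, w)
    top = (h - side) // 2
    left = (w - side) // 2
    return img[top:top + side, left:left + side]

def pad_to_square(img):
    """Pad the shorter dimension with black (zero) pixels to make a square.

    For a landscape image this fills the space above and below, as in the
    paper's second preprocessing method.
    """
    h, w, c = img.shape
    side = max(h, w)
    out = np.zeros((side, side, c), dtype=img.dtype)
    top = (side - h) // 2
    left = (side - w) // 2
    out[top:top + h, left:left + w] = img
    return out

def resize_nearest(img, size=256):
    """Resample to size x size via nearest-neighbour index selection
    (a simple stand-in for the paper's unspecified resampling method)."""
    h, w, _ = img.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    return img[rows][:, cols]

# Example with a dummy 480 x 640 RGB image standing in for a cervigram.
img = np.zeros((480, 640, 3), dtype=np.uint8)
cropped = resize_nearest(crop_to_square(img))  # shape (256, 256, 3)
padded = resize_nearest(pad_to_square(img))    # shape (256, 256, 3)
```

Either output can then be fed to a classifier such as the ResNet architecture cited in reference [6]; the comparison in the paper is between these two input pipelines, not between network architectures.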


  1. Cervix Cancer (2020), (accessed January 15, 2020).
  2. Features of cervix cancer (2017), (accessed January 15, 2020).
  3. E.H. Lee, T.H. Um, H.S. Chi, Y.J. Hong, and Y.J. Cha, “Prevalence and Distribution of Human Papillomavirus Infection in Korean Women as Determined by Restriction Fragment Mass Polymorphism Assay,” Journal of Korean Medical Science, Vol. 27, No. 9, pp. 1091-1097, 2012.
  4. Identifying cervical precancer with AI approach (2019), (accessed January 16, 2020).
  5. T.J. Song, S.J. Seong, S.K. Lee, B.R. Kim, W. Ju, K.H. Kim, et al., "Screening Capacity and Cost-effectiveness of the Human Papillomavirus Test Versus Cervicography as an Adjunctive Test to Pap Cytology to Detect High-grade Cervical Dysplasia," European Journal of Obstetrics & Gynecology and Reproductive Biology, Vol. 234, pp. 112-116, 2019.
  6. K. He, X. Zhang, S. Ren, and J. Sun, "Deep Residual Learning for Image Recognition," Proceeding of 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 770-778, 2016.
  7. A. Esteva, B. Kuprel, R.A. Novoa, J. Ko, S.M. Swetter, H.M. Blau, et al., "Dermatologist-level Classification of Skin Cancer with Deep Neural Networks," Nature, Vol. 542, pp. 115-118, 2017.
  8. S. Lee, H. Joe, and H. Joe, “Computer-aided Diagnosis System for Abnormalities Classification in Gastric Endoscopy Images Using Machine Learning,” Journal of Electrical Engineering and Technology, Vol. 69, No. 1, pp. 107-113, 2020.
  9. M. Sato, K. Horie, A. Hara, Y. Miyamoto, K. Kurihara, K. Tomio, et al., “Application of Deep Learning to the Classification of Images from Colposcopy,” Oncology Letters, Vol. 15, No. 3, pp. 3518-3523, 2018.
  10. Cervical cytology screening (1988), (accessed January 17, 2020).
  11. J. Lee, N. Kim, and S. Hong, "Performance Analysis of Similar Plant Leaves According to Transformation Input Data of Deep Learning Model," The Institute of Electronics and Information Engineers, pp. 759-762, 2019.
  12. M. Kyung and H. Lee, "A Deep Learning-based Document Title Detection for Automatic Document Type Classification," Journal of the Institute of Electronics and Information Engineers, Vol. 55, No. 9, pp. 53-61, 2018.
  13. S. Rhyou, H. Kim, and K. Cha, “Development of Access Management System Based on Face Recognition Using ResNet,” Journal of Korea Multimedia Society, Vol. 22, No. 8, pp. 823-831, 2019.
  14. Machine Learning Cheat Sheet (2019), (accessed January 19, 2020).