
Continual Learning using Data Similarity

  • Park, Seong-Hyeon (Dept. of Embedded Systems Engineering, Incheon National University)
  • Kang, Seok-Hoon (Dept. of Embedded Systems Engineering, Incheon National University)
  • Received : 2020.06.01
  • Accepted : 2020.06.17
  • Published : 2020.06.30

Abstract

In a continual learning environment, a neural network exhibits catastrophic forgetting, losing the information of previously learned data as training progresses; this occurs readily between data drawn from different domains. To control this phenomenon, we introduce a method for measuring the relationship between previously learned data and newly presented data through the distribution of the neural network's output, and a method for using this measurement to mitigate catastrophic forgetting. MNIST and EMNIST data were used for evaluation, and experiments showed an average improvement of about 22.37% in accuracy on the previous data.
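The abstract does not spell out the exact measurement or mitigation procedure, so the following is only a minimal sketch of the general idea in PyTorch: the similarity between old and new data is estimated from the network's average softmax output distributions, and that score scales a distillation-style penalty against the previous model. All function names (`output_distribution`, `similarity`, `continual_loss`) and the choice of a symmetric KL divergence are illustrative assumptions, not the authors' published method.

```python
# Sketch only: similarity from output distributions, used to weight a
# forgetting penalty. Names and design choices are assumptions, since the
# abstract does not specify the paper's exact formulation.
import torch
import torch.nn.functional as F

def output_distribution(model, loader, device="cpu"):
    """Average softmax output of `model` over all batches in `loader`."""
    model.eval()
    total, count = None, 0
    with torch.no_grad():
        for x, _ in loader:
            probs = F.softmax(model(x.to(device)), dim=1)
            total = probs.sum(0) if total is None else total + probs.sum(0)
            count += probs.shape[0]
    return total / count

def similarity(p, q, eps=1e-8):
    """Map the symmetric KL divergence between two average output
    distributions (one per task's data) to a similarity score in [0, 1]."""
    kl = (F.kl_div((q + eps).log(), p, reduction="sum")
          + F.kl_div((p + eps).log(), q, reduction="sum")) / 2
    return torch.exp(-kl).item()  # 1.0 = identical distributions

def continual_loss(model, old_model, x, y, sim, base_lambda=1.0):
    """Cross-entropy on the new task plus a distillation penalty toward the
    frozen previous model; the penalty grows as similarity decreases."""
    logits = model(x)
    ce = F.cross_entropy(logits, y)
    with torch.no_grad():  # old_model is frozen (set to eval() beforehand)
        old_probs = F.softmax(old_model(x), dim=1)
    distill = F.kl_div(F.log_softmax(logits, dim=1), old_probs,
                       reduction="batchmean")
    return ce + base_lambda * (1.0 - sim) * distill
```

Under this sketch, a new task whose output distribution diverges strongly from the old one (low similarity) receives a larger forgetting penalty, which matches the abstract's observation that forgetting is most severe between data from different domains.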


References

  1. I. J. Goodfellow, et al., "An empirical investigation of catastrophic forgetting in gradient-based neural networks," arXiv preprint arXiv:1312.6211, 2013.
  2. R. M. French, "Catastrophic forgetting in connectionist networks," Trends in Cognitive Sciences, Vol.3, No.4, pp.128-135, 1999. DOI: 10.1016/S1364-6613(99)01294-2
  3. J. Kirkpatrick, et al., "Overcoming catastrophic forgetting in neural networks," Proceedings of the National Academy of Sciences, Vol.114, No.13, pp.3521-3526, 2017. DOI: 10.1073/pnas.1611835114
  4. Y.-C. Hsu, et al., "Re-evaluating continual learning scenarios: A categorization and case for strong baselines," arXiv preprint arXiv:1810.12488, 2018.
  5. V. Lomonaco and D. Maltoni, "CORe50: A new dataset and benchmark for continuous object recognition," arXiv preprint arXiv:1705.03550, 2017.
  6. D. Maltoni and V. Lomonaco, "Continuous learning in single-incremental-task scenarios," Neural Networks, Vol.116, pp.56-73, 2019. DOI: 10.1016/j.neunet.2019.03.010
  7. Y. LeCun, et al., "Gradient-based learning applied to document recognition," Proceedings of the IEEE, Vol.86, No.11, pp.2278-2324, 1998. DOI: 10.1109/5.726791
  8. G. Cohen, et al., "EMNIST: Extending MNIST to handwritten letters," 2017 International Joint Conference on Neural Networks (IJCNN), IEEE, pp.2921-2926, 2017. DOI: 10.1109/IJCNN.2017.7966217
  9. G. I. Parisi, et al., "Continual lifelong learning with neural networks: A review," Neural Networks, Vol.113, pp.54-71, 2019. DOI: 10.1016/j.neunet.2019.01.012
  10. R. Kemker, et al., "Measuring catastrophic forgetting in neural networks," Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, 2018.