Acknowledgement
This work was supported by an Incheon National University (International Cooperative) Research Grant in 2021 (2021-0089).
References
- Z. Li and D. Hoiem, "Learning without forgetting," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.40, No.12, pp.2935-2947, 2017. DOI: 10.1109/TPAMI.2017.2773081
- I. J. Goodfellow, et al., "An empirical investigation of catastrophic forgetting in gradient-based neural networks," arXiv preprint arXiv:1312.6211, 2013.
- G. I. Parisi, et al., "Continual lifelong learning with neural networks: A review," Neural Networks, Vol.113, pp.54-71, 2019. DOI: 10.1016/j.neunet.2019.01.012
- R. M. French, "Catastrophic forgetting in connectionist networks," Trends in Cognitive Sciences, Vol.3, No.4, pp.128-135, 1999. DOI: 10.1016/S1364-6613(99)01294-2
- Y.-C. Hsu, et al., "Re-evaluating continual learning scenarios: A categorization and case for strong baselines," arXiv preprint arXiv:1810.12488, 2018.
- K. McRae and P. A. Hetherington, "Catastrophic interference is eliminated in pretrained networks," in Proceedings of the 15th Annual Conference of the Cognitive Science Society, pp.723-728, 1993.
- J. Kirkpatrick, et al., "Overcoming catastrophic forgetting in neural networks," Proceedings of the National Academy of Sciences, Vol.114, No.13, pp.3521-3526, 2017.
- G. Hinton, O. Vinyals, and J. Dean, "Distilling the knowledge in a neural network," in NIPS Workshop, 2014.
- F. Zenke, B. Poole, and S. Ganguli, "Continual learning through synaptic intelligence," in Proceedings of the 34th International Conference on Machine Learning, Vol.70, pp.3987-3995, 2017.
- S. H. Park and S. H. Kang, "Continual Learning using Data Similarity," Journal of IKEEE, Vol.24, No.2, pp.514-522, 2020. DOI: 10.7471/ikeee.2020.24.2.514