Acknowledgement
This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation) (IITP-No. 2024-2021-0-01817, No. RS-2020-II201373, No. 2022-0-00498).
References
- Hinton, Geoffrey, et al. "Distilling the knowledge in a neural network." NIPS Deep Learning Workshop, 2015.
- Cho, Jang Hyun, and Bharath Hariharan. "On the efficacy of knowledge distillation." ICCV, 2019.
- Sun, Siqi, et al. "Patient knowledge distillation for BERT model compression." EMNLP-IJCNLP, 2019.
- Zhao, Borui, et al. "Decoupled knowledge distillation." CVPR, 2022.
- Mirzadeh, Seyed Iman, et al. "Improved knowledge distillation via teacher assistant." AAAI, 2020.