Funding
This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2022R1F1A1074696).
References
- C. Y. Lin, "ROUGE: A Package for Automatic Evaluation of Summaries," in Proceedings of the Workshop on Text Summarization Branches Out, Barcelona, Spain, pp. 74-81, 2004.
- J. Wei and K. Zou, "EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks," in Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, Hong Kong, China, pp. 6382-6388, 2019. DOI: 10.18653/v1/D19-1670.
- Z. Liu, J. Li, and M. Zhu, "Improving Text Generation with Dynamic Masking and Recovering," in Proceedings of the 30th International Joint Conference on Artificial Intelligence, Online, pp. 3878-3884, 2021. DOI: 10.24963/ijcai.2021/534.
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin, "Attention is All You Need," in 31st Conference on Neural Information Processing Systems, Long Beach, CA, USA, 2017.
- I. Cachola, K. Lo, A. Cohan, and D. Weld, "TLDR: Extreme Summarization of Scientific Documents," in Findings of the Association for Computational Linguistics: EMNLP 2020, Online, pp. 4766-4777, 2020. DOI: 10.18653/v1/2020.findings-emnlp.428.
- S. Narayan, S. B. Cohen, and M. Lapata, "Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization," in Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, pp. 1797-1807, 2018. DOI: 10.18653/v1/D18-1206.
- B. Kim, H. Kim, and G. Kim, "Abstractive Summarization of Reddit Posts with Multi-level Memory Networks," in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, MN, USA, pp. 2519-2531, 2019. DOI: 10.18653/v1/N19-1260.
- M. Lewis, Y. Liu, N. Goyal, M. Ghazvininejad, A. Mohamed, O. Levy, V. Stoyanov, and L. Zettlemoyer, "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension," in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, pp. 7871-7880, 2020. DOI: 10.18653/v1/2020.acl-main.703.
- C. Raffel, N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li, and P. J. Liu, "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer," Journal of Machine Learning Research, vol. 21, pp. 1-67, Jun. 2020.