Acknowledgement
This work was supported by the Ministry of Trade, Industry and Energy (MOTIE) and the Korea Evaluation Institute of Industrial Technology (KEIT) research grant in 2021 ('10077538').
References
- Z. Yang, P. Qi, S. Zhang, Y. Bengio, W. Cohen, R. Salakhutdinov, and C. D. Manning, "HotpotQA: A dataset for diverse, explainable multi-hop question answering," In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, pp.2369-2380, 2018.
- L. Qiu, Y. Xiao, Y. Qu, H. Zhou, L. Li, W. Zhang, and Y. Yu, "Dynamically fused graph network for multi-hop reasoning," In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp.6140-6150, 2019.
- M. Zhang, F. Li, Y. Wang, Z. Zhang, Y. Zhou, and X. Li, "Coarse and fine granularity graph reasoning for interpretable multi-hop question answering," IEEE Access, Vol.8, pp.56755-56765, 2020. https://doi.org/10.1109/ACCESS.2020.2981134
- J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, Minnesota, pp.4171-4186, 2019.
- A. Asai, K. Hashimoto, H. Hajishirzi, R. Socher, and C. Xiong, "Learning to retrieve reasoning paths over Wikipedia graph for question answering," In Proceedings of the International Conference on Learning Representations, 2020.
- Y. Fang, S. Sun, Z. Gan, R. Pillai, S. Wang, and J. Liu, "Hierarchical graph network for multi-hop question answering," In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, pp.8823-8838, 2020.
- J. Welbl, P. Stenetorp, and S. Riedel, "Constructing datasets for multi-hop reading comprehension across documents," Transactions of the Association for Computational Linguistics, Vol.6, pp.287-302, 2018. https://doi.org/10.1162/tacl_a_00021
- Y. Liu, et al., "RoBERTa: A robustly optimized BERT pretraining approach," arXiv preprint arXiv:1907.11692, 2019.
- T. Mikolov, K. Chen, G. Corrado, and J. Dean, "Efficient estimation of word representations in vector space," In Proceedings of the International Conference on Learning Representations, 2013.
- J. Pennington, R. Socher, and C. Manning, "GloVe: Global vectors for word representation," In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, Doha, Qatar, pp.1532-1543, 2014.
- A. Vaswani, et al., "Attention is all you need," In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, California, pp.6000-6010, 2017.
- K. Nishida, K. Nishida, M. Nagata, A. Otsuka, I. Saito, H. Asano, and J. Tomita, "Answering while summarizing: Multi-task learning for multi-hop QA with evidence extraction," In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp.2335-2345, 2019.
- T. N. Kipf and M. Welling, "Semi-supervised classification with graph convolutional networks," In Proceedings of the 5th International Conference on Learning Representations, Toulon, France, 2017.
- P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio, "Graph attention networks," In Proceedings of the 6th International Conference on Learning Representations, Vancouver, Canada, 2018.
- M. Tu, G. Wang, J. Huang, Y. Tang, X. He, and B. Zhou, "Multi-hop reading comprehension across multiple documents by reasoning over heterogeneous graphs," In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp.2704-2713, 2019.
- M. Tu, K. Huang, G. Wang, J. Huang, X. He, and B. Zhou, "Select, answer and explain: Interpretable multi-hop reading comprehension over multiple documents," In Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence, New York, USA, pp.9073-9080, 2020.
- N. D. Cao, W. Aziz, and I. Titov, "Question answering by reasoning across documents with graph convolutional networks," In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, Minnesota, pp.2306-2317, 2019.
- Z. Lan, M. Chen, S. Goodman, K. Gimpel, P. Sharma, and R. Soricut, "ALBERT: A lite BERT for self-supervised learning of language representations," In Proceedings of the International Conference on Learning Representations, 2020.
- M. Seo, A. Kembhavi, A. Farhadi, and H. Hajishirzi, "Bidirectional attention flow for machine comprehension," In Proceedings of the International Conference on Learning Representations, 2017.