Acknowledgement
This work was supported by the Dong-eui University research fund of 2021 (202102000001).
References
- H. Jwa, D. Oh, and H. Lim, "Research analysis in automatic fake news detection," Journal of the Korea Convergence Society, Vol.10, No.7, pp.15-21, 2019. https://doi.org/10.15207/JKCS.2019.10.7.015
- Y. Yoon, T. Eom, and J. Ahn, "Fake news detection technology trends and implications," Weekly Technology Trends, Institute of Information & Communications Technology Planning & Evaluation, Oct. 2017.
- K. Popat, S. Mukherjee, A. Yates, and G. Weikum, "DeClarE: Debunking fake news and false claims using evidence-aware deep learning," EMNLP, pp.22-32, 2018.
- H. Lee, J. Kim, and J. Paik, "Survey of fake news detection techniques and solutions," Proceedings of the Korean Society of Computer Information Conference, pp.37-39, 2020.
- Y. Hyun and N. Kim, "Text mining-based fake news detection using news and social media data," The Journal of Society for e-Business Studies, Vol.23, No.4, pp.19-39, 2018. https://doi.org/10.7838/JSEBS.2018.23.4.019
- J. Shim, J. Lee, I. Jeong, and H. Ahn, "A study on Korean fake news detection model using word embedding," Proceedings of the Korean Society of Computer Information Conference, Vol.28, No.2, pp.199-202, 2020.
- SNU FactCheck [Internet], https://factcheck.snu.ac.kr/.
- R. Kumar, A. Goswami, P. Narang, and S. Sinha, "FNDNet - A deep convolutional neural network for fake news detection," Cognitive Systems Research, Vol.61, pp.32-44, 2020. https://doi.org/10.1016/j.cogsys.2019.12.005
- H. Jwa and D. Oh, "exBAKE: Automatic fake news detection model based on bidirectional encoder representations from transformers (BERT)," Applied Sciences, Vol.9, Article 4062, 2019.
- S. Yoon, K. Park, and J. Shin, "Detecting incongruity between news headline and body text via a deep hierarchical encoder," arXiv:1811.07066, 2018.
- J. Devlin, M. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," arXiv:1810.04805, 2018.
- BERT [Internet], https://huggingface.co/transformers/model_doc/bert.html.
- KoBERT [Internet], https://github.com/SKTBrain/KoBERT.
- Text Summarization with Attention mechanism [Internet], https://wikidocs.net/72820.
- Source Link [Internet], https://drive.google.com/drive/folders/1kQmSoYoq8AoXsbmstTpL2zLjPUEHQB1o?usp=sharing.
- Y. Liu, "Fine-tune BERT for Extractive Summarization," arXiv:1903.10318v2, 2019.
- ETRI Language Analysis API [Internet], https://aiopen.etri.re.kr/guide_wiseNLU.php.
- BeautifulSoup [Internet], https://www.crummy.com/software/BeautifulSoup/bs4/doc/.
- Requests [Internet], https://pypi.org/project/requests/.
- Korean document extractive summarization AI competition [Internet], https://dacon.io/competitions/official/235671/overview/description.
- C. Lin, "ROUGE: A Package for Automatic Evaluation of Summaries," Text Summarization Branches Out, pp.74-81, 2004.
- The JoongAng News Article [Internet], https://www.joongang.co.kr/article/22260785#home.
- SNU FactCheck "Verification article" [Internet], https://factcheck.snu.ac.kr/v2/facts/435.
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, et al., "Attention Is All You Need," arXiv:1706.03762, 2017.
- FactCheckNet [Internet], https://factchecker.or.kr/.