Funding
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (NRF-2018S1A3A2075114).
References
- Ahn, Kyung-jin (2014). Sleep disorder threatens national health. Medical Observer. Available: http://www.monews.co.kr/news/articleView.html?idxno=76359
- Asan Medical Center (2014). Disease encyclopedia: Insomnia. Asan Medical Center. Available: https://www.amc.seoul.kr/asan/healthinfo/disease/diseaseDetail.do?contentId=31586
- Ko, Young-Soo, Lee, Ju-Hee, & Song, Min (2021). Examining suicide tendency social media texts by deep learning and topic modeling techniques. Journal of the Korean Biblia Society for Library and Information Science, 32(3), 247-264. https://doi.org/10.14699/kbiblia.2021.32.3.247
- Lee, Soobin, Kim, Seongdeok, Lee, Juhee, Ko, Youngsoo, & Song, Min (2021). Building and analyzing panic disorder social media corpus for automatic deep learning classification model. Journal of the Korean Society for Information Management, 38(2), 153-172. https://doi.org/10.3743/KOSIM.2021.38.2.153
- National Health Insurance Service (2020). 2020 National Health Insurance Statistical Yearbook.
- Yoon, In-Young (2013). Introduction to sleep disorders. Hanyang Medical Reviews, 33, 197-202. https://doi.org/10.7599/hmr.2013.33.4.197
- Abuzayed, A. & Al-Khalifa, H. (2021). BERT for Arabic topic modeling: an experimental study on BERTopic technique. Procedia Computer Science, 189, 191-194. https://doi.org/10.1016/j.procs.2021.05.096
- Angelov, D. (2020). Top2Vec: Distributed representations of topics. arXiv preprint, arXiv:2008.09470. https://doi.org/10.48550/arXiv.2008.09470
- Buysse, D. J. (2013). Insomnia. The Journal of the American Medical Association, 309(7), 706-716. https://doi.org/10.1001/jama.2013.193
- Cheng, Q., Li, T. M., Kwok, C. L., Zhu, T., & Yip, P. S. (2017). Assessing suicide risk and emotional distress in Chinese social media: a text mining and machine learning study. Journal of Medical Internet Research, 19(7), e243. https://doi.org/10.2196/jmir.7276
- Clark, K., Luong, M. T., Le, Q. V., & Manning, C. D. (2020). ELECTRA: Pre-training text encoders as discriminators rather than generators. arXiv preprint, arXiv:2003.10555. https://doi.org/10.48550/arXiv.2003.10555
- Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint, arXiv:1810.04805. https://doi.org/10.48550/arXiv.1810.04805
- Grootendorst, M. (2020). BERTopic: Leveraging BERT and c-TF-IDF to create easily interpretable topics. Zenodo. https://doi.org/10.5281/zenodo.4381785
- Grootendorst, M. (2022). BERTopic: Neural topic modeling with a class-based TF-IDF procedure. arXiv preprint, arXiv:2203.05794. https://doi.org/10.48550/arXiv.2203.05794
- Guo, C., Lin, S., Huang, Z., & Yao, Y. (2021). Mental health question and answering system based on bert model and knowledge graph technology. Proceedings of the 2nd International Symposium on Artificial Intelligence for Medicine Sciences, 472-476. https://doi.org/10.1145/3500931.3501011
- He, Q., Veldkamp, B. P., Glas, C. A., & de Vries, T. (2017). Automated assessment of patients' self-narratives for posttraumatic stress disorder screening using natural language processing and text mining. Assessment, 24(2), 157-172. https://doi.org/10.1177/1073191115602551
- Hendry, D., Darari, F., Nurfadillah, R., Khanna, G., Sun, M., Condylis, P. C., & Taufik, N. (2021). Topic modeling for customer service chats. In 2021 International Conference on Advanced Computer Science and Information Systems, 1-6. https://doi.org/10.1109/ICACSIS53237.2021.9631322
- Hochreiter, S. & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735-1780. https://doi.org/10.1162/neco.1997.9.8.1735
- Jamison-Powell, S., Linehan, C., Daley, L., Garbett, A., & Lawson, S. (2012). "I can't get no sleep": Discussing insomnia on Twitter. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1501-1510. https://doi.org/10.1145/2207676.2208612
- Kingma, D. P. & Ba, J. (2015). Adam: A method for stochastic optimization. arXiv preprint, arXiv:1412.6980. https://doi.org/10.48550/arXiv.1412.6980
- Koh, J. X. & Liew, T. M. (2020). How loneliness is talked about in social media during COVID-19 pandemic: Text mining of 4,492 Twitter feeds. Journal of Psychiatric Research, 145, 317-324. https://doi.org/10.1016/j.jpsychires.2020.11.015
- Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., & Soricut, R. (2019). ALBERT: A lite BERT for self-supervised learning of language representations. arXiv preprint, arXiv:1909.11942. https://doi.org/10.48550/arXiv.1909.11942
- Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., & Stoyanov, V. (2019). RoBERTa: A robustly optimized BERT pretraining approach. arXiv preprint, arXiv:1907.11692. https://doi.org/10.48550/arXiv.1907.11692
- Martinez-Castano, R., Htait, A., Azzopardi, L., & Moshfeghi, Y. (2021). BERT-Based transformers for early detection of mental health illnesses. In International Conference of the Cross-Language Evaluation Forum for European Languages, Springer, 189-200. https://doi.org/10.1007/978-3-030-85251-1_15
- Nikhil Chandran, A., Sreekumar, K., & Subha, D. P. (2021). EEG-based automated detection of schizophrenia using long short-term memory (LSTM) network. In Advances in Machine Learning and Computational Intelligence, 26, Springer, Singapore, 229-236. https://doi.org/10.1007/978-981-15-5243-4_19
- Sateia, M. J. (2014). International classification of sleep disorders-third edition. Chest, 146(5), 1387-1394. https://doi.org/10.1378/chest.14-0970
- Sia, S., Dalmia, A., & Mielke, S. J. (2020). Tired of topic models? Clusters of pretrained word embeddings make for fast and good topics too! arXiv preprint, arXiv:2004.14914. https://doi.org/10.48550/arXiv.2004.14914
- van der Nagel, E. & Frith, J. (2015). Anonymity, pseudonymity, and the agency of online identity: Examining the social practices of r/Gonewild. First Monday, 20(3). https://doi.org/10.5210/fm.v20i3.5615
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30. https://doi.org/10.48550/arXiv.1706.03762
- Yang, Z., Dai, Z., Yang, Y., Carbonell, J., Salakhutdinov, R. R., & Le, Q. V. (2019). XLNet: Generalized autoregressive pretraining for language understanding. Advances in Neural Information Processing Systems, 32. https://doi.org/10.48550/arXiv.1906.08237