References
- G. Y. Lee & S. B. Lee. (2018). Universal Prediction System Realization Using RNN. The Journal of Korean Institute of Information Technology, 16(10), 11-20. DOI : 10.14801/jkiit.2018.16.10.11
- C. Y. Lee, T. G. Lee & K. Shin. (2019). Performance Comparison of Natural Language Processing Model Based on Deep Neural Networks. The Journal of Korean Institute of Communications and Information Sciences, 44(7), 1344-1350. DOI : 10.7840/kics.2019.44.7.1344
- C. Y. Lee & J. Kim. (2018). The prediction and analysis of the power energy time series by using the Elman recurrent neural network. Journal of the Society of Korea Industrial and Systems Engineering, 41(1), 84-93. DOI : 10.11627/jkise.2018.41.1.084
- Y. Kim et al. (2018). Performance evaluation of machine learning and deep learning algorithms in crop classification: Impact of hyperparameters and training sample size. Korean Journal of Remote Sensing, 34(5), 811-827. DOI : 10.7780/kjrs.2018.34.5.9
- I. Aliyu, R. M. Mahmood & C. G. Lim. (2019). LSTM Hyperparameter Optimization for an EEG-Based Efficient Emotion Classification in BCI. The Journal of the Korea Institute of Electronic Communication Sciences, 14(6), 1171-1180.
- S. Karita et al. (2019, December). A comparative study on Transformer vs RNN in speech applications. In 2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU) (pp. 449-456). IEEE. DOI : 10.1109/ASRU46091.2019.9003750
- K. Rao, H. Sak & R. Prabhavalkar. (2017, December). Exploring architectures, data and units for streaming end-to-end speech recognition with RNN-transducer. In 2017 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU) (pp. 193-199). IEEE. DOI : 10.1109/ASRU.2017.8268935
- Colah's Blog. (2015). Understanding LSTM Networks. (Online). https://colah.github.io/posts/2015-08-Understanding-LSTMs/
- X. Zhang, F. Chen & R. Huang. (2018). A combination of RNN and CNN for attention-based relation classification. Procedia Computer Science, 131, 911-917. DOI : 10.1016/j.procs.2018.04.221
- C. Baziotis, N. Pelekis & C. Doulkeridis. (2017, August). DataStories at SemEval-2017 Task 6: Siamese LSTM with attention for humorous text comparison. In Proceedings of the 11th International Workshop on Semantic Evaluation (SemEval-2017) (pp. 390-395).
- D. A. Reddy, M. A. Kumar & K. P. Soman. (2019). LSTM Based Paraphrase Identification Using Combined Word Embedding Features. In Soft Computing and Signal Processing (pp. 385-394). Springer, Singapore. DOI : 10.1007/978-981-13-3393-4_40
- Z. Li, R. Kulhanek, S. Wang, Y. Zhao & S. Wu. (2018, April). Slim embedding layers for recurrent neural language models. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 32, No. 1).
- T. Mikolov, I. Sutskever, K. Chen, G. Corrado & J. Dean. (2013). Distributed representations of words and phrases and their compositionality. arXiv preprint arXiv:1310.4546.
- N. Reimers & I. Gurevych. (2019). Alternative weighting schemes for ELMo embeddings. arXiv preprint arXiv:1904.02954.
- I. Santos, N. Nedjah & L. de Macedo Mourelle. (2017, November). Sentiment analysis using convolutional neural network with fastText embeddings. In 2017 IEEE Latin American Conference on Computational Intelligence (LA-CCI) (pp. 1-5). IEEE. DOI : 10.1109/LA-CCI.2017.8285683
- A. Tifrea, G. Bécigneul & O. E. Ganea. (2018). Poincaré GloVe: Hyperbolic Word Embeddings. arXiv preprint arXiv:1810.06546.
- S. Choi, J. Seol & S. G. Lee. (2016). On Word Embedding Models and Parameters Optimized for Korean. Korean Language Information Science Society, 252-256.