Fig. 1. Overview of the Bi-LSTM model
Fig. 2. Character-level convolutional neural network (Char-CNN)
Fig. 3. Bi-LSTM neural network model
Fig. 4. Named entity recognition model architecture
Table 1. Comparison between the TTA standard tag set and the proposed tag set
Table 2. Experiment results of the proposed model (accuracy/F1 score)
Table 3. Summary of experiment results (user dictionary vs. embedding model)
Table 4. Summary of experiment results (embedding model vs. filter shape)