Acknowledgement
This research was supported by the Ministry of Science and ICT (MSIT) and the Institute of Information & Communications Technology Planning & Evaluation (IITP) under the Innovative Human Resource Development for Local Intellectualization (Grand ICT Research Center) program (IITP-2023-2016-0-00318), and by the Technology Commercialization Capability Enhancement program of the Innopolis Foundation, funded by the Ministry of Science and ICT (No. 2023-BS-RD-0061 / Development of advancement and commercialization technologies for an intelligent security surveillance system).