Feature Selecting and Classifying Integrated Neural Network Algorithm for Multi-variate Classification

A Neural Network Algorithm Integrating Feature Extraction and Classification Techniques to Improve the Classification Performance of Multivariate Data

  • Yoon, Hyun-Soo (School of Industrial Management Engineering, Korea University) ;
  • Baek, Jun-Geol (School of Industrial Management Engineering, Korea University)
  • Received : 2010.12.15
  • Accepted : 2011.01.16
  • Published : 2011.06.01

Abstract

Research on multivariate classification has typically proceeded in two separate stages: feature selection and classification. Feature selection techniques are applied to identify the important features, while classifiers are applied to improve classification performance. In general, the two stages have been studied independently, and the interaction between them has rarely been considered, which can degrade performance. In this paper, classification performance is improved by integrating the two procedures. The proposed model builds on KBANN (Knowledge-Based Artificial Neural Network), which uses prior knowledge as training information for a neural network (NN). Separate NNs first learn the characteristics of the feature selection and classification techniques from training sets; the integrated NN is then retrained so that it adjusts the features appropriately and enhances classification performance. This technique is called ALBNN (Algorithm Learning-Based Neural Network). Experimental results show improved performance on a variety of classification problems.
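To make the integration idea concrete, the minimal sketch below shows a small network whose first layer is a per-feature gate, initialized from prior feature scores in the spirit of KBANN and trained jointly with the classification layers by backpropagation. This is not the authors' ALBNN implementation; the gate layer, the prior scores, and all hyperparameters are illustrative assumptions.

# Illustrative sketch only: a tiny feed-forward network in which the first
# (gating) layer learns per-feature weights -- a stand-in for the feature
# selection stage -- while the remaining layers classify. Both stages are
# trained jointly by backpropagation. The architecture, the initial
# "prior knowledge" scores, and the hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class IntegratedNet:
    def __init__(self, n_features, n_hidden, prior_feature_scores=None):
        # Feature-selection stage: one multiplicative gate per input feature.
        # Prior knowledge (e.g. scores from a filter method) can be injected
        # as the gates' initial values, KBANN-style.
        if prior_feature_scores is None:
            prior_feature_scores = np.ones(n_features)
        self.gate = np.asarray(prior_feature_scores, dtype=float)
        # Classification stage: one hidden layer plus a sigmoid output.
        self.W1 = rng.normal(0, 0.5, (n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, n_hidden)
        self.b2 = 0.0

    def forward(self, X):
        self.Xg = X * self.gate            # gated (feature-selected) inputs
        self.H = sigmoid(self.Xg @ self.W1 + self.b1)
        return sigmoid(self.H @ self.W2 + self.b2)

    def fit(self, X, y, lr=0.5, epochs=2000):
        for _ in range(epochs):
            p = self.forward(X)
            # Gradient of the cross-entropy loss w.r.t. the output logit.
            d_out = (p - y) / len(y)
            dW2 = self.H.T @ d_out
            db2 = d_out.sum()
            d_hidden = np.outer(d_out, self.W2) * self.H * (1 - self.H)
            dW1 = self.Xg.T @ d_hidden
            db1 = d_hidden.sum(axis=0)
            # The feature gates are updated by the same backward pass as the
            # classifier, so the selection stage adapts to the classifier.
            d_gate = ((d_hidden @ self.W1.T) * X).sum(axis=0)
            self.W2 -= lr * dW2; self.b2 -= lr * db2
            self.W1 -= lr * dW1; self.b1 -= lr * db1
            self.gate -= lr * d_gate

# Toy usage: only the first two of five features carry the class signal.
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
net = IntegratedNet(n_features=5, n_hidden=8,
                    prior_feature_scores=[1.0, 1.0, 0.2, 0.2, 0.2])
net.fit(X, y)
print("learned feature gates:", np.round(net.gate, 2))
print("training accuracy:", ((net.forward(X) > 0.5) == y).mean())

Because the gates are trained by the same backward pass as the classifier, the feature selection stage adapts to the classifier instead of being fixed in advance, which is the interaction between the two procedures that the paper aims to exploit.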

References

  1. Burges, C. J. C. (1998), A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 2, 121-167. https://doi.org/10.1023/A:1009715923555
  2. Cortes, C. and Vapnik, V. (1995), Support-vector networks, Machine Learning, 20, 273-297.
  3. Gori, M. and Tesi, A. (1992), On the Problem of Local Minima in Backpropagation, IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(1), 76-86.
  4. Hagan, M. T. and Menhaj, M. B. (1994), Training feedforward networks with the Marquardt algorithm, IEEE Transactions on Neural Networks, 5(6), 989-993. https://doi.org/10.1109/72.329697
  5. Kabir, M. M., Islam, M. M., and Murase, K. (2010), A new wrapper feature selection approach using neural network, Neurocomputing (article in press).
  6. Marquardt, D. W. (1963), An algorithm for least-squares estimation of nonlinear parameters, Journal of the Society for Industrial and Applied Mathematics, 11(2), 431-441. https://doi.org/10.1137/0111030
  7. Park, M. S. and Choi, J. Y. (2009), Theoretical analysis on feature extraction capability of class-augmented PCA, Pattern Recognition, 42, 2353-2362. https://doi.org/10.1016/j.patcog.2009.04.011
  8. Pudil, P., Novovicova, J. and Kittler, J. (1994), Floating search methods in feature selection, Pattern Recognition Letters, 15(11), 1119-1125. https://doi.org/10.1016/0167-8655(94)90127-9
  9. Saeys, Y., Inza, I., and Larranaga, P. (2007), A review of feature selection techniques in bioinformatics, Bioinformatics, 23(19), 2507-2517. https://doi.org/10.1093/bioinformatics/btm344
  10. Sarkar, I. N., Planet, P. J., Bael, T. E., Stanley, S. E., Siddall, M., and DeSalle, R. (2002), Characteristic attributes in cancer microarrays, Journal of Biomedical Informatics, 35(2), 111-122. https://doi.org/10.1016/S1532-0464(02)00504-X
  11. Towell, G. G. and Shavlik, J. W. (1994), Knowledge-based artificial neural networks, Artificial Intelligence, 70(1), 119-165. https://doi.org/10.1016/0004-3702(94)90105-8
  12. Turk, M. and Pentland, A. (1991), Eigenfaces for recognition, Journal of Cognitive Neuroscience, 3, 71-86. https://doi.org/10.1162/jocn.1991.3.1.71
  13. Chauvin, Y. and Rumelhart, D. E. (1995), Backpropagation: Theory, Architectures, and Applications, Lawrence Erlbaum Associates, New Jersey, USA.