Binary classification by the combination of Adaboost and feature extraction methods

  • Ham, Seaung-Lok (Department of Electrical Engineering, Ajou University) ;
  • Kwak, No-Jun (Department of Electrical Engineering, Ajou University)
  • Received : 2011.12.15
  • Accepted : 2012.07.04
  • Published : 2012.07.25


In the pattern recognition and machine learning communities, classification is a classical problem and one of the most widely researched areas. Adaptive boosting, also known as Adaboost, has been successfully applied to binary classification problems. It is a boosting algorithm that constructs a strong classifier through a weighted combination of weak classifiers. On the other hand, PCA and LDA are the most popular linear feature extraction methods, used mainly for dimensionality reduction. In this paper, a combination of Adaboost and feature extraction methods is proposed for the efficient classification of two-class data. Conventionally, in classification problems, the roles of feature extraction and classification have been distinct: a feature extraction method and a classifier are applied sequentially to classify input variables into several categories. In this paper, these two steps are combined into one, resulting in good classification performance. More specifically, each projection vector is treated as a weak classifier in the Adaboost algorithm, and these weak classifiers together constitute a strong classifier for binary classification problems. The proposed algorithm was applied to the UCI and FRGC datasets and showed better recognition rates than the sequential application of feature extraction and classification methods.
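The core idea described above can be illustrated with a minimal sketch: Adaboost in which each weak classifier is a decision stump along one PCA projection direction. This is an illustrative reconstruction under plain assumptions, not the authors' exact algorithm; all function names are hypothetical.

```python
import numpy as np

def train_boosted_projections(X, y, n_rounds=10):
    """AdaBoost with weak classifiers given by thresholding PCA projections.
    A sketch of the idea in the abstract, not the authors' exact method.
    y must take values in {-1, +1}."""
    n, _ = X.shape
    # PCA projection vectors: right singular vectors of the centered data
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    proj = X @ Vt.T                  # each column: samples projected on one direction
    w = np.full(n, 1.0 / n)          # AdaBoost sample weights
    learners = []
    for _ in range(n_rounds):
        best = None
        # each projection direction yields a family of candidate weak classifiers
        for j in range(proj.shape[1]):
            z = proj[:, j]
            for thr in np.unique(z):
                for pol in (1, -1):
                    pred = np.where(pol * (z - thr) >= 0, 1, -1)
                    err = w[pred != y].sum()   # weighted training error
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol, pred)
        err, j, thr, pol, pred = best
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak classifier
        w *= np.exp(-alpha * y * pred)         # re-weight misclassified samples up
        w /= w.sum()
        learners.append((thr, pol, alpha, Vt[j]))
    return learners

def predict_boosted(learners, X):
    """Sign of the weighted vote of the selected projection stumps."""
    score = np.zeros(X.shape[0])
    for thr, pol, alpha, v in learners:
        z = X @ v
        score += alpha * np.where(pol * (z - thr) >= 0, 1, -1)
    return np.sign(score)
```

In this sketch, feature extraction and classification are fused as the abstract describes: the projection directions are not used to reduce dimensionality before a separate classifier, but are themselves selected and weighted by the boosting loop.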


Supported by: National Research Foundation of Korea (한국연구재단)

