
AN OPTIMAL BOOSTING ALGORITHM BASED ON NONLINEAR CONJUGATE GRADIENT METHOD

  • CHOI, JOOYEON (DEPARTMENT OF MATHEMATICS, EWHA WOMANS UNIVERSITY)
  • JEONG, BORA (DEPARTMENT OF MATHEMATICS, EWHA WOMANS UNIVERSITY)
  • PARK, YESOM (DEPARTMENT OF MATHEMATICS, EWHA WOMANS UNIVERSITY)
  • SEO, JIWON (DEPARTMENT OF MATHEMATICS, EWHA WOMANS UNIVERSITY)
  • MIN, CHOHONG (DEPARTMENT OF MATHEMATICS, EWHA WOMANS UNIVERSITY)
  • Received : 2018.01.14
  • Accepted : 2018.03.07
  • Published : 2018.03.25

Abstract

Boosting, one of the most successful algorithms for supervised learning, searches for the most accurate weighted sum of weak classifiers. The search corresponds to a convex program with non-negativity and affine constraints. In this article, we propose a novel Conjugate Gradient algorithm with the modified Polak-Ribière-Polyak conjugate direction. We prove the convergence of the algorithm and report its successful application to boosting.
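The constrained search described in the abstract can be sketched concretely. The following Python code is a minimal illustration under stated assumptions, not the authors' implementation: it assumes an exponential-loss objective over the outputs of a fixed pool of weak classifiers, enforces the non-negativity and affine (sum-to-one) constraint by Euclidean projection onto the probability simplex, and uses the three-term modified PRP direction known from the conjugate gradient literature with a fixed step size in place of a proper line search. The names project_simplex, exp_loss_grad, and mprp_boost_weights are hypothetical.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection onto {w : w >= 0, sum(w) = 1} via the sorting method.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    tau = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - tau, 0.0)

def exp_loss_grad(H, y):
    # Gradient of the exponential loss (1/n) * sum_i exp(-y_i * (H w)_i),
    # where H[i, j] is the output of weak classifier j on sample i and y_i in {-1, +1}.
    def grad(w):
        e = np.exp(-y * (H @ w))
        return -(H * (y * e)[:, None]).mean(axis=0)
    return grad

def mprp_boost_weights(grad, w0, n_iter=500, step=1e-1, tol=1e-8):
    # Projected nonlinear CG with a three-term modified PRP direction.
    w = project_simplex(np.asarray(w0, dtype=float))
    g = grad(w)
    d = -g
    for _ in range(n_iter):
        w_new = project_simplex(w + step * d)   # fixed step for simplicity; a line search is typical
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        g_new = grad(w_new)
        y_vec = g_new - g
        denom = max(g @ g, 1e-12)
        beta = (g_new @ y_vec) / denom
        theta = (g_new @ d) / denom
        d = -g_new + beta * d - theta * y_vec   # three-term update: d . g_new = -||g_new||^2 (descent)
        w, g = w_new, g_new
    return w

# Toy usage: 200 samples, 10 weak classifiers with +/-1 outputs.
rng = np.random.default_rng(0)
H = np.sign(rng.standard_normal((200, 10)))
y = np.sign(rng.standard_normal(200))
w = mprp_boost_weights(exp_loss_grad(H, y), w0=np.full(10, 0.1))
```

In a full treatment, the fixed step would be replaced by a line search satisfying a Wolfe-type condition, while the projection keeps every iterate feasible with respect to the constraints.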
