Logical Evolution for Concept Learning

  • Published : 2003.05.01

Abstract

In this paper we present the Logical Evolution method, a new learning algorithm for concepts expressed as binary logic functions. Through Logical Evolution we try to address some problems of inductive learning algorithms. First, to be less affected by limited prior knowledge, the method generates features from information gained during the learning process and learns the concepts with these features. Second, learning is performed on individual examples rather than on the whole example set, so even when a new problem or new input-output variables are given, previously generated features can be reused; in some cases these old features make the learning process more efficient. The Logical Evolution method consists of five operations, which are selected and performed by a logical evaluation procedure during feature generation and learning. To evaluate the performance of the proposed algorithm, we conduct experiments on the MONK data set and a newly defined problem.
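
The abstract does not spell out the five operations or the logical evaluation procedure, so the sketch below is only a rough illustration of the general idea it describes: learning a concept expressed as a binary logic function one example at a time, while constructing conjunctive features along the way. The class name, the generalize/drop heuristics, and the toy data are illustrative assumptions, not the authors' actual operations.

```python
# A minimal sketch, NOT the paper's algorithm: it only conveys the flavor of
# incremental, per-example learning of a Boolean (binary logic) concept with
# on-the-fly construction of conjunctive features. All heuristics below are
# assumptions made for illustration.

from typing import Dict, List

Example = Dict[str, int]   # attribute name -> binary value (0/1)
Feature = Dict[str, int]   # a conjunction of literals: attribute -> required value


def covers(feature: Feature, example: Example) -> bool:
    """A conjunctive feature covers an example if all of its literals hold."""
    return all(example.get(attr) == val for attr, val in feature.items())


class IncrementalBooleanLearner:
    """Keeps a disjunction of conjunctive features and updates it per example."""

    def __init__(self) -> None:
        self.features: List[Feature] = []
        self.negatives: List[Example] = []   # remembered to guard generalization

    def predict(self, example: Example) -> int:
        return int(any(covers(f, example) for f in self.features))

    def _safe(self, feature: Feature) -> bool:
        """A feature is safe if it covers none of the negatives seen so far."""
        return not any(covers(feature, neg) for neg in self.negatives)

    def update(self, example: Example, label: int) -> None:
        if label == 0:
            self.negatives.append(example)
            # Drop any feature that wrongly covers this negative example.
            self.features = [f for f in self.features if not covers(f, example)]
            return
        if self.predict(example) == 1:
            return  # positive already covered, nothing to learn
        # Try to generalize an existing feature by dropping disagreeing literals.
        for i, feature in enumerate(self.features):
            generalized = {a: v for a, v in feature.items() if example.get(a) == v}
            if generalized and self._safe(generalized):
                self.features[i] = generalized
                return
        # Otherwise construct a new feature: the full conjunction of this example.
        self.features.append(dict(example))


if __name__ == "__main__":
    # Toy target concept: x1 AND x2, over three binary attributes.
    data = [
        ({"x1": 1, "x2": 1, "x3": 0}, 1),
        ({"x1": 1, "x2": 0, "x3": 1}, 0),
        ({"x1": 0, "x2": 1, "x3": 1}, 0),
        ({"x1": 1, "x2": 1, "x3": 1}, 1),
    ]
    learner = IncrementalBooleanLearner()
    for ex, y in data:
        learner.update(ex, y)
    print(learner.features)                               # e.g. [{'x1': 1, 'x2': 1}]
    print(learner.predict({"x1": 1, "x2": 1, "x3": 0}))   # -> 1
```

Keeping the negatives seen so far lets the learner check that a generalized feature stays consistent with past data, which loosely mirrors the abstract's point that previously constructed features can be reused and refined as new examples arrive.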
