• Title/Summary/Keyword: Incremental Training


Efficient Incremental Learning using the Preordered Training Data (미리 순서가 매겨진 학습 데이타를 이용한 효과적인 증가학습)

  • Lee, Sun-Young;Bang, Sung-Yang
    • Journal of KIISE:Software and Applications / v.27 no.2 / pp.97-107 / 2000
  • Incremental learning generally reduces training time and improves the generalization of a neural network by selecting training data incrementally during training. However, existing incremental learning methods re-evaluate the importance of the training data every time they select additional data. In this paper, an incremental learning algorithm is proposed for pattern classification problems that evaluates the importance of each piece of data only once, before training starts. The importance of a data point depends on how close it is to the decision boundary. The paper presents an algorithm that orders the data by their distance to the decision boundary using clustering. Experimental results on two artificial and real-world classification problems show that the proposed incremental learning method significantly reduces the size of the training set without degrading generalization performance.

  • PDF
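The boundary-distance ordering described above can be sketched roughly as follows. This is a minimal illustration, not the paper's algorithm: class-wise k-means centroids stand in for the clustering step, and each point's proxy margin is its distance to the nearest opposite-class centroid minus its distance to the nearest same-class centroid, so the smallest values mark points closest to the boundary.

```python
import numpy as np

def boundary_order(X, y, n_centroids=2, seed=0):
    """Order training points by a proxy for distance to the decision
    boundary: distance to the nearest opposite-class centroid minus
    distance to the nearest same-class centroid (small = near boundary)."""
    rng = np.random.default_rng(seed)
    centroids = {}
    for c in np.unique(y):
        Xc = X[y == c]
        k = min(n_centroids, len(Xc))
        centers = Xc[rng.choice(len(Xc), k, replace=False)]
        for _ in range(10):  # crude k-means stands in for the clustering step
            d = np.linalg.norm(Xc[:, None] - centers[None], axis=2)
            labels = d.argmin(axis=1)
            centers = np.array([Xc[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        centroids[c] = centers
    margin = np.empty(len(X))
    for i, (x, c) in enumerate(zip(X, y)):
        d_same = min(np.linalg.norm(x - m) for m in centroids[c])
        d_other = min(np.linalg.norm(x - m)
                      for cc, ms in centroids.items() if cc != c for m in ms)
        margin[i] = d_other - d_same
    return np.argsort(margin)  # near-boundary points come first

# toy example: the two points nearest the class gap should rank first
X = np.array([[0.0, 0.0], [0.2, 0.0], [0.9, 0.0],
              [2.0, 0.0], [2.2, 0.0], [1.1, 0.0]])
y = np.array([0, 0, 0, 1, 1, 1])
order = boundary_order(X, y, n_centroids=1)
```

Training would then consume `X[order]` in increasing-margin order, adding data until generalization stops improving.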

Incremental Multi-classification by Least Squares Support Vector Machine

  • Oh, Kwang-Sik;Shim, Joo-Yong;Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / v.14 no.4 / pp.965-974 / 2003
  • In this paper we propose an incremental classification method for multi-class data sets based on LS-SVM. By encoding the output variable of the training data appropriately, we obtain new output vectors for the training sets; online LS-SVM is then applied to each newly encoded output vector. The proposed method reduces computation cost and allows the training to be performed incrementally. Using an incremental formulation of the matrix inverse, the current information and the new input data are used to build the updated inverse matrix for estimating the optimal bias and Lagrange multipliers, so the computational difficulty of large-scale matrix inversion is avoided. The performance of the proposed method is demonstrated in numerical studies and compared with an artificial neural network.

  • PDF
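The incremental inverse-matrix idea in the abstract above can be illustrated with the Sherman-Morrison identity: one new sample contributes a rank-one term x xᵀ to the regularized normal matrix, and the updated inverse follows from the current one in O(d²) rather than O(d³). This sketch uses plain ridge-style normal equations rather than the full LS-SVM system with bias and Lagrange multipliers.

```python
import numpy as np

def sherman_morrison_update(A_inv, x):
    """Given A^{-1}, return (A + x x^T)^{-1} in O(d^2) time,
    avoiding a fresh O(d^3) inversion for each new sample."""
    Ax = A_inv @ x
    return A_inv - np.outer(Ax, Ax) / (1.0 + x @ Ax)

rng = np.random.default_rng(0)
d = 4
Phi = rng.normal(size=(20, d))            # design matrix seen so far
lam = 0.1
A = Phi.T @ Phi + lam * np.eye(d)         # regularized normal matrix
A_inv = np.linalg.inv(A)

x_new = rng.normal(size=d)                # one incoming sample
A_inv_inc = sherman_morrison_update(A_inv, x_new)
A_inv_batch = np.linalg.inv(A + np.outer(x_new, x_new))
err = np.abs(A_inv_inc - A_inv_batch).max()   # agrees with batch inversion
```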

The relationship between internal marketing and incremental innovation in small business (중소기업에서의 내부마케팅과 구성원들의 점진적 혁신의 관계에 대한 연구)

  • Ahn, Kwan-Young
    • Journal of the Korea Safety Management & Science / v.13 no.4 / pp.171-177 / 2011
  • This paper examines the relationship between internal marketing and incremental innovation, and the moderating effect of firm size. Hierarchical multiple regression analysis, based on responses from 322 employees in small businesses, showed that most internal marketing factors have a positive effect on incremental innovation. All internal marketing factors (CEO support, compensation system, education & training, internal communication, authority delegation) were positively related to process innovation and service innovation, and all factors except CEO support were positively related to operation innovation. As for moderating effects, internal communication has a stronger positive effect on incremental innovation in larger firms, whereas delegation has a stronger positive effect in smaller firms.

The relationship between managerial system and incremental innovation, and the mediating effect of knowledge transfer in small business (중소기업에서 관리시스템과 점진적 혁신의 관계 및 지식이전의 매개효과)

  • Chang, Kyung-Saeng;Ahn, Kwan Young
    • Journal of the Korea Safety Management & Science / v.19 no.2 / pp.135-146 / 2017
  • The purpose of this study is to examine the relationship between managerial system and incremental innovation, and the mediating effect of knowledge transfer, in small businesses. To this end, questionnaire data were gathered and analysed from 255 enterprise managers in western Kangwon-do province. The empirical findings are as follows. First, CEO support and education/training were positively related to knowledge transfer. Second, managerial system and knowledge transfer were positively related to incremental innovation. Third, knowledge transfer mediated the relationships between CEO support and incremental innovation and between education/training and incremental innovation.

Stepwise Constructive Method for Neural Networks Using a Flexible Incremental Algorithm (Flexible Incremental 알고리즘을 이용한 신경망의 단계적 구축 방법)

  • Park, Jin-Il;Jung, Ji-Suk;Cho, Young-Im;Chun, Myung-Geun
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.4 / pp.574-579 / 2009
  • Constructing an optimized neural network for complex nonlinear regression problems is difficult: the network structure must be selected, and overtraining caused by noise must be avoided. In this paper, we propose a stepwise constructive method for neural networks using a flexible incremental algorithm. As hidden nodes are added, the flexible incremental algorithm adaptively controls their number using a validation dataset to minimize the prediction residual error. The ELM (Extreme Learning Machine) is used for fast training. The proposed network is a universal approximator that requires no user intervention during training, and it also trains faster and uses fewer hidden nodes. Experimental results on various benchmark datasets show that the proposed method outperforms previous methods on real-world regression problems.
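A flexible-growth scheme of this kind can be sketched as follows. This is an assumed simplification, not the published algorithm: tanh hidden nodes with random input weights, batch least-squares output weights, and patience-based early stopping on a held-out validation set.

```python
import numpy as np

def train_elm(Xtr, ytr, Xval, yval, step=5, max_hidden=100, patience=2, seed=0):
    """Grow an ELM: add `step` random tanh hidden nodes at a time, refit
    the output weights by least squares, and stop once the validation
    error fails to improve `patience` times in a row."""
    rng = np.random.default_rng(seed)
    d = Xtr.shape[1]
    W = np.empty((0, d))                   # hidden input weights
    b = np.empty(0)                        # hidden biases
    hidden = lambda X: np.tanh(X @ W.T + b)
    best_err, best_model, bad = np.inf, None, 0
    while W.shape[0] < max_hidden:
        W = np.vstack([W, rng.normal(size=(step, d))])
        b = np.concatenate([b, rng.normal(size=step)])
        beta, *_ = np.linalg.lstsq(hidden(Xtr), ytr, rcond=None)
        val_err = np.mean((hidden(Xval) @ beta - yval) ** 2)
        if val_err < best_err:             # keep the best network seen
            best_err, best_model, bad = val_err, (W.copy(), b.copy(), beta), 0
        else:
            bad += 1
            if bad >= patience:            # validation error stopped improving
                break
    return best_err, best_model

# toy regression: y = sin(3x) with mild noise
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=200)
val_err, (W, b, beta) = train_elm(X[:150], y[:150], X[150:], y[150:])
```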

Incremental Support Vector Learning Method for Function Approximation (함수 근사를 위한 점증적 서포트 벡터 학습 방법)

  • 임채환;박주영
    • Proceedings of the IEEK Conference / 2002.06c / pp.135-138 / 2002
  • This paper addresses an incremental learning method for regression. The SVM (support vector machine) is a recently proposed learning method, and training one generally requires solving a QP (quadratic programming) problem. For very large or incrementally arriving datasets, solving the full QP problem may be impractical, so this paper presents an incremental support vector learning method for function approximation problems.

  • PDF
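One common way to make support-vector-style learning incremental, in the spirit of the abstract above, is to process data in chunks and retain only poorly approximated points. The sketch below is hypothetical: kernel ridge regression stands in for the SVM QP solver, and a residual threshold `eps` plays the role of the ε-insensitive zone.

```python
import numpy as np

def krr_fit(X, y, gamma=10.0, lam=1e-3):
    """Kernel ridge regression: a stand-in for the SVM QP solver."""
    K = np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return X, alpha, gamma

def krr_predict(model, Xq):
    Xs, alpha, gamma = model
    K = np.exp(-gamma * ((Xq[:, None] - Xs[None]) ** 2).sum(-1))
    return K @ alpha

def incremental_fit(X, y, chunk=40, eps=0.05):
    """Process data chunk by chunk; keep only points the current model
    approximates worse than eps (support-vector-like retention) and
    refit on the retained working set."""
    S_X, S_y = X[:chunk].copy(), y[:chunk].copy()
    model = krr_fit(S_X, S_y)
    for s in range(chunk, len(X), chunk):
        Xc, yc = X[s:s + chunk], y[s:s + chunk]
        novel = np.abs(krr_predict(model, Xc) - yc) > eps
        if novel.any():
            S_X = np.vstack([S_X, Xc[novel]])
            S_y = np.concatenate([S_y, yc[novel]])
            model = krr_fit(S_X, S_y)
    return model, len(S_X)

# 1-D toy function approximation: y = sin(3x)
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(160, 1))
y = np.sin(3 * X[:, 0])
model, n_support = incremental_fit(X, y)
Xg = np.linspace(-0.95, 0.95, 50).reshape(-1, 1)
err = np.mean(np.abs(krr_predict(model, Xg) - np.sin(3 * Xg[:, 0])))
```

The working set stays much smaller than the full dataset, which is the point of avoiding one large QP.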

Face Detection Based on Incremental Learning from Very Large Size Training Data (대용량 훈련 데이타의 점진적 학습에 기반한 얼굴 검출 방법)

  • 박지영;이준호
    • Journal of KIISE:Software and Applications / v.31 no.7 / pp.949-958 / 2004
  • Face detection using a boosting-based algorithm requires a very large set of face and non-face data. In addition, the recurring need to add training data to improve detection rates demands an efficient incremental learning algorithm. In the design of incrementally learned classifiers, the final classifier should represent the characteristics of the entire training dataset. Conventional methods have a critical problem in combining intermediate classifiers: weight updates depend solely on the performance on each individual dataset. In this paper, for application to face detection, we present a new method to combine an intermediate classifier with previously acquired ones in an optimal manner. Our algorithm creates a validation set by incrementally adding sampled instances from each dataset so that it represents the entire training data. The weight of each classifier is determined by its performance on this validation set. This approach guarantees that the resulting final classifier is learned from the entire training dataset. Experimental results show that a classifier trained by the proposed algorithm performs better than those trained by AdaBoost operating in batch mode, as well as by Learn++.
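The validation-weighted combination described above can be sketched as follows. Assumptions are flagged in the comments: a trivial nearest-centroid learner stands in for the boosted face detectors, and classifier weights are plain validation accuracies rather than the paper's update rule.

```python
import numpy as np

class CentroidClassifier:
    """Tiny stand-in for an intermediate detector: nearest class centroid."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None] - self.mu[None], axis=2)
        return self.classes[d.argmin(axis=1)]

def incremental_ensemble(batches, val_frac=0.2, seed=0):
    """For each incoming batch: hold out a sample into a shared validation
    set, train an intermediate classifier, then weight every classifier by
    its accuracy on the validation set built from *all* batches so far."""
    rng = np.random.default_rng(seed)
    Xv, yv, members = [], [], []
    for X, y in batches:
        idx = rng.permutation(len(X))
        n_val = max(1, int(val_frac * len(X)))
        Xv.append(X[idx[:n_val]]); yv.append(y[idx[:n_val]])
        members.append(CentroidClassifier().fit(X[idx[n_val:]], y[idx[n_val:]]))
    Xval, yval = np.vstack(Xv), np.concatenate(yv)
    weights = np.array([np.mean(m.predict(Xval) == yval) for m in members])
    return members, weights / weights.sum()

def ensemble_predict(members, weights, X):
    votes = {}
    for m, w in zip(members, weights):        # weighted vote over classes
        pred = m.predict(X)
        for c in m.classes:
            votes[c] = votes.get(c, np.zeros(len(X))) + w * (pred == c)
    classes = sorted(votes)
    V = np.stack([votes[c] for c in classes], axis=1)
    return np.array(classes)[V.argmax(axis=1)]

# three batches of a two-blob toy problem stand in for face/non-face data
rng = np.random.default_rng(3)
def make_batch(n=60):
    X0 = rng.normal([-1, -1], 0.4, size=(n // 2, 2))
    X1 = rng.normal([1, 1], 0.4, size=(n // 2, 2))
    return np.vstack([X0, X1]), np.array([0] * (n // 2) + [1] * (n // 2))
batches = [make_batch() for _ in range(3)]
members, weights = incremental_ensemble(batches)
Xt, yt = make_batch(200)
acc = np.mean(ensemble_predict(members, weights, Xt) == yt)
```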

An Incremental Rule Extraction Algorithm Based on Recursive Partition Averaging (재귀적 분할 평균에 기반한 점진적 규칙 추출 알고리즘)

  • Han, Jin-Chul;Kim, Sang-Kwi;Yoon, Chung-Hwa
    • Journal of KIISE:Software and Applications / v.34 no.1 / pp.11-17 / 2007
  • One of the popular methods for pattern classification is the MBR (Memory-Based Reasoning) algorithm. Since it simply computes distances between a test pattern and the training patterns or hyperplanes stored in memory and assigns the class of the nearest training pattern, it cannot explain how a classification result is obtained. To overcome this problem, we propose an incremental learning algorithm based on RPA (Recursive Partition Averaging) to extract IF-THEN rules that describe regularities inherent in the training patterns. However, rules generated by RPA show an overfitting phenomenon, because they depend too strongly on the details of the given training patterns, and RPA also produces more rules than necessary due to over-partitioning of the pattern space. Consequently, we present IREA (Incremental Rule Extraction Algorithm), which overcomes the overfitting problem by removing useless conditions from rules while also reducing the number of rules. We verify the performance of the proposed algorithm using benchmark data sets from the UCI Machine Learning Repository.
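A much simplified version of partition-based rule extraction with condition pruning might look like the following. This is not the published RPA/IREA: regions are split at the midpoint of the widest feature until class-pure (the averaging step is reduced to a majority label), and pruning drops any interval condition whose removal still leaves the rule's covered points pure.

```python
import numpy as np

def extract_rules(X, y, lo, hi, depth=8):
    """Recursively split [lo, hi] at the midpoint of the widest feature
    until a region is class-pure; one rule per region."""
    classes, counts = np.unique(y, return_counts=True)
    if len(classes) == 1 or depth == 0:
        return [(lo.copy(), hi.copy(), classes[counts.argmax()])]
    f = int(np.argmax(hi - lo))
    mid = (lo[f] + hi[f]) / 2.0
    left = X[:, f] <= mid
    rules = []
    if left.any():
        hi_l = hi.copy(); hi_l[f] = mid
        rules += extract_rules(X[left], y[left], lo.copy(), hi_l, depth - 1)
    if (~left).any():
        lo_r = lo.copy(); lo_r[f] = mid
        rules += extract_rules(X[~left], y[~left], lo_r, hi.copy(), depth - 1)
    return rules

def prune(rule, X, y):
    """Drop any interval condition whose removal leaves all points the
    rule covers in the rule's own class."""
    lo, hi, label = rule
    active = [True] * len(lo)
    def covered(flags):
        m = np.ones(len(X), dtype=bool)
        for f, on in enumerate(flags):
            if on:
                m &= (X[:, f] >= lo[f]) & (X[:, f] <= hi[f])
        return m
    for f in range(len(lo)):
        trial = active.copy(); trial[f] = False
        if np.all(y[covered(trial)] == label):   # condition is redundant
            active = trial
    return lo, hi, label, active

# feature 0 decides the class; feature 1 is irrelevant and gets pruned
X = np.array([[0.2, 0.1], [0.3, 0.8], [0.7, 0.2], [0.8, 0.9]])
y = np.array([0, 0, 1, 1])
rules = extract_rules(X, y, lo=np.zeros(2), hi=np.ones(2))
pruned = [prune(r, X, y) for r in rules]
```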

Efficient Construction and Training Multilayer Perceptrons by Incremental Pattern Selection (점진적 패턴 선택에 의한 다충 퍼셉트론의 효율적 구성 및 학습)

  • Jang, Byeong-Tak
    • The Transactions of the Korea Information Processing Society / v.3 no.3 / pp.429-438 / 1996
  • An incremental learning algorithm is presented that constructs a multilayer perceptron whose size is optimal for solving a given problem. Unlike conventional algorithms, in which a fixed-size training set is processed repeatedly, the method uses an increasing number of critical examples to find a necessary and sufficient number of hidden units for learning the entire data. Experimental results in handwritten digit recognition show that network size optimization combined with incremental pattern selection generalizes significantly better and converges faster than conventional methods.

  • PDF
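The critical-example selection loop can be sketched with a plain perceptron standing in for the multilayer perceptron (an assumed simplification): start from a small random subset, train, add a few currently misclassified patterns, and repeat until the model classifies the full set.

```python
import numpy as np

def perceptron_train(X, y, epochs=50):
    """Plain perceptron (stand-in for the MLP); labels in {-1, +1}."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:
                w += yi * xi
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.where(Xb @ w > 0, 1, -1)

def incremental_selection(X, y, init=10, add=10, seed=0):
    """Train on a growing set of critical examples: repeatedly add
    patterns the current model misclassifies instead of always using
    the full training set."""
    rng = np.random.default_rng(seed)
    selected = list(rng.choice(len(X), init, replace=False))
    while True:
        w = perceptron_train(X[selected], y[selected])
        chosen = set(selected)
        wrong = [i for i in np.flatnonzero(predict(w, X) != y)
                 if i not in chosen]
        if not wrong:                           # whole set classified
            return w, selected
        selected += list(rng.permutation(wrong)[:add])

# linearly separable blobs; only a fraction of the data is ever trained on
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1, 0.3, size=(100, 2)),
               rng.normal(+1, 0.3, size=(100, 2))])
y = np.array([-1] * 100 + [1] * 100)
w, selected = incremental_selection(X, y)
```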

Unsupervised Incremental Learning of Associative Cubes with Orthogonal Kernels

  • Kang, Hoon;Ha, Joonsoo;Shin, Jangbeom;Lee, Hong Gi;Wang, Yang
    • Journal of the Korean Institute of Intelligent Systems / v.25 no.1 / pp.97-104 / 2015
  • An 'associative cube', a class of auto-associative memories, is revisited here, in which training data and hidden orthogonal basis functions, such as wavelet packets or Fourier kernels, are combined in the weight cube. The weight cube has hidden units along its depth, giving it a three-dimensional cubic structure. We develop an unsupervised incremental learning mechanism based upon the adaptive least squares method. Training data are mapped onto orthogonal basis vectors in a least-squares sense by updating the weights to minimize an energy function, so a prescribed orthogonal kernel is incrementally assigned to each incoming datum. Next, we show how a decoding procedure finds the closest match with a competitive network in the hidden layer. When noisy test data are applied to an associative cube, the nearest of the original training data is restored in an optimal sense. The simulation results confirm the robustness of associative cubes even when test data are heavily distorted by various types of noise.
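The encode/decode cycle described above can be sketched as follows, with assumptions flagged: a random orthonormal basis stands in for the wavelet/Fourier kernels, and the least-squares weights are computed in batch form for brevity where the paper updates them incrementally.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d = 5, 32
patterns = rng.normal(size=(n, d))          # training data, one per row

# hidden orthonormal kernels (a random orthonormal basis stands in
# for the wavelet-packet / Fourier kernels of the paper)
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
kernels = Q[:, :n].T                        # (n, d), mutually orthonormal rows

# least-squares weights so that patterns @ W ~= kernels; the paper
# updates these incrementally, batch lstsq is shown here for brevity
W, *_ = np.linalg.lstsq(patterns, kernels, rcond=None)

def recall(x):
    """Competitive decoding: project the probe, let the most active
    kernel win, and restore the training pattern assigned to it."""
    code = x @ W
    winner = int(np.argmax(np.abs(kernels @ code)))
    return winner, patterns[winner]

# a noise-corrupted probe of pattern 2 is restored to the original
noisy = patterns[2] + 0.3 * rng.normal(size=d)
winner, restored = recall(noisy)
```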