Title/Summary/Keyword: error back-propagation learning algorithm

Search results: 150

A Modified Error Back Propagation Algorithm Adding Neurons to Hidden Layer (은닉층 뉴우런 추가에 의한 역전파 학습 알고리즘)

  • 백준호;김유신;손경식
    • Journal of the Korean Institute of Telematics and Electronics B / v.29B no.4 / pp.58-65 / 1992
  • In this paper, a new back-propagation algorithm that adds neurons to the hidden layer is proposed. The proposed algorithm is applied to handwritten-digit recognition in combination with back-propagation that omits redundant learning. Its learning speed and recognition rate are compared with those of the conventional back-propagation algorithm and of back-propagation with redundant learning omitted. The proposed algorithm learns 4 times as fast as conventional back-propagation and 2 times as fast as back-propagation with redundant learning omitted. The recognition rate is 96.2% for conventional back-propagation, 96.5% for back-propagation with redundant learning omitted, and 97.4% for the proposed algorithm.
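
The abstract does not spell out the growth rule, so the following is a minimal, hypothetical sketch (Python/numpy, toy XOR data standing in for digit images) of the general idea: run standard error back-propagation and add a hidden neuron whenever the error stops improving. All thresholds and names are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)  # toy inputs
T = np.array([[0], [1], [1], [0]], float)              # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_hidden = 2
W1 = rng.normal(0, 1, (2, n_hidden))
W2 = rng.normal(0, 1, (n_hidden, 1))
lr, prev_loss = 0.5, np.inf

for epoch in range(20000):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    loss = np.mean((Y - T) ** 2)
    # Every 1000 epochs: if the error has plateaued, add one hidden neuron.
    if epoch % 1000 == 999:
        if prev_loss - loss < 1e-4 and n_hidden < 8:
            W1 = np.hstack([W1, rng.normal(0, 1, (2, 1))])
            W2 = np.vstack([W2, rng.normal(0, 1, (1, 1))])
            n_hidden += 1
            prev_loss = loss
            continue  # redo the forward pass with the new topology
        prev_loss = loss
    # Standard error back-propagation for the current topology.
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ dY)
    W1 -= lr * (X.T @ dH)

print(f"hidden neurons: {n_hidden}, final MSE: {loss:.4f}")
```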

A study on the realization of color printed material check using Error Back-Propagation rule (오류 역전파법으로 구현한 컬러 인쇄물 검사에 관한 연구)

  • 한희석;이규영
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 1998.10a / pp.560-567 / 1998
  • This paper deals with color printed-material images captured by a camera; median filtering is applied to the input images to reduce noise and distortion and to bring them to a uniform condition. It also proposes a method that compares the color tone of normal and abnormal printed material by training an error back-propagation network for block classification, using the R, G, B values extracted at five locations in each identical 3×3 block of the printed image. As a representative training algorithm for multi-layer perceptrons, error back-propagation is used to solve complex problems. However, error back-propagation is based on gradient descent, so training may converge to a local minimum rather than the global minimum, and a network structure appropriate for the given problem must be chosen. In this paper, good results are obtained by improving the initial conditions and adjusting the number of hidden nodes, addressing the problems of real-time processing and training.
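
As a rough, hypothetical illustration of the two preprocessing stages the abstract names (median filtering against noise, then block-wise R, G, B features for a back-propagation classifier), here is a Python/numpy sketch; the paper's actual five-point sampling and network layout are not reproduced.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter applied per channel (edges padded by reflection)."""
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="reflect")
    h, w, _ = img.shape
    windows = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(windows), axis=0)

def block_features(img, block=3):
    """Mean R, G, B value per block, flattened into one feature vector."""
    h, w, _ = img.shape
    h, w = h - h % block, w - w % block
    b = img[:h, :w].reshape(h // block, block, w // block, block, 3)
    return b.mean(axis=(1, 3)).reshape(-1)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (9, 9, 3)).astype(float)  # stand-in for a scanned print
features = block_features(median_filter3(img))
print(features.shape)  # (27,) = 3x3 blocks x 3 channels, ready for a classifier
```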

Estimating Regression Function with ε-Insensitive Supervised Learning Algorithm

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.15 no.2 / pp.477-483 / 2004
  • One of the major paradigms for supervised learning in the neural network community is back-propagation learning. Standard implementations of back-propagation learning are optimal under the assumption of identical and independent Gaussian noise. In this paper, for regression function estimation, we introduce an ε-insensitive back-propagation learning algorithm, which corresponds to minimizing the least absolute error. We compare this algorithm with the support vector machine (SVM), another ε-insensitive supervised learning algorithm that has been very successful in pattern recognition and function estimation problems. For the comparison, we consider a more realistic model that allows the noise variance itself to depend on the input variables.
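
For illustration, a small sketch of an ε-insensitive (least-absolute-error) loss of the kind the abstract refers to, together with its subgradient; the function names and the value of epsilon are made up for the example.

```python
import numpy as np

def eps_insensitive_loss(y, t, eps=0.1):
    """Zero cost inside the eps tube, linear (absolute-error) cost outside."""
    return np.maximum(np.abs(y - t) - eps, 0.0)

def eps_insensitive_grad(y, t, eps=0.1):
    """Subgradient w.r.t. y: 0 inside the tube, +/-1 outside."""
    r = y - t
    return np.where(np.abs(r) <= eps, 0.0, np.sign(r))

y = np.array([0.05, 0.30, -0.50])
t = np.zeros(3)
print(eps_insensitive_loss(y, t))  # [0.  0.2 0.4]
print(eps_insensitive_grad(y, t))  # [ 0.  1. -1.]
```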

Classification of ECG Arrhythmia Signals Using Back-Propagation Network (역전달 신경회로망을 이용한 심전도 파형의 부정맥 분류)

  • 권오철;최진영
    • Journal of Biomedical Engineering Research / v.10 no.3 / pp.343-350 / 1989
  • A new algorithm for classifying ECG arrhythmia signals using a back-propagation network is proposed. The baseline of the ECG signal is detected with a high-pass filter and a probability density function, and the input data are then normalized for learning and classification. In addition, the ECG data are scanned so that arrhythmia signals whose R-wave is hard to locate can still be classified. A two-layer perceptron with one hidden layer, trained with the error back-propagation learning rule, is used as the artificial neural network. The proposed algorithm shows outstanding performance under amplitude variation, baseline wander, and noise contamination.
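
The paper's baseline detector also involves a probability density function; the hedged Python/numpy sketch below shows only the simpler ingredients the abstract names: removing baseline wander with a high-pass step and normalizing the amplitude before the network sees the data. Window size and test signal are invented.

```python
import numpy as np

def remove_baseline(sig, win=101):
    """High-pass step: subtract a moving-average estimate of the baseline."""
    pad = win // 2
    padded = np.pad(sig, pad, mode="edge")
    baseline = np.convolve(padded, np.ones(win) / win, mode="valid")
    return sig - baseline

def normalize(sig):
    """Zero-mean, unit-variance amplitude normalization."""
    return (sig - sig.mean()) / (sig.std() + 1e-8)

t = np.linspace(0, 4, 1000)
ecg_like = np.sin(2 * np.pi * 5 * t) + 0.5 * t  # toy waveform plus drifting baseline
x = normalize(remove_baseline(ecg_like))
print(round(x.mean(), 6), round(x.std(), 6))    # ~0 and ~1
```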

Improved Error Backpropagation Algorithm using Modified Activation Function Derivative (수정된 Activation Function Derivative를 이용한 오류 역전파 알고리즘의 개선)

  • 권희용;황희영
    • The Transactions of the Korean Institute of Electrical Engineers / v.41 no.3 / pp.274-280 / 1992
  • In this paper, an improved error back-propagation algorithm is introduced that avoids network paralysis, one of the problems of the error back-propagation learning rule. For this purpose, we analyzed the cause of network paralysis and modified the activation function derivative of the standard error back-propagation algorithm, which is regarded as the source of the phenomenon. The characteristics of the modified activation function derivative are analyzed. Various experiments show that the modified error back-propagation algorithm performs better than the standard one.
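
The abstract does not give the exact modification. One well-known fix of this kind, shown below purely as a hedged illustration, is to add a small constant to the sigmoid derivative so the back-propagated error signal never vanishes when units saturate; the constant c here is an assumption, not the paper's value.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(y):
    """Standard derivative in terms of the output y: ~0 for saturated units."""
    return y * (1.0 - y)

def sigmoid_deriv_mod(y, c=0.1):
    """Modified derivative: bounded away from zero, so learning never stalls."""
    return y * (1.0 - y) + c

y = sigmoid(np.array([-8.0, 0.0, 8.0]))
print(sigmoid_deriv(y))      # near zero at the extremes -> network paralysis
print(sigmoid_deriv_mod(y))  # at least c everywhere -> error signal survives
```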

A Modified Error Function to Improve the Error Back-Propagation Algorithm for Multi-Layer Perceptrons

  • Oh, Sang-Hoon;Lee, Young-Jik
    • ETRI Journal / v.17 no.1 / pp.11-22 / 1995
  • This paper proposes a modified error function to improve the slow learning speed of the error back-propagation (EBP) algorithm for multi-layer perceptrons (MLPs). It can also suppress the over-specialization to training patterns that occurs in algorithms based on a cross-entropy cost function, which markedly reduces learning time. Like the cross-entropy function, our new function accelerates the learning speed of the EBP algorithm by letting an output node of the MLP generate a strong error signal when it is far from the desired value. Moreover, it prevents over-specialization to training patterns by letting an output node whose value is close to the desired value generate a weak error signal. In a simulation study classifying handwritten digits from the CEDAR [1] database, the proposed method attained 100% correct classification of the training patterns after only 50 sweeps of learning, while the original EBP attained only 98.8% after 500 sweeps. Our method also shows a mean-squared error of 0.627 on the test patterns, which is superior to the 0.667 of the cross-entropy method. These results demonstrate that the new method excels in learning speed as well as in generalization.
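
The exact error function is given in the paper; the sketch below only illustrates the underlying observation. With a squared-error cost, the output-node error signal (y − t)·y·(1 − y) nearly vanishes for a saturated-but-wrong output, while a cross-entropy-style cost keeps it at (y − t); the proposed function is designed to act like the latter far from the target while damping the signal near it.

```python
import numpy as np

t = 1.0                           # desired output
y = np.array([0.01, 0.5, 0.99])   # three possible actual outputs

delta_mse = (y - t) * y * (1 - y)  # almost zero at y=0.01 despite the large error
delta_ce = (y - t)                 # strong signal when far from the target
print(delta_mse)
print(delta_ce)
```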

Fast Learning Algorithms for Neural Network Using Tabu Search Method with Random Moves (Random Tabu 탐색법을 이용한 신경회로망의 고속학습알고리즘에 관한 연구)

  • 양보석;신광재;최원호
    • Journal of the Korean Institute of Intelligent Systems / v.5 no.3 / pp.83-91 / 1995
  • A neural network with one or more layers of hidden units can be trained using the well-known error back-propagation algorithm, in which the synaptic weights of the network are updated during training by propagating back the error between the expected output and the output produced by the network. However, error back-propagation is characterized by slow convergence, and the training can, in some situations, become trapped in local minima. This paper presents the theoretical formulation of a new fast learning method based on tabu search with random moves. In contrast to the conventional back-propagation algorithm, which modifies the connection weights of the network solely by trial and error, the present method calculates optimum weights for the neural network. Its effectiveness and versatility are verified on the XOR problem, where it achieves better accuracy than the conventional method with fixed parameter values.
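
As a hedged sketch of the general idea (not the paper's formulation), the Python snippet below searches the weight space of a tiny XOR network with random moves, keeps a short tabu list of recently visited points to avoid revisiting them, and tracks the best error found; step sizes, list length, and the greedy acceptance rule are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([0, 1, 1, 0], float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def xor_error(w):
    """Mean-squared XOR error of a 2-2-1 network packed into a 9-vector."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    H = sigmoid(X @ W1 + b1)
    return np.mean((sigmoid(H @ W2 + b2) - T) ** 2)

w = rng.normal(0, 1, 9)
best_e = xor_error(w)
tabu = []  # recently visited weight vectors
for _ in range(5000):
    cand = w + rng.normal(0, 0.5, 9)  # random move
    if any(np.linalg.norm(cand - v) < 0.1 for v in tabu):
        continue  # move lands too close to a tabu point; skip it
    tabu.append(cand.copy())
    if len(tabu) > 20:
        tabu.pop(0)  # forget the oldest tabu entry
    if xor_error(cand) < xor_error(w):
        w = cand
        best_e = min(best_e, xor_error(w))
print(f"best XOR mean-squared error found: {best_e:.4f}")
```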

On the set up to the Number of Hidden Node of Adaptive Back Propagation Neural Network (적응 역전파 신경회로망의 은닉 층 노드 수 설정에 관한 연구)

  • Hong, Bong-Wha
    • The Journal of Information Technology / v.5 no.2 / pp.55-67 / 2002
  • This paper presents an adaptive back-propagation algorithm that updates the learning parameter adaptively according to the generated error and varies the number of hidden-layer nodes. By changing the number of hidden-layer nodes, the algorithm is expected to escape local minima and create a favorable environment for convergence. In simulations, the algorithm was tested on two learning patterns: exclusive-OR learning and 7×5-dot alphabetic font learning. In both examples, the probability of becoming trapped in a local minimum was reduced. Furthermore, in alphabetic font learning, the neural network improved learning efficiency by about 41.56%~58.28% over conventional back-propagation and the HNAD (Hidden Node Adding and Deleting) algorithm.
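
The node-growing half of this scheme resembles the sketch given after the first entry above; the hedged Python fragment below illustrates only the other half, a learning rate driven by the generated error (raised while the error falls, cut when it rises). The factors 1.05 and 0.5 are illustrative assumptions, not the paper's rule.

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, W2 = rng.normal(0, 1, (2, 4)), rng.normal(0, 1, (4, 1))
lr, prev = 0.5, np.inf
for epoch in range(10000):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    err = np.mean((Y - T) ** 2)
    lr = lr * 1.05 if err < prev else lr * 0.5  # adapt the rate from the error
    lr = min(max(lr, 1e-3), 2.0)                # keep it in a sane range
    prev = err
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ dY)
    W1 -= lr * (X.T @ dH)
print(f"final MSE: {err:.4f}, final learning rate: {lr:.3f}")
```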

Modified Error Back Propagation Algorithm using the Approximating of the Hidden Nodes in Multi-Layer Perceptron (다층퍼셉트론의 은닉노드 근사화를 이용한 개선된 오류역전파 학습)

  • Kwak, Young-Tae;Lee, young-Gik;Kwon, Oh-Seok
    • Journal of KIISE:Software and Applications / v.28 no.9 / pp.603-611 / 2001
  • This paper proposes a novel fast layer-by-layer algorithm with better generalization capability. In the proposed algorithm, the weights of the hidden layer are updated using a target vector for the hidden layer obtained by the least squares method. This avoids the slow learning that can occur due to the small magnitude of the gradient vector in the hidden layer. The algorithm was tested on a handwritten-digit recognition problem. Its learning speed was faster than those of the error back-propagation algorithm and the modified error function algorithm, and similar to those of Ooyen's method and the layer-by-layer algorithm. Moreover, the simulation results showed that the proposed algorithm had the best generalization capability among them, regardless of the number of hidden nodes. The proposed algorithm combines the learning speed of the layer-by-layer algorithm with the generalization capability of the error back-propagation and modified error function algorithms.
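
The paper gives the full construction; the Python/numpy fragment below is a hedged sketch of just the step the abstract highlights: deriving target activations for the hidden layer by least squares against the desired output pre-activations, instead of relying on a possibly tiny back-propagated gradient. Shapes and clipping bounds are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(4)
H = 1.0 / (1.0 + np.exp(-rng.normal(0, 1, (4, 3))))  # current hidden activations
W2 = rng.normal(0, 1, (3, 1))                        # hidden -> output weights
T = np.array([[0.0], [1.0], [1.0], [0.0]])           # desired outputs

# Desired output pre-activations (targets clipped so the inverse sigmoid is finite).
Tc = np.clip(T, 0.05, 0.95)
Z_target = np.log(Tc / (1.0 - Tc))

# Hidden targets H*: minimum-norm least-squares solution of H* @ W2 = Z_target.
H_star = np.linalg.lstsq(W2.T, Z_target.T, rcond=None)[0].T
print(np.round(H_star - H, 2))  # per-pattern errors the hidden layer would train on
```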

On the enhancement of the learning efficiency of the adaptive back propagation neural network using the generating and adding the hidden layer node (은닉층 노드의 생성추가를 이용한 적응 역전파 신경회로망의 학습능률 향상에 관한 연구)

  • Kim, Eun-Won;Hong, Bong-Wha
    • Journal of the Institute of Electronics Engineers of Korea TE / v.39 no.2 / pp.66-75 / 2002
  • This paper presents an adaptive back-propagation algorithm that improves learning efficiency by updating the learning parameter and varying the number of hidden-layer nodes adaptively according to the generated error. The algorithm is expected to escape local minima and create a favorable environment for the convergence of the back-propagation neural network. In simulations, the algorithm was tested on three learning patterns: exclusive-OR learning, the 3-parity problem, and 7×5-dot alphabetic font learning. As a result, the probability of becoming trapped in a local minimum was reduced. Furthermore, the neural network improved learning efficiency by about 17.6%~64.7% over the existing back-propagation.
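
Both this paper and the related entry above hinge on detecting when training is stuck; a minimal, hypothetical sketch of such a trigger is shown below (goal, window, and tolerance are invented values): declare a suspected local minimum when the error is still above the goal but has stopped moving, and respond by adding a hidden node.

```python
def stuck_in_local_minimum(err_history, goal=0.01, window=500, tol=1e-5):
    """True if the error is above the goal but flat over the last `window` epochs."""
    if len(err_history) < window:
        return False
    recent = err_history[-window:]
    return recent[-1] > goal and (max(recent) - min(recent)) < tol

# Flat error history well above the goal -> time to add a hidden node.
errors = [0.25 - 1e-9 * i for i in range(1000)]
print(stuck_in_local_minimum(errors))  # True
```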