• Title/Summary/Keyword: error back-propagation algorithm

Search Results: 318

A Modified Error Back Propagation Algorithm Adding Neurons to Hidden Layer (은닉층 뉴우런 추가에 의한 역전파 학습 알고리즘)

  • 백준호;김유신;손경식
    • Journal of the Korean Institute of Telematics and Electronics B, v.29B no.4, pp.58-65, 1992
  • In this paper, a new back-propagation algorithm that adds neurons to the hidden layer is proposed. The proposed algorithm is applied to the recognition of handwritten numerals, combined with back-propagation that omits redundant learning. The learning speed and recognition rate of the proposed algorithm are compared with those of the conventional back-propagation algorithm and of back-propagation with redundant learning omitted. The proposed algorithm learns 4 times as fast as conventional back-propagation and 2 times as fast as back-propagation with redundant learning omitted. The recognition rate is 96.2% for the conventional back-propagation algorithm, 96.5% for back-propagation with redundant learning omitted, and 97.4% for the proposed algorithm.
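
The abstract does not spell out the growth rule, so the following is only a minimal sketch of the idea of adding a hidden neuron during training, assuming a single-hidden-layer network in which a new unit with small random weights is appended when the error stops improving; the function name, layer sizes, and threshold are illustrative, not taken from the paper.

```python
import numpy as np

def add_hidden_neuron(W1, W2, rng, scale=0.01):
    """Append one hidden neuron: a new row of input->hidden weights
    and a new column of hidden->output weights, both small and random."""
    new_row = rng.normal(0.0, scale, size=(1, W1.shape[1]))   # input -> new neuron
    new_col = rng.normal(0.0, scale, size=(W2.shape[0], 1))   # new neuron -> output
    return np.vstack([W1, new_row]), np.hstack([W2, new_col])

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 64, 4, 10          # e.g. 8x8 digit images, 10 classes
W1 = rng.normal(0, 0.1, (n_hidden, n_in))  # input -> hidden weights
W2 = rng.normal(0, 0.1, (n_out, n_hidden)) # hidden -> output weights

# Hypothetical trigger: grow the hidden layer when the error stops improving.
prev_err, err = 0.90, 0.89
if prev_err - err < 1e-3:
    W1, W2 = add_hidden_neuron(W1, W2, rng)
print(W1.shape, W2.shape)                  # (5, 64) (10, 5)
```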

A study on the realization of color printed material check using Error Back-Propagation rule (오류 역전파법으로구현한 컬러 인쇄물 검사에 관한 연구)

  • 한희석;이규영
    • Proceedings of the Korean Institute of Intelligent Systems Conference, 1998.10a, pp.560-567, 1998
  • This paper concerns color printed material images captured by a camera; median filtering is applied to the input images to reduce noise and distortion and to bring them to identical conditions. The paper also proposes a way of comparing the color tone of normal and abnormal printed material by training an error back-propagation network for block classification, extracting the R, G, B values of five locations from identical 3×3 blocks of the color printed material. As a representative algorithm for multi-layer perceptrons, the error back-propagation technique is used to solve complex problems. However, error back-propagation is fundamentally a gradient descent method, so training may converge to a local minimum rather than the global minimum, and the network structure must be chosen appropriately for a given problem. In this paper, good results are obtained by improving the initial conditions and adjusting the number of hidden layers to address real-time processing, learning, and training.
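
The preprocessing described above (median filtering followed by sampling R, G, B values from 3×3 blocks at five locations) can be sketched roughly as below; this is only an illustration under stated assumptions, with the sampling points, synthetic image, and helper names invented for the example, not taken from the paper.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter applied per channel (edges handled by padding)."""
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    windows = [padded[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)]
    return np.median(np.stack(windows), axis=0)

def block_rgb_features(img, points, size=3):
    """Mean R, G, B of a size x size block around each sampled point."""
    feats = []
    for (r, c) in points:
        block = img[r:r + size, c:c + size]              # 3x3 colour block
        feats.extend(block.reshape(-1, 3).mean(axis=0))  # mean R, G, B
    return np.array(feats)

rng = np.random.default_rng(1)
print_img = rng.integers(0, 256, size=(60, 90, 3)).astype(float)  # stand-in image
smoothed = median_filter3(print_img)
points = [(5, 5), (5, 80), (30, 45), (55, 5), (55, 80)]  # five sampled locations
x = block_rgb_features(smoothed, points)                  # 15-dim feature vector
print(x.shape)   # (15,)
```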

A Study Of Handwritten Digit Recognition By Neural Network Trained With The Back-Propagation Algorithm Using Generalized Delta Rule (신경망 회로를 이용한 필기체 숫자 인식에 관할 연구)

  • Lee, Kye-Han;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference, 1999.07g, pp.2932-2934, 1999
  • In this paper, a scheme for recognizing handwritten digits with a multilayer neural network trained by the back-propagation algorithm using the generalized delta rule is proposed. The neural network is trained on handwritten digit data from different writers and different styles. One purpose of working with neural networks is the minimization of the mean square error (MSE) between the actual output and the desired output. The back-propagation algorithm is an efficient and classical method: it trains the weights of a multilayer network using the steepest-descent minimization procedure and the sigmoid threshold function. As the error rate is reduced, the recognition rate improves; therefore a method that reduces the error rate is proposed.
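
As a concrete reference for the generalized delta rule the abstract relies on (sigmoid units, mean square error, steepest descent), here is a minimal single-sample sketch for a one-hidden-layer network; the layer sizes, learning rate, and data are placeholders, not the paper's setup.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_step(x, t, W1, W2, lr=0.1):
    """One steepest-descent step of the generalized delta rule (MSE loss)."""
    h = sigmoid(W1 @ x)                            # hidden activations
    y = sigmoid(W2 @ h)                            # network output
    delta_out = (y - t) * y * (1 - y)              # output-layer error term
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # back-propagated hidden term
    W2 -= lr * np.outer(delta_out, h)              # hidden -> output update
    W1 -= lr * np.outer(delta_hid, x)              # input -> hidden update
    return 0.5 * np.sum((y - t) ** 2)              # MSE contribution of this sample

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 64, 30, 10                    # e.g. 8x8 digit image, 10 classes
W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))
x = rng.random(n_in)                               # stand-in digit image
t = np.eye(n_out)[3]                               # one-hot target for class "3"
for _ in range(100):
    err = bp_step(x, t, W1, W2)
print(f"error after 100 steps: {err:.4f}")
```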

Improving the Error Back-Propagation Algorithm for Imbalanced Data Sets

  • Oh, Sang-Hoon
    • International Journal of Contents, v.8 no.2, pp.7-12, 2012
  • Imbalanced data sets are difficult to classify, since most classifiers are developed under the assumption that class distributions are well balanced. In order to improve the error back-propagation algorithm for the classification of imbalanced data sets, a new error function is proposed. The error function controls weight updating according to the class to which each training sample belongs. This has the effect that samples in the minority class have a greater chance of being classified correctly, while samples in the majority class have a lesser chance. The proposed method is compared with the two-phase, threshold-moving, and target-node methods through simulations on a mammography data set, and the proposed method attains the best results.
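
The abstract does not give the exact form of the proposed error function, so the snippet below only illustrates the general idea of class-dependent weight updating with a simple stand-in: the usual output error term is scaled by an inverse-frequency class weight, so minority-class samples push the weights harder than majority-class ones. The class counts and weighting scheme are hypothetical, not the paper's.

```python
import numpy as np

def class_weighted_delta(y, t, class_weights):
    """Scale the usual output error term by a per-class weight so that
    minority-class samples influence the weight update more strongly."""
    w = class_weights[int(np.argmax(t))]          # weight of this sample's class
    return w * (y - t) * y * (1 - y)              # weighted sigmoid/MSE delta

# Hypothetical two-class problem: class 0 has 95% of the samples, class 1 has 5%.
counts = np.array([950.0, 50.0])
class_weights = counts.sum() / (len(counts) * counts)   # inverse-frequency weights
print(class_weights)        # roughly [0.53, 10.0]: minority errors count ~19x more

y = np.array([0.7, 0.3])                      # network output for one sample
t_minority = np.array([0.0, 1.0])             # a minority-class target
t_majority = np.array([1.0, 0.0])             # a majority-class target
print(class_weighted_delta(y, t_minority, class_weights))
print(class_weighted_delta(y, t_majority, class_weights))
```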

Fast Learning Algorithms for Neural Network Using Tabu Search Method with Random Moves (Random Tabu 탐색법을 이용한 신경회로망의 고속학습알고리즘에 관한 연구)

  • 양보석;신광재;최원호
    • Journal of the Korean Institute of Intelligent Systems, v.5 no.3, pp.83-91, 1995
  • A neural network with one or more layers of hidden units can be trained using the well-known error back-propagation algorithm, in which the synaptic weights of the network are updated during training by propagating back the error between the expected output and the output produced by the network. However, the error back-propagation algorithm is characterized by slow convergence and long training times and, in some situations, can become trapped in local minima. A theoretical formulation of a new fast learning method based on the tabu search method is presented in this paper. In contrast to the conventional back-propagation algorithm, which relies solely on modifying the connection weights of the network by trial and error, the present method involves calculating the optimum weights of the neural network. The effectiveness and versatility of the present method are verified on the XOR problem; it achieves better accuracy than the conventional method with fixed values.
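
The abstract does not include the paper's tabu-search formulation, so the following is only a rough sketch of tabu search with random moves applied to the weights of a small XOR network: candidate weight vectors are generated by random perturbation, recently visited points are kept on a short tabu list, and a new global best is always accepted. The step size, tabu length, and acceptance rule are illustrative assumptions.

```python
import numpy as np

def xor_error(w, X, T):
    """MSE of a 2-2-1 sigmoid network whose 9 weights/biases are packed in w."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    H = 1 / (1 + np.exp(-(X @ W1.T + b1)))
    Y = 1 / (1 + np.exp(-(H @ W2 + b2)))
    return np.mean((Y - T) ** 2)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0, 1, 1, 0], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(0, 1, 9)
best_w, best_err = w.copy(), xor_error(w, X, T)
tabu = []                                  # short-term memory of visited points

for it in range(5000):
    cand = w + rng.normal(0, 0.5, 9)       # random move
    key = tuple(np.round(cand, 1))
    if key in tabu:                        # tabu: do not revisit recent points
        continue
    err = xor_error(cand, X, T)
    if err < best_err:                     # aspiration: always keep a new best
        best_w, best_err = cand.copy(), err
        w = cand
    elif err < xor_error(w, X, T) + 0.05:  # allow mildly worse moves to escape minima
        w = cand
    tabu.append(key)
    if len(tabu) > 50:                     # fixed-length tabu list
        tabu.pop(0)

print(f"best XOR MSE found: {best_err:.4f}")
```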

Improved Error Backpropagation Algorithm using Modified Activation Function Derivative (수정된 Activation Function Derivative를 이용한 오류 역전파 알고리즘의 개선)

  • 권희용;황희영
    • The Transactions of the Korean Institute of Electrical Engineers, v.41 no.3, pp.274-280, 1992
  • In this paper, an improved error back-propagation algorithm is introduced that avoids network paralysis, one of the problems of the error back-propagation learning rule. For this purpose, the cause of network paralysis is analyzed and the activation function derivative of the standard error back-propagation algorithm, which is regarded as the cause of the phenomenon, is modified. The characteristics of the modified activation function derivative are analyzed. Various experiments show that the performance of the modified error back-propagation algorithm is better than that of the standard error back-propagation algorithm.
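
The abstract does not state how the activation function derivative is modified, so the snippet below only illustrates one well-known fix of this kind (adding a small constant to the sigmoid derivative, as in Fahlman's offset) to show why a modified derivative keeps saturated units from paralysing training; the offset value is an assumption, not the paper's.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(y):
    """Standard derivative; near y = 0 or 1 it vanishes, which can paralyse training."""
    return y * (1.0 - y)

def modified_deriv(y, offset=0.1):
    """Derivative with a small constant added so saturated units still learn.
    (One classic fix of this kind is Fahlman's offset; the value 0.1 is illustrative.)"""
    return y * (1.0 - y) + offset

y_saturated = sigmoid(10.0)                # ~0.99995, deep in the flat region
print(sigmoid_deriv(y_saturated))          # ~4.5e-5: almost no weight update
print(modified_deriv(y_saturated))         # ~0.1: the unit can still move
```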

Classification of ECG Arrhythmia Signals Using Back-Propagation Network (역전달 신경회로망을 이용한 심전도 파형의 부정맥 분류)

  • 권오철;최진영
    • Journal of Biomedical Engineering Research, v.10 no.3, pp.343-350, 1989
  • A new algorithm for classifying ECG arrhythmia signals using a back-propagation network is proposed. The baseline of the ECG signal is detected by a high-pass filter and a probability density function, and the input data are then normalized for learning and classification. In addition, the ECG data are scanned so that arrhythmia signals in which the R-wave is hard to find can be classified. A two-layer perceptron with one hidden layer, trained with the error back-propagation learning rule, is used as the artificial neural network. The proposed algorithm shows outstanding performance under amplitude variation, baseline wander, and noise contamination.
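
The exact baseline-detection procedure (high-pass filter plus probability density function) is not detailed in the abstract; as a hedged illustration of the general preprocessing idea, the sketch below removes baseline wander with a simple moving-average high-pass step and amplitude-normalizes the signal before it would be fed to a network. The window size, sampling rate, and synthetic signal are assumptions.

```python
import numpy as np

def remove_baseline(ecg, win=101):
    """Crude high-pass step: subtract a moving-average estimate of the baseline."""
    kernel = np.ones(win) / win
    baseline = np.convolve(ecg, kernel, mode="same")
    return ecg - baseline

def normalize(ecg):
    """Scale to zero mean and unit peak amplitude before feeding the network."""
    centered = ecg - ecg.mean()
    return centered / (np.max(np.abs(centered)) + 1e-12)

fs = 360                                          # assumed sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
wander = 0.5 * np.sin(2 * np.pi * 0.3 * t)        # slow baseline wander
beats = np.where((t % 1.0) < 0.02, 1.0, 0.0)      # crude stand-in for R-peaks
ecg = beats + wander + 0.02 * np.random.default_rng(0).normal(size=t.size)

x = normalize(remove_baseline(ecg))
print(x.shape, float(x.max()), float(x.min()))
```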

Estimating Regression Function with ε-Insensitive Supervised Learning Algorithm

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society, v.15 no.2, pp.477-483, 2004
  • One of the major paradigms for supervised learning in the neural network community is back-propagation learning. Standard implementations of back-propagation learning are optimal under the assumption of independent and identically distributed Gaussian noise. In this paper, for regression function estimation, an ε-insensitive back-propagation learning algorithm is introduced, which corresponds to minimizing the least absolute error. This algorithm is compared with the support vector machine (SVM), another ε-insensitive supervised learning algorithm that has been very successful in pattern recognition and function estimation problems. For the comparison, a more realistic model is considered in which the noise variance itself depends on the input variables.
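
For reference, the ε-insensitive error that the abstract contrasts with the usual squared error can be written down directly; in back-propagation its subgradient replaces the (output minus target) term. The sketch below shows only the loss and its subgradient; the ε value and data are placeholders.

```python
import numpy as np

def eps_insensitive_loss(y, t, eps=0.1):
    """Errors smaller than eps are ignored; larger ones are penalised linearly."""
    return np.maximum(np.abs(y - t) - eps, 0.0)

def eps_insensitive_grad(y, t, eps=0.1):
    """Subgradient w.r.t. the network output y: 0 inside the eps-tube, +/-1 outside.
    In back-propagation this replaces the usual (y - t) term of squared error."""
    r = y - t
    return np.where(np.abs(r) <= eps, 0.0, np.sign(r))

y = np.array([0.05, 0.30, -0.40])
t = np.array([0.00, 0.00,  0.00])
print(eps_insensitive_loss(y, t))   # [0.   0.2  0.3]
print(eps_insensitive_grad(y, t))   # [0.   1.  -1. ]
```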

Modified Error Back Propagation Algorithm using the Approximating of the Hidden Nodes in Multi-Layer Perceptron (다층퍼셉트론의 은닉노드 근사화를 이용한 개선된 오류역전파 학습)

  • Kwak, Young-Tae;Lee, Young-Gik;Kwon, Oh-Seok
    • Journal of KIISE: Software and Applications, v.28 no.9, pp.603-611, 2001
  • This paper proposes a novel, fast layer-by-layer algorithm with better generalization capability. In the proposed algorithm, the weights of the hidden layer are updated using target vectors for the hidden layer obtained by the least-squares method. The proposed algorithm avoids the slow learning that can occur due to the small magnitude of the gradient vector in the hidden layer. The algorithm was tested on a handwritten digit recognition problem. Its learning speed was faster than those of the error back-propagation algorithm and the modified error function algorithm, and similar to those of Ooyen's method and the layer-by-layer algorithm. Moreover, the simulation results showed that the proposed algorithm had the best generalization capability among them regardless of the number of hidden nodes. The proposed algorithm combines the learning speed of the layer-by-layer algorithm with the generalization capability of the error back-propagation and modified error function algorithms.
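
The abstract mentions hidden-layer target vectors obtained by least squares but not the full procedure, so the following is only a minimal sketch of that least-squares step for a sigmoid network: the output weights are fitted so the hidden activations map onto the target logits, and hidden-layer target activations are then obtained by a second least-squares solve. The shapes, random data, and clipping in `logit` are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def logit(p, eps=1e-6):
    """Inverse sigmoid, clipped away from 0 and 1 for numerical safety."""
    p = np.clip(p, eps, 1 - eps)
    return np.log(p / (1 - p))

rng = np.random.default_rng(0)
n_in, n_hid, n_out, n_samples = 64, 20, 10, 200
X = rng.random((n_samples, n_in))
T = np.eye(n_out)[rng.integers(0, n_out, n_samples)] * 0.8 + 0.1   # targets in (0,1)

W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))

H = 1 / (1 + np.exp(-(X @ W1.T)))          # current hidden activations

# 1) Output layer: least-squares fit of W2 so that W2 @ h matches the target logits.
W2 = np.linalg.lstsq(H, logit(T), rcond=None)[0].T

# 2) Hidden layer: least-squares target activations for the hidden layer,
#    i.e. the H* that W2 would map onto the target logits.
H_target = np.linalg.lstsq(W2, logit(T).T, rcond=None)[0].T

# 3) The hidden weights W1 would then be updated toward producing H_target
#    (e.g. by another regression on logit(H_target)); shown here only as shapes.
print(W2.shape, H_target.shape)            # (10, 20) (200, 20)
```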

Recognition of a New Car Plate using Color Information and Error Back-propagation Neural Network Algorithms (컬러 정보와 오류역전파 신경망 알고리즘을 이용한 신차량 번호판 인식)

  • Lee, Jong-Hee;Kim, Jin-Whan
    • The Journal of the Korea Institute of Electronic Communication Sciences, v.5 no.5, pp.471-476, 2010
  • In this paper, we propose an effective method for recognizing vehicle license plates using RGB color information and the error back-propagation neural network algorithm. First, the license plate image is adjusted by the mean of the blue values in the plate, two candidate areas of red and green regions are classified by calculating differences of pixel values, and the final green area is found by the back-propagation algorithm. Second, our method detects the area of the plate using the frequencies of the horizontal and vertical histograms. Finally, the individual character codes are detected by an edge detection algorithm and recognized by the error back-propagation algorithm. To evaluate the performance of the proposed extraction and recognition method, we ran experiments on new car plates. The experimental results showed that the proposed license plate extraction outperforms the existing HSI information model and that the overall recognition is effective.
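
Of the steps above, the horizontal/vertical histogram projection used to locate the plate region is the easiest to sketch; the snippet below shows only that step on a synthetic binary mask of plate-coloured pixels, with the mask and threshold entirely made up for illustration.

```python
import numpy as np

def projection_bounds(mask, axis, min_count=3):
    """Find the first and last row/column whose foreground count exceeds a threshold.
    axis=1 gives the horizontal (row-wise) histogram, axis=0 the vertical one."""
    hist = mask.sum(axis=axis)
    idx = np.where(hist >= min_count)[0]
    return (idx[0], idx[-1]) if idx.size else (None, None)

# Stand-in binary mask of "plate-coloured" pixels (1 inside a synthetic plate region).
mask = np.zeros((120, 200), dtype=int)
mask[70:95, 40:160] = 1

top, bottom = projection_bounds(mask, axis=1)    # row range of the plate
left, right = projection_bounds(mask, axis=0)    # column range of the plate
print((top, bottom), (left, right))              # (70, 94) (40, 159)
```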