• Title/Abstract/Keyword: backpropagation algorithm

Search results: 351 items (processing time: 0.023 seconds)

하이브리드 알고리즘을 이용한 신경망의 학습성능 개선 (Improving the Training Performance of Neural Networks by using Hybrid Algorithm)

  • 김원욱;조용현;김영일;강인구
    • 한국정보처리학회논문지 / Vol.4 No.11 / pp.2769-2779 / 1997
  • This paper proposes an efficient method for improving the training performance of neural networks by combining the conjugate gradient method with a tunneling system. For fast convergence, a backpropagation algorithm based on the conjugate gradient method is used, and when a local minimum is encountered, a backpropagation algorithm based on a dynamic tunneling system is applied to set new connection weights that escape it. The proposed method was applied to parity checking and pattern classification problems and compared with the conventional backpropagation algorithm based on gradient descent and with a backpropagation algorithm combining gradient descent and a dynamic tunneling system; the results show that the proposed method outperforms the others in training performance.

  • PDF
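The alternation described in the abstract above — gradient-based training until convergence, then a tunneling phase that searches for a point below the current local minimum — can be sketched as follows. The objective `f`, the radial search in `tunnel`, and all step sizes are illustrative assumptions, not the paper's conjugate-gradient or dynamic-tunneling formulas.

```python
import math

def f(w):
    # multimodal stand-in for a network's error surface (two minima)
    return w**4 - 3*w**2 + w

def df(w):
    return 4*w**3 - 6*w + 1

def gradient_descent(w, lr=0.01, steps=2000):
    # fast local phase (the paper uses a conjugate-gradient backpropagation here)
    for _ in range(steps):
        w -= lr * df(w)
    return w

def tunnel(w_min, radius=0.1, max_radius=10.0):
    """Search outward from a local minimum for any point with lower error."""
    f_min = f(w_min)
    r = radius
    while r < max_radius:
        for cand in (w_min - r, w_min + r):
            if f(cand) < f_min:
                return cand          # a new starting point below the old minimum
        r += radius
    return None                      # no lower basin found: accept current minimum

def hybrid_train(w0):
    w = gradient_descent(w0)
    while True:
        w_new = tunnel(w)
        if w_new is None:
            return w
        w = gradient_descent(w_new)
```

Starting from the right-hand basin, the gradient phase settles in the shallow local minimum and the tunneling phase relocates training into the deeper basin on the left.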

후향전파 알고리즘과 동적터널링 시스템을 조합한 다층신경망의 새로운 학습방법 (A new training method of multilayer neural networks using a hybrid of backpropagation algorithm and dynamic tunneling system)

  • 조용현
    • 전자공학회논문지B / Vol.33B No.4 / pp.201-208 / 1996
  • This paper proposes an efficient method for improving the training performance of a neural network using a hybrid of the backpropagation algorithm and a dynamic tunneling system. The backpropagation algorithm, a fast gradient descent method, is applied for high-speed optimization, while the dynamic tunneling system, a deterministic method with a tunneling phenomenon, is applied for global optimization. When the backpropagation algorithm converges to a local minimum, an approximate initial point for escaping it is estimated by the dynamic tunneling system. The proposed method has been applied to pattern classification, and the simulation results show that its performance is superior to that of the backpropagation algorithm with randomized initial point settings.

  • PDF

수정된 Activation Function Derivative를 이용한 오류 역전파 알고리즘의 개선 (Improved Error Backpropagation Algorithm using Modified Activation Function Derivative)

  • 권희용;황희영
    • 대한전기학회논문지 / Vol.41 No.3 / pp.274-280 / 1992
  • In this paper, an improved error backpropagation algorithm is introduced that avoids network paralysis, one of the problems of the error backpropagation learning rule. For this purpose, we analyzed the cause of network paralysis and modified the activation function derivative of the standard error backpropagation algorithm, which is regarded as the source of the phenomenon. The characteristics of the modified activation function derivative are analyzed. Various experiments show that the modified error backpropagation algorithm performs better than the standard error backpropagation algorithm.

  • PDF
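The abstract does not give the exact modification, but a well-known remedy for network paralysis is to keep the sigmoid derivative from vanishing for saturated units, so that their weights keep receiving gradient. A minimal sketch of that idea — the constant `offset` is an assumption, not the paper's formula:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def standard_derivative(y):
    # y = sigmoid(x); this vanishes as y -> 0 or 1, stalling saturated units
    return y * (1.0 - y)

def modified_derivative(y, offset=0.1):
    # illustrative modification: a constant floor keeps the gradient nonzero
    return y * (1.0 - y) + offset
```

For a heavily saturated unit the standard derivative is nearly zero while the modified one stays bounded away from zero, so learning does not stall.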

개선된 유전자 역전파 신경망에 기반한 예측 알고리즘 (Forecasting algorithm using an improved genetic algorithm based on backpropagation neural network model)

  • 윤여창;조나래;이성덕
    • Journal of the Korean Data and Information Science Society / Vol.28 No.6 / pp.1327-1336 / 2017
  • This study discusses the combined application of an ARIMA model, a backpropagation neural network, and a genetic algorithm for short-term forecasting, and examines the effectiveness of the resulting genetic-neural network algorithm. Because the backpropagation algorithm can converge to a local minimum, we build a forecasting model that optimizes the backpropagation network structure and combines it with a genetic algorithm to improve forecasting accuracy. Forecast errors are compared experimentally using the KOSPI index. The results show that the proposed genetic-neural network model achieves meaningfully better forecasting accuracy than the backpropagation neural network model.
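The genetic-plus-backpropagation hybrid described above — evolutionary search (selection and mutation only, here) for a good starting point, followed by gradient refinement — can be sketched on a toy nonconvex loss. Everything in this sketch (the loss, population size, mutation scale) is an illustrative assumption, not the paper's forecasting model:

```python
import random

def loss(w):
    # toy nonconvex stand-in for network forecast error; minima at w = +/-2
    return (w * w - 4.0) ** 2

def grad(w):
    return 4.0 * w * (w * w - 4.0)

def refine(w, lr=0.001, steps=500):
    # backpropagation-style gradient refinement of the GA's best candidate
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def ga_bp(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-5.0, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        parents = pop[: pop_size // 2]                      # truncation selection
        children = [p + rng.gauss(0.0, 0.3) for p in parents]  # mutation
        pop = parents + children
    return refine(min(pop, key=loss))
```

The GA places the starting point near one of the two basins, sidestepping the local-minimum sensitivity of gradient descent alone; the gradient phase then converges quickly.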

확률적 근사법과 후향전파 알고리즘을 이용한 다층 신경망의 학습성능 개선 (Improving the Training Performance of Multilayer Neural Network by Using Stochastic Approximation and Backpropagation Algorithm)

  • 조용현;최흥문
    • 전자공학회논문지B / Vol.31B No.4 / pp.145-154 / 1994
  • This paper proposes an efficient method for improving the training performance of the neural network by using a hybrid of a stochastic approximation and a backpropagation algorithm. The approximate initial point for fast global optimization is estimated first by applying the stochastic approximation, and then the backpropagation algorithm, which is a fast gradient descent method, is applied for high-speed global optimization. Training is further sped up by adjusting the training parameters of the output and hidden layers adaptively to the standard deviation of the neuron outputs of each layer. The proposed method has been applied to parity checking and pattern classification, and the simulation results show that its performance is superior to that of the backpropagation, the Baba's MROM, and the Sun's method with randomized initial point settings. The results of adaptively adjusting the training parameters show that the proposed method further improves the convergence speed by about 20% in training.

  • PDF

Performance Improvement of Backpropagation Algorithm by Automatic Tuning of Learning Rate using Fuzzy Logic System

  • Jung, Kyung-Kwon;Lim, Joong-Kyu;Chung, Sung-Boo;Eom, Ki-Hwan
    • Journal of information and communication convergence engineering / Vol.1 No.3 / pp.157-162 / 2003
  • We propose a learning method for improving the performance of the backpropagation algorithm. The proposed method uses a fuzzy logic system to automatically tune the learning rate of each weight: instead of a fixed learning rate, the fuzzy logic system dynamically adjusts it. The inputs of the fuzzy logic system are delta and delta-bar, and its output is the learning rate. In order to verify the effectiveness of the proposed method, we performed simulations on the XOR problem, character classification, and function approximation. The results show that the proposed method considerably improves performance compared to the plain backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta algorithm.
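The delta and delta-bar inputs mentioned in the abstract come from Jacobs' delta-bar-delta rule, the baseline the paper compares against: each weight keeps its own learning rate, raised additively while the gradient sign is consistent and cut multiplicatively when it flips. A minimal per-weight implementation of that baseline (the paper replaces the fixed increase/decrease rules below with a fuzzy logic system):

```python
class DeltaBarDelta:
    """Per-weight learning-rate adaptation (Jacobs' delta-bar-delta rule)."""

    def __init__(self, lr=0.1, kappa=0.01, phi=0.5, theta=0.7):
        self.lr, self.kappa, self.phi, self.theta = lr, kappa, phi, theta
        self.delta_bar = 0.0   # exponential average of past gradients

    def step(self, w, delta):
        if self.delta_bar * delta > 0:      # consistent gradient sign: speed up
            self.lr += self.kappa
        elif self.delta_bar * delta < 0:    # sign flip: slow down
            self.lr *= self.phi
        self.delta_bar = self.theta * self.delta_bar + (1 - self.theta) * delta
        return w - self.lr * delta
```

Driving it with the gradient of a simple quadratic shows the adaptive rate converging where a fixed large rate would oscillate.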

기울기 조정에 의한 다층 신경회로망의 학습효율 개선방법에 대한 연구 (A Study on the Learning Efficiency of Multilayered Neural Networks using Variable Slope)

  • 이형일;남재현;지선수
    • 산업경영시스템학회지 / Vol.20 No.42 / pp.161-169 / 1997
  • A variety of learning methods are used for neural networks. Among them, the backpropagation algorithm is the most widely used in areas such as image processing, speech recognition, and pattern recognition. Despite its popularity in these applications, its main problem is the running time: too much time is spent on learning. This paper suggests a method that maximizes the convergence speed of learning. Such a reduction in the learning time of the backpropagation algorithm is achieved by adaptively adjusting the slope of the activation function according to the total error, which is named the variable slope algorithm. Experimental results using the variable slope algorithm are compared against the conventional backpropagation algorithm and other variations, showing an improvement in performance over previous algorithms.

  • PDF
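The variable-slope idea — an activation function whose steepness, and hence gradient magnitude, is adjusted from the total error — can be sketched as follows. The linear `adjust_slope` schedule is a hypothetical placeholder; the paper's exact adjustment rule is not given in the abstract:

```python
import math

def sigmoid(x, slope=1.0):
    # activation with adjustable steepness
    return 1.0 / (1.0 + math.exp(-slope * x))

def sigmoid_derivative(x, slope=1.0):
    # the slope multiplies the gradient, so a steeper activation learns faster
    y = sigmoid(x, slope)
    return slope * y * (1.0 - y)

def adjust_slope(total_error, base=1.0, gain=0.5):
    # hypothetical schedule: steepen the activation while total error is large,
    # relax back toward the base slope as training converges
    return base + gain * total_error
```

With `slope=4.0` the derivative at the origin is four times the standard sigmoid's, which is the mechanism the speed-up relies on.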

새로운 다층 신경망 학습 알고리즘 (A new learning algorithm for multilayer neural networks)

  • 고진욱;이철희
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 1998년도 추계종합학술대회 논문집 / pp.1285-1288 / 1998
  • In this paper, we propose a new learning algorithm for multilayer neural networks. In the error backpropagation that is widely used for training multilayer neural networks, weights are adjusted to reduce the error function, the sum of squared errors over all the neurons in the output layer of the network. In the proposed learning algorithm, we consider each output of the output layer as a function of the weights and adjust the weights directly so that the output neurons produce the desired outputs. Experiments show that the proposed algorithm outperforms the backpropagation learning algorithm.

  • PDF
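One reading of "adjust the weights directly so that the output neurons produce the desired outputs" is to solve for the weight change that makes a single output hit its target exactly, via the inverse of the sigmoid. This is only an illustrative interpretation of the abstract, for one output neuron and one input pattern, not the paper's full algorithm:

```python
import math

def logit(p):
    # inverse of the sigmoid: the net input that produces output p
    return math.log(p / (1.0 - p))

def direct_adjust(w, x, target):
    """Minimum-norm weight change making sigmoid(w . x) equal the target exactly."""
    net = sum(wi * xi for wi, xi in zip(w, x))
    needed = logit(target) - net          # required change in net input
    scale = needed / sum(xi * xi for xi in x)
    return [wi + scale * xi for wi, xi in zip(w, x)]
```

Unlike a gradient step, this lands on the desired output in one move for the given pattern; a practical algorithm must then reconcile such solves across many patterns and neurons.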

은닉노드 목표 값을 가진 2개 층 신경망의 분리학습 알고리즘 (A Separate Learning Algorithm of Two-Layered Networks with Target Values of Hidden Nodes)

  • 최범기;이주홍;박태수
    • 한국정보과학회논문지:소프트웨어및응용 / Vol.33 No.12 / pp.999-1007 / 2006
  • The backpropagation learning method is known to be slow and to fail to converge in many cases, becoming trapped in local minima or plateaus. The alternatives to backpropagation proposed so far pay the price of an imbalance between convergence speed and the stability of convergence with respect to parameters. We propose an algorithm that addresses these problems of conventional backpropagation, in particular adding the ability to escape local minima, while guaranteeing stability with little storage and maintaining fast convergence. The method splits the whole network into two parts, the upper connections (hidden to output) and the lower connections (input to hidden), and applies a separate learning scheme that trains them alternately. As experimental results on various classification problems show, the proposed algorithm requires less computation than conventional backpropagation and other improved algorithms, performs very well, and offers high reliability.
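The alternating scheme described above — train the upper (hidden-to-output) connections with the hidden activations fixed, then derive targets for the hidden nodes and train the lower (input-to-hidden) connections toward them — can be sketched on a tiny two-layer network. The AND-gate data, network size, and learning rate are illustrative assumptions, not the paper's setup:

```python
import math
import random

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# AND gate, with a constant bias input appended to each pattern
DATA = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 0, 1), 0), ((1, 1, 1), 1)]

def forward(W1, W2, x):
    h = [sig(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sig(sum(w * hi for w, hi in zip(W2, h)))
    return h, y

def mse(W1, W2):
    return sum((t - forward(W1, W2, x)[1]) ** 2 for x, t in DATA) / len(DATA)

def train_separately(epochs=2000, lr=0.5, seed=1):
    rng = random.Random(seed)
    W1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    W2 = [rng.uniform(-1, 1) for _ in range(2)]
    for _ in range(epochs):
        # phase 1: upper connections only, hidden activations held fixed
        for x, t in DATA:
            h, y = forward(W1, W2, x)
            d_out = (t - y) * y * (1 - y)
            W2 = [w + lr * d_out * hi for w, hi in zip(W2, h)]
        # phase 2: lower connections, trained toward per-node hidden targets
        for x, t in DATA:
            h, y = forward(W1, W2, x)
            d_out = (t - y) * y * (1 - y)
            for j in range(len(W1)):
                h_target = h[j] + d_out * W2[j]   # implied target for hidden node j
                d_hid = (h_target - h[j]) * h[j] * (1 - h[j])
                W1[j] = [w + lr * d_hid * xi for w, xi in zip(W1[j], x)]
    return W1, W2
```

Each phase only ever touches one of the two weight matrices, which is what keeps the storage requirement of the separate scheme small.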

반도체 패키지의 내부 결함 검사용 알고리즘 성능 향상 (The Performance Advancement of Test Algorithm for Inner Defects in Semiconductor Packages)

  • 김재열;윤성운;한재호;김창현;양동조;송경석
    • 한국정밀공학회:학술대회논문집 / 한국정밀공학회 2002년도 추계학술대회 논문집 / pp.345-350 / 2002
  • In this study, the classification of artificial flaws in semiconductor packages is performed by pattern recognition technology. For this purpose, an image pattern recognition package including user-made software was developed, and the total procedure, including ultrasonic image acquisition, equalization filtering, binarization, edge detection, and classifier design, is handled with a backpropagation neural network. In particular, various weights of the backpropagation neural network are compared, as are threshold levels of edge detection in the preprocessing step for input into the multilayer perceptron (backpropagation neural network). The pattern recognition technique is then applied to classifying defects in semiconductor packages as normal, crack, or delamination. According to the results, a recognition rate of 100% is achieved with the backpropagation neural network.

  • PDF