Improving the Performances of the Neural Network for Optimization by Optimal Estimation of Initial States

  • Published: 1993.08.01

Abstract

This paper proposes a method for improving the performance of a neural network for optimization by optimally estimating its initial state. The optimal initial state, one that leads to the global minimum, is estimated using stochastic approximation. The update rule of the Hopfield model, a fast deterministic algorithm based on the steepest-descent rule, is then applied to speed up the optimization. The proposed method has been applied to travelling salesman problems and an optimal task partition problem to evaluate its performance. The simulation results show that the proposed method converges faster than the conventional Hopfield model, Abe's method, and the Boltzmann machine with randomly initialized neuron outputs, and that convergence to the global minimum is achieved with probability 1. The advantage of the proposed method grows with problem size, where it becomes harder for random initial settings to achieve good convergence.
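A minimal Python sketch of the general idea described above, not the paper's actual formulation: a Hopfield-style steepest-descent update on a toy quadratic energy, comparing a random initial state with an initial state picked by a crude stochastic probe that stands in for the stochastic-approximation step. The weight matrix, bias, and probing procedure are illustrative assumptions.

```python
# Hedged sketch: steepest descent on a Hopfield-style energy
#   E(v) = -1/2 v^T W v - b^T v,  with neuron outputs v = sigmoid(u).
# The toy problem below is NOT the paper's TSP/task-partition formulation.

import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def energy(W, b, v):
    return -0.5 * v @ W @ v - b @ v

def hopfield_descent(W, b, u0, lr=0.05, steps=500):
    """Steepest-descent dynamics du/dt = -dE/dv on the internal state u."""
    u = u0.copy()
    for _ in range(steps):
        v = sigmoid(u)
        grad_E = -(W @ v + b)   # dE/dv for the quadratic energy above
        u -= lr * grad_E        # energy is non-increasing along this flow
    return sigmoid(u)

rng = np.random.default_rng(0)
n = 16
A = rng.normal(size=(n, n))
W = (A + A.T) / 2               # symmetric weights (toy problem)
np.fill_diagonal(W, 0.0)
b = rng.normal(size=n)

# Random initial state vs. a (hypothetical) estimated initial state:
# here the "estimate" is just the lowest-energy of a few random probes,
# standing in for the paper's stochastic-approximation estimate.
u_random = rng.normal(size=n)
probes = [rng.normal(size=n) for _ in range(20)]
u_est = min(probes, key=lambda u: energy(W, b, sigmoid(u)))

for name, u0 in [("random init", u_random), ("estimated init", u_est)]:
    v = hopfield_descent(W, b, u0)
    print(f"{name}: final energy = {energy(W, b, v):.3f}")
```

The sketch only illustrates why the initial state matters: the deterministic descent converges to different local minima of the same energy depending on where it starts, which is the situation the paper's initial-state estimation is meant to address.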

Keywords