Training Artificial Neural Networks and Convolutional Neural Networks using WFSO Algorithm

  • Jang, Hyun-Woo (Department of Electronics and Information Engineering, Hansung University)
  • Jung, Sung Hoon (School of Mechanical and Electronic Engineering, Hansung University)
  • Received : 2017.08.19
  • Accepted : 2017.08.31
  • Published : 2017.08.31

Abstract

This paper proposes a method for training artificial neural networks and convolutional neural networks with the Water Flowing and Shaking Optimization (WFSO) algorithm, which was originally developed as a general-purpose optimization algorithm. Because the algorithm searches with a population of candidate solutions, it is generally slower than gradient-based training, but it rarely becomes trapped in local optima and is easy to parallelize. In addition, it can train networks with non-differentiable activation functions and can optimize the network structure and the weights at the same time. This paper describes how to apply the WFSO algorithm to neural network training and compares its performance with the error back-propagation algorithm on multilayer artificial neural networks and convolutional neural networks.
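
As a concrete illustration of the idea, the sketch below trains a tiny feed-forward network on the XOR problem by searching over its flattened weight vector with a population of candidate solutions. The network size, the hard-threshold (non-differentiable) activation, and the simple move-toward-best-plus-noise update rule are illustrative assumptions only; they stand in for WFSO's actual water-flowing and shaking operators, which are defined in the WFSO paper and not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)

    def unpack(theta, sizes):
        # Split a flat parameter vector into per-layer (W, b) pairs.
        params, i = [], 0
        for n_in, n_out in zip(sizes[:-1], sizes[1:]):
            W = theta[i:i + n_in * n_out].reshape(n_in, n_out)
            i += n_in * n_out
            b = theta[i:i + n_out]
            i += n_out
            params.append((W, b))
        return params

    def forward(theta, X, sizes):
        # Forward pass; the hidden activation is a hard threshold, so it is
        # non-differentiable and back-propagation would not apply directly.
        layers = unpack(theta, sizes)
        h = X
        for W, b in layers[:-1]:
            h = np.where(h @ W + b > 0.0, 1.0, 0.0)
        W, b = layers[-1]
        return h @ W + b

    def loss(theta, X, y, sizes):
        return np.mean((forward(theta, X, sizes) - y) ** 2)

    # Toy data: XOR, which a single linear unit cannot solve.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    sizes = [2, 4, 1]
    dim = sum(a * b + b for a, b in zip(sizes[:-1], sizes[1:]))

    # Population of candidate weight vectors (the "water drops" in WFSO terms).
    pop = rng.normal(0.0, 1.0, size=(20, dim))
    best = pop[0]
    for step in range(2000):
        fitness = np.array([loss(p, X, y, sizes) for p in pop])
        best = pop[fitness.argmin()].copy()
        # Placeholder update: drift toward the current best and add random
        # perturbation -- a stand-in for WFSO's flowing and shaking moves.
        pop = pop + 0.3 * (best - pop) + rng.normal(0.0, 0.1, size=pop.shape)
        pop[0] = best  # elitism: keep the best candidate found so far

    print("final loss :", loss(best, X, y, sizes))
    print("predictions:", forward(best, X, sizes).ravel().round(2))

Because each candidate's fitness is evaluated independently, the inner loop parallelizes naturally, which is the property the abstract highlights alongside robustness to local optima.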
