Reducing the Number of Hidden Nodes in an MLP Using Vertices of the Hidden Layer's Hypercube


  • 곽영태 (Dept. of Computer Engineering, Chungnam National University; regular member);
  • 이영직 (Multimodal I/F Team, Electronics and Telecommunications Research Institute (ETRI); regular member);
  • 권오석 (Dept. of Computer Engineering, Chungnam National University; regular member)
  • Published : 1999.09.01


This paper proposes a method for removing unnecessary hidden nodes using a new cost function that evaluates the variance and mean of the hidden node outputs during training. The proposed cost function keeps necessary hidden nodes active while driving unnecessary hidden nodes toward constant outputs. Constant hidden nodes can then be removed without performance degradation. On the CEDAR handwritten digit recognition task, the proposed method reduced the number of hidden nodes by up to 37.2% while achieving a higher recognition rate and a shorter learning time.
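The removal step described above can be sketched in a few lines: once training has pushed unnecessary hidden nodes toward constant outputs, a node whose output variance over the data falls below a small threshold contributes only a fixed offset, which can be folded into the next layer's bias before the node is deleted. The following is a minimal NumPy sketch of that idea, not the authors' implementation; the function name, the variance threshold, and the weight layout are all assumptions made for illustration.

```python
import numpy as np

def prune_constant_hidden_nodes(H, W2, b2, var_threshold=1e-3):
    """Remove hidden nodes whose outputs are (near-)constant over the data.

    H  : (n_samples, n_hidden) hidden-layer outputs on a data set
    W2 : (n_hidden, n_out)     hidden-to-output weights
    b2 : (n_out,)              output-layer biases

    A constant node j always contributes mean(H[:, j]) * W2[j] to the output,
    so that contribution is absorbed into the bias and the node is dropped
    without changing the network's output (hypothetical threshold value).
    """
    variances = H.var(axis=0)
    means = H.mean(axis=0)
    constant = variances < var_threshold       # nodes judged unnecessary
    keep = ~constant
    b2_new = b2 + means[constant] @ W2[constant]  # fold constants into bias
    return H[:, keep], W2[keep], b2_new, keep
```

Because the folded-in bias reproduces the constant nodes' contribution exactly, the pruned network computes the same output-layer pre-activations as the original on data where those nodes are constant, which is why the paper reports no performance degradation from this kind of removal.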


