A Method for Optimizing the Structure of Neural Networks Based on Information Entropy

  • Yuan Hongchun (Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei 230031);
  • Xiong Fanlun (Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei 230031);
  • Kei Bai-Shi (Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei 230031)
  • Published: 2001.01.01

Abstract

The number of hidden neurons in a feed-forward neural network is generally decided on the basis of experience. This approach often yields too few or too many hidden neurons, leading either to insufficient capacity for storing the learned information or to redundancy. This research proposes a new method for optimizing the number of hidden neurons based on information entropy. First, an initial neural network with a sufficient number of hidden neurons is trained on a set of training samples. Second, the activation values of the hidden neurons are computed by feeding in those training samples that the trained network classifies correctly. Third, candidate partitions of these activation values are evaluated by their information gain, and a decision tree that correctly divides the whole sample space is constructed. Finally, the important, relevant hidden neurons, namely those included in the tree, are identified by searching the whole tree, and the remaining redundant hidden neurons are deleted; the number of hidden neurons is thereby decided. The proposed method is applied to building a neural network with the best number of hidden units for tea quality evaluation, and the result shows that the method is effective.
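The core of the procedure, selecting the hidden neurons that appear in an information-gain decision tree built over their activation values, can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function names, the greedy ID3-style split search, and the tiny example data are all assumptions made for clarity.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (in bits) of a list of class labels.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(acts, labels, neuron, threshold):
    # Gain from partitioning the samples on one hidden neuron's activation.
    left = [y for x, y in zip(acts, labels) if x[neuron] <= threshold]
    right = [y for x, y in zip(acts, labels) if x[neuron] > threshold]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

def build_tree(acts, labels, used):
    # Greedy decision tree over hidden-unit activations; `used` collects the
    # indices of neurons that appear in the tree -- the units worth keeping.
    if len(set(labels)) <= 1:
        return labels[0]
    best = None  # (gain, neuron index, threshold)
    for j in range(len(acts[0])):
        for t in sorted(set(x[j] for x in acts)):
            g = info_gain(acts, labels, j, t)
            if best is None or g > best[0]:
                best = (g, j, t)
    g, j, t = best
    if g <= 0:  # no informative split: return the majority class
        return Counter(labels).most_common(1)[0][0]
    used.add(j)
    left = [(x, y) for x, y in zip(acts, labels) if x[j] <= t]
    right = [(x, y) for x, y in zip(acts, labels) if x[j] > t]
    return (j, t,
            build_tree([x for x, _ in left], [y for _, y in left], used),
            build_tree([x for x, _ in right], [y for _, y in right], used))

# Toy example: neurons 0 and 2 are constant (redundant), neuron 1 separates
# the two classes, so only neuron 1 should survive the pruning step.
acts = [[0.5, 0.1, 0.9], [0.5, 0.2, 0.9], [0.5, 0.8, 0.9], [0.5, 0.9, 0.9]]
labels = ["A", "A", "B", "B"]
used = set()
build_tree(acts, labels, used)
print(used)  # neurons kept; the others would be deleted from the network
```

In this sketch the activation matrix would come from the trained initial network evaluated on its correctly classified training samples, and the neurons absent from `used` are the redundant units the method removes.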
