Proceedings of the KIEE Conference (The Korean Institute of Electrical Engineers, Conference Proceedings)
- 2001.07d
- Pages 2726-2728
- 2001
Optimal Synthesis Method for Binary Neural Network using NETLA
- Sung, Sang-Kyu (Dept. of Electrical Engineering, Dong-A University) ;
- Kim, Tae-Woo (Korea Electric Power Corporation, KEPCO) ;
- Park, Doo-Hwan (Dept. of Electrical Engineering, Dong-A University) ;
- Jo, Hyun-Woo (Korea Electric Power Corporation, KEPCO) ;
- Ha, Hong-Gon (Dept. of Electronic Engineering, Dong-Eui University) ;
- Lee, Joon-Tark (Dept. of Electrical Engineering, Dong-A University)
- Published: 2001.07.18
Abstract
This paper describes an optimal synthesis method for binary neural networks (BNN), applied to the approximation of a circular region, using a newly proposed learning algorithm [7]. Our objective is to minimize the number of connections and hidden-layer neurons by using a Newly Expanded and Truncated Learning Algorithm (NETLA) for the multilayer BNN. The synthesis method in the NETLA is based on the extension principle of Expanded and Truncated Learning (ETL) and on the Expanded Sum of Products (ESP), one of the Boolean expression techniques, and it can optimize a given BNN in binary space without the iterative training required by the conventional Error Back Propagation (EBP) algorithm [6]. If only the complete sets of true and false patterns are given, the connection weights and threshold values can be determined immediately by the optimal synthesis method of the NETLA, without any tedious learning. Furthermore, the number of required hidden-layer neurons can be reduced and fast learning of the BNN can be realized. The superiority of the NETLA over other algorithms was demonstrated on the approximation problem of one circular region.
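As a rough illustration of how a network of hard-limit (binary step) neurons can represent a circular region with directly assigned weights and thresholds, the sketch below intersects eight half-planes with an AND-style output neuron. This is a hand-constructed example, not the NETLA/ESP synthesis of the paper; the eight-half-plane decomposition, the real-valued (x, y) inputs (instead of the binary-coded patterns used in the paper), and all weights and thresholds are illustrative assumptions.

```python
import numpy as np

def hard_limit(v):
    """Binary step activation: 1 if the net input is >= 0, else 0."""
    return (v >= 0).astype(int)

# Hypothetical construction (not the paper's synthesis method):
# approximate the disk x^2 + y^2 <= r^2 by the intersection of eight
# half-planes. Each hidden neuron encodes one half-plane; the output
# neuron fires only when all hidden neurons fire (an AND gate), so the
# weights and thresholds are written down directly, with no iterative
# training.
r = 1.0
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
W_hidden = -np.stack([np.cos(angles), np.sin(angles)], axis=1)  # shape (8, 2)
b_hidden = np.full(8, r)        # neuron k fires when x . n_k <= r
w_out = np.ones(8)
b_out = -7.5                    # output fires only if all 8 hidden neurons are 1

def bnn(points):
    """Forward pass of the two-layer hard-limit network on (N, 2) points."""
    h = hard_limit(points @ W_hidden.T + b_hidden)
    return hard_limit(h @ w_out + b_out)

# Check on a grid: the circumscribed octagon closely approximates the disk.
xs, ys = np.meshgrid(np.linspace(-1.5, 1.5, 61), np.linspace(-1.5, 1.5, 61))
pts = np.stack([xs.ravel(), ys.ravel()], axis=1)
pred = bnn(pts)
true = (pts ** 2).sum(axis=1) <= r ** 2
print("agreement with the exact disk: %.1f%%" % (100 * (pred == true).mean()))
```

The point of the sketch is only that a two-layer network of threshold neurons can realize such a region once suitable weights and thresholds are known; how NETLA derives a minimal such network from the true and false patterns is described in the paper itself.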
Keywords