Title :
Exceptional Reducibility of Complex-Valued Neural Networks
Author :
Kobayashi, Masaki
Author_Institution :
Interdiscipl. Grad. Sch. of Med. & Eng., Univ. of Yamanashi, Kofu, Japan
Date :
1 July 2010
Abstract :
A neural network is referred to as minimal if the number of hidden neurons cannot be reduced without changing the input-output map. The condition under which the number of hidden neurons can be reduced is referred to as reducibility. Real-valued neural networks have only three simple types of reducibility, and these extend naturally to complex-valued neural networks without bias terms in the hidden neurons. General complex-valued neural networks, however, have another type of reducibility, referred to herein as exceptional reducibility. In this paper, this exceptional reducibility is presented, and a method by which to minimize complex-valued neural networks is proposed.
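As an illustration of the reducibility notion from the abstract, the following sketch shows one of the simple types known for real-valued networks: when two hidden neurons share identical input weights and biases, they can be merged by summing their output weights without altering the input-output map. The network shape, weights, and the `tanh_net` helper are hypothetical choices for illustration, not taken from the paper, which concerns the complex-valued case.

```python
import math

def tanh_net(x, W, b, v):
    # One-hidden-layer real-valued network: y = sum_j v_j * tanh(W_j * x + b_j).
    return sum(vj * math.tanh(wj * x + bj) for wj, bj, vj in zip(W, b, v))

# Hidden neurons 0 and 1 are duplicates (same input weight and bias),
# so this 3-neuron network is reducible.
W = [0.7, 0.7, -1.2]
b = [0.1, 0.1, 0.3]
v = [0.5, 0.25, 0.9]

# Reduced network: merge the duplicates by summing their output weights.
W_red = [0.7, -1.2]
b_red = [0.1, 0.3]
v_red = [0.5 + 0.25, 0.9]

# The input-output map is preserved at every input.
for x in [-2.0, 0.0, 1.5]:
    assert abs(tanh_net(x, W, b, v) - tanh_net(x, W_red, b_red, v_red)) < 1e-12
```

A minimal network is one on which no such reduction (of any type) can be applied; the paper's contribution is an additional, exceptional type that arises only for general complex-valued networks.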
Keywords :
neural nets; complex-valued neural networks; input-output map; learning processes; real-valued neural networks; Complex-valued neural networks; minimality; reducibility; rotation-equivalence; Algorithms; Humans; Neural Networks (Computer); Neurons; Signal Processing, Computer-Assisted;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2010.2048040