Title :
Learning of fast transforms and spectral domain neural computing
Author :
Ersoy, Okan K. ; Chen, Chuan-hsing
Author_Institution :
Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA
Date :
5/1/1989
Abstract :
The interaction between neural networks and fast transforms is examined. It is shown that the development, discovery, and study of transforms can be carried out efficiently through the use of learning algorithms employed in neural networks. In turn, these transforms can be used for a number of tasks in neural networks, such as network reduction and simplification, fast convergence during learning, fast memory retrieval, reduced cost and increased speed of implementation, feature extraction, invariance to distortions, better generalization, and increased quality of performance in the presence of noise and incomplete knowledge. Learning with the unconstrained part of the neural network, which is of reduced size or has a minimized number of interconnections, is performed in the spectral domain only, thereby considerably easing the problems of convergence and implementation. The techniques described can be especially useful in dynamic neural networks.
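The following is a minimal sketch, not the authors' implementation, of the general idea of spectral-domain learning described in the abstract: a fixed fast transform (here a Walsh-Hadamard transform, chosen only for illustration) acts as a front end, and a small set of weights is learned by gradient descent in the spectral domain rather than over the full input. The network size, learning rate, toy data, and the choice of transform are all assumptions made for this example.

```python
# Illustrative sketch of spectral-domain learning with a fixed fast transform
# front end. All sizes, the toy task, and the transform choice are assumptions.
import numpy as np

def hadamard(n):
    """Orthonormal Hadamard matrix of size n (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(n)

rng = np.random.default_rng(0)
n, k, m = 16, 4, 200          # input size, spectral coefficients kept, samples
H = hadamard(n)

# Toy data whose targets depend on only a few spectral components of the input.
X = rng.standard_normal((m, n))
true_w = rng.standard_normal(k)
S = X @ H.T                    # fixed spectral-domain representation (not learned)
y = S[:, :k] @ true_w + 0.01 * rng.standard_normal(m)

# Learn only the reduced spectral-domain weights (k parameters instead of n).
w = np.zeros(k)
lr = 0.1
for _ in range(500):
    pred = S[:, :k] @ w
    grad = S[:, :k].T @ (pred - y) / m
    w -= lr * grad

print("max |w - true_w|:", np.abs(w - true_w).max())
```

Because the fast transform is fixed, only the k spectral-domain weights are trained, which is one way to read the abstract's point about reduced network size and easier convergence; the actual architectures and learning rules are given in the paper itself.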
Keywords :
learning systems; neural nets; dynamic neural networks; fast convergence; fast memory retrieval; fast transforms; feature extraction; generalization; increased quality of performance; increased speed; invariance to distortions; learning algorithms; minimized number of interconnections; network reduction; network simplification; neural networks; reduced cost; spectral domain; spectral domain neural computing; Artificial neural networks; Biological neural networks; Computer networks; Convergence; Costs; Feature extraction; Image coding; Image reconstruction; Neural networks; Noise reduction;
Journal_Title :
IEEE Transactions on Circuits and Systems