Title :
Multi-scale dynamic neural net architectures
Author :
Atlas, Les ; Marks, Robert, II ; Donnell, Mark ; Taylor, James
Author_Institution :
Dept. of Electr. Eng., Washington Univ., Seattle, WA, USA
Abstract :
The design of specialized trainable neural network architectures for temporal problems is described. Multilayer extensions of previous dynamic neural net architectures are considered. Two of the key attributes of these architectures are smoothing and decimation between layers. An analysis of the number of parameters (weights) to be estimated suggests a massive reduction in the training data needed by multiscale topologies for networks with large temporal input windows. The standard back-propagation training rules are modified to allow for smoothing between layers, and preliminary simulation results for these new rules are encouraging. For example, a binary problem with an input of size 32 converged in three iterations with smoothing but never converged without it.
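The two inter-layer operations named in the abstract, smoothing and decimation, can be sketched in plain Python. This is a hedged illustration, not the paper's implementation: the moving-average window, decimation factor, and layer count are illustrative assumptions, and the actual architecture interleaves these operations with trainable weights.

```python
# Illustrative sketch of smoothing and decimation between layers
# (window, factor, and layer count are assumptions, not from the paper).

def smooth(x, window=2):
    """Moving average over a sliding window of length `window`."""
    return [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]

def decimate(x, factor=2):
    """Subsample: keep every `factor`-th sample."""
    return x[::factor]

def multiscale(x, layers=3, window=2, factor=2):
    """Build one smoothed-and-decimated representation per layer,
    shrinking the temporal window at each scale."""
    scales = []
    for _ in range(layers):
        x = decimate(smooth(x, window), factor)
        scales.append(x)
    return scales
```

For an input of size 32 (the abstract's example), successive layers here see roughly 16, 8, and 4 samples, which is the source of the claimed reduction in parameters to estimate for large temporal input windows.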
Keywords :
neural nets; back-propagation training rules; decimation; dynamic neural net architectures; iterations; large temporal input windows; multilayer extensions; multiscale topologies; smoothing; temporal problems; trainable neural network; Image recognition; Interactive systems; Laboratories; Network topology; Neural networks; Neurons; Pattern classification; Smoothing methods; Sonar; Speech;
Conference_Title :
Communications, Computers and Signal Processing, 1989. Conference Proceedings, IEEE Pacific Rim Conference on
Conference_Location :
Victoria, BC, Canada
DOI :
10.1109/PACRIM.1989.48413