Title :
Multi-sigmoidal neural networks and back-propagation
Author :
Drakopoulos, J.A.
Author_Institution :
Stanford Univ., CA, USA
Abstract :
A new neural network architecture is introduced, based on nonmonotonic activation functions that are created dynamically and modeled as a set of sigmoidal functions. A modification of the backpropagation algorithm is presented that is capable of learning both the weights and the unit activation functions themselves. The new architecture reveals a tradeoff between capturing interactions among the inputs via hidden units and capturing them via nonmonotonic activation functions. In the classification problems examined, the new architecture resulted in very shallow networks with fewer degrees of freedom than the corresponding backpropagation networks. These networks converged very quickly to a solution and produced optimal or nearly optimal sigmoidal configurations.
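Illustrative_Sketch :
A minimal Python sketch of the mechanism the abstract describes, not the paper's own implementation. It assumes each multi-sigmoidal unit computes phi(x) = sum_k a_k * sigmoid(b_k * x + c_k), so that amplitudes a_k of mixed sign make the activation nonmonotonic, and that backpropagation updates a_k, b_k, c_k alongside the input weights. The number of sigmoids K, this parametrization, and the XOR demonstration are all assumptions chosen for illustration.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
K = 3                              # sigmoids per unit (assumed, not from the paper)
w = rng.normal(size=2)             # input weights of a single output unit
a = rng.normal(size=K)             # amplitudes; mixed signs make phi nonmonotonic
b = rng.normal(size=K)             # slopes
c = rng.normal(size=K)             # offsets
lr = 0.5

# XOR: not solvable by one monotonic-sigmoid unit, but a single
# multi-sigmoidal unit can fit it with no hidden layer, illustrating
# the hidden-units-versus-nonmonotonicity tradeoff from the abstract.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])

for _ in range(20000):
    x = X @ w                          # pre-activations, shape (4,)
    s = sigmoid(np.outer(x, b) + c)    # (4, K)
    y = s @ a                          # unit outputs phi(x)
    err = y - t                        # gradient of 0.5*sum(err^2) w.r.t. y

    ds = s * (1.0 - s)                 # derivative of each sigmoid
    grad_a = s.T @ err                 # activation parameters get gradients too
    grad_b = a * ((ds * x[:, None]).T @ err)
    grad_c = a * (ds.T @ err)
    grad_x = err * (ds @ (a * b))      # back through phi to the pre-activation
    grad_w = X.T @ grad_x

    a -= lr * grad_a; b -= lr * grad_b
    c -= lr * grad_c; w -= lr * grad_w

print(np.round(sigmoid(np.outer(X @ w, b) + c) @ a, 2))
# expected to approach [0. 1. 1. 0.] (convergence depends on initialization)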
Keywords :
backpropagation; convergence; feedforward neural nets; neural net architecture; nonmonotonic reasoning; pattern classification; transfer functions; backpropagation algorithm; classification problems; convergence; degrees of freedom; dynamically created nonmonotonic activation functions; hidden units; input interaction capture; multi-sigmoidal neural network architecture; node weight learning; optimal sigmoidal configurations; shallow networks; sigmoidal functions; unit activation function learning;
Conference_Titel :
Fourth International Conference on Artificial Neural Networks, 1995
Conference_Location :
Cambridge
Print_ISBN :
0-85296-641-5
DOI :
10.1049/cp:19950546