• DocumentCode
    2623398
  • Title
    Weight update in back-propagation neural networks: the role of activation functions
  • Author
    Alippi, Cesare

  • Author_Institution
    Dipartimento di Elettronica, Politecnico di Milano, Italy
  • fYear
    1991
  • fDate
    18-21 Nov 1991
  • Firstpage
    560
  • Abstract
    The speed of the learning phase in a classic back-propagation neural network depends both on the learning rates and on the choice of activation mappers. These relationships, implicit in the Hebbian learning rule, are analyzed analytically, focusing on the generalized delta rule. A theorem bounds the step taken along the gradient-descent direction as a function of the chosen activation function and the learning rate. These results explain the differing requirements on learning parameters, on the hardware representation of weights, and on the behavior of activation-function features with respect to learning speed. To obtain a significant generalization, the results are applied to a large family of activation functions comprising the most common activation mappers.
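    The generalized delta rule summarized in the abstract can be sketched for a single sigmoid neuron; this is a minimal illustration (not the paper's derivation), where the names `delta_rule_step` and `eta` are illustrative. It makes visible why the gradient step is bounded: for the logistic sigmoid, the activation derivative f'(net) = y(1 - y) never exceeds 1/4, so the update magnitude is capped by the learning rate times that derivative bound.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_deriv(y):
        # Derivative of the logistic sigmoid expressed via its output:
        # f'(x) = f(x) * (1 - f(x)); its maximum is 1/4, attained at y = 1/2.
        return y * (1.0 - y)

    def delta_rule_step(w, x, target, eta):
        """One generalized-delta-rule update for a single sigmoid neuron.

        The step along the gradient is eta * (target - y) * f'(net) * x,
        so its size is limited jointly by the learning rate eta and by
        the activation derivative, as the abstract's theorem formalizes.
        """
        y = sigmoid(w @ x)                     # forward pass
        delta = (target - y) * sigmoid_deriv(y)  # local error signal
        return w + eta * delta * x, y

    w = np.array([0.5, -0.3])
    x = np.array([1.0, 2.0])
    w_new, y = delta_rule_step(w, x, target=1.0, eta=0.5)
    ```

    The same structure applies per weight in a full back-propagation network; only the error signal `delta` changes for hidden layers.
    
    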
  • Keywords
    neural nets; Hebbian learning rule; activation functions; activation mappers; back-propagation neural networks; generalized delta rule; gradient descent direction; weight update; Computer science; Educational institutions; Error correction; Genetic expression; Hardware; Hebbian theory; Intelligent networks; Least squares methods; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1991 IEEE International Joint Conference on Neural Networks
  • Print_ISBN
    0-7803-0227-3
  • Type
    conf
  • DOI
    10.1109/IJCNN.1991.170459
  • Filename
    170459