  • DocumentCode
    288372
  • Title
    An update function that speeds up backpropagation learning
  • Author
    El-Deredy, W.; Branston, N.M.
  • Author_Institution
    Dept. of Neurological Surg., Inst. of Neurology, London, UK
  • Volume
    1
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    477
  • Abstract
    We consider a modification of the backpropagation (BP) learning algorithm in which a linear function, directly proportional to the deviation between target values and actual values at the output, is propagated backwards in place of the original nonlinear function. The new algorithm is tested on the odd/even parity function for orders between 4 and 7, and on high-dimensional (180) data derived from NMR spectroscopy of animal tumours. Results suggest that with the linear function the network converges faster and is more likely to escape from local minima than with original BP.
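The modification described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: in standard BP with a sigmoid output unit, the output-layer error term is the deviation (t − y) scaled by the sigmoid derivative y(1 − y); the paper's update instead propagates the raw deviation (t − y) backwards. The example below compares the two error terms for a hypothetical set of output units.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Hypothetical output-layer state: net inputs, activations, binary targets.
net = rng.normal(size=5)
y = sigmoid(net)
t = rng.integers(0, 2, size=5).astype(float)

# Standard BP output error term: deviation scaled by the sigmoid derivative.
delta_bp = (t - y) * y * (1.0 - y)

# Linear update function: propagate the raw deviation (t - y) backwards,
# dropping the nonlinear derivative factor.
delta_linear = t - y

# Since sigmoid'(net) = y(1 - y) <= 0.25, the linear error term is never
# smaller in magnitude than the standard one, so it does not vanish at
# saturated units -- the mechanism the abstract credits for faster
# convergence and fewer local minima.
print(np.all(np.abs(delta_linear) >= np.abs(delta_bp)))  # True
```

The comparison at the end holds for any sigmoid unit, since the derivative factor y(1 − y) is bounded above by 0.25 and so can only shrink the propagated error.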
  • Keywords
    backpropagation; neural nets; NMR spectroscopy; animal tumours; backpropagation learning; linear function; local minima; neural net; odd/even parity function; update function; Animals; Artificial neural networks; Backpropagation algorithms; Nervous system; Nuclear magnetic resonance; Pattern recognition; Spectroscopy; Supervised learning; Testing; Tumors
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374209
  • Filename
    374209