• DocumentCode
    3412976
  • Title
    Sobolev Gradients and Neural Networks
  • Author
    Bastian, Michael R.; Gunther, Jacob H.; Moon, Todd K.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Utah State Univ., Logan, UT
  • fYear
    2008
  • fDate
    March 31 2008-April 4 2008
  • Firstpage
    2085
  • Lastpage
    2088
  • Abstract
    Using a formulation similar to a Sobolev gradient for the natural gradient, a new training algorithm has been developed that converges faster, requires fewer additional parameters, and has smaller storage requirements, all without overtraining to the training set. Simulation results demonstrate these improvements on an applicable problem.
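  • Note
    As background for the abstract, here is a minimal sketch of the general Sobolev gradient idea (not the authors' specific block-diagonal algorithm from the paper): the raw Euclidean gradient is smoothed by solving a linear system involving a discrete Laplacian before each descent step. The function names, the 1-D Laplacian, and the demo objective are all illustrative assumptions.

    ```python
    import numpy as np

    def sobolev_gradient(grad, lam=1.0):
        """Smooth a raw gradient g by solving (I + lam * L) g_s = g,
        where L is a discrete 1-D Laplacian with Dirichlet boundaries.
        This is one common realization of a Sobolev (H^1) gradient."""
        n = grad.size
        L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        return np.linalg.solve(np.eye(n) + lam * L, grad)

    # Demo (assumed objective): minimize f(x) = 0.5 * ||x - target||^2
    # using the smoothed gradient in place of the Euclidean one.
    target = np.linspace(0.0, 1.0, 8)
    x = np.zeros(8)
    for _ in range(200):
        g = x - target                      # Euclidean gradient of f
        x -= 0.5 * sobolev_gradient(g, lam=0.1)
    assert np.allclose(x, target, atol=1e-3)
    ```

    Because (I + lam * L) is symmetric positive definite, the solve is well posed; smoothing the gradient this way is what yields the faster, better-conditioned convergence the abstract refers to.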
  • Keywords
    Newton method; error analysis; gradient methods; learning (artificial intelligence); matrix algebra; multilayer perceptrons; Sobolev gradient formulation; block-diagonal matrix; error function; neural network training; backpropagation algorithms; computational modeling; feedforward neural networks; Jacobian matrices; neural networks; partial differential equations; signal processing algorithms; algorithms
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2008
  • Conference_Location
    Las Vegas, NV
  • ISSN
    1520-6149
  • Print_ISBN
    978-1-4244-1483-3
  • Electronic_ISBN
    1520-6149
  • Type
    conf
  • DOI
    10.1109/ICASSP.2008.4518052
  • Filename
    4518052