• DocumentCode
    314398
  • Title
    Superior training of artificial neural networks using weight-space partitioning
  • Author
    Gupta, Hoshin V.; Hsu, Kuo-lin; Sorooshian, Soroosh

  • Author_Institution
    Dept. of Hydrology & Water Resources, Arizona Univ., Tucson, AZ, USA
  • Volume
    3
  • fYear
    1997
  • fDate
    9-12 Jun 1997
  • Firstpage
    1919
  • Abstract
    Linear least squares simplex (LLSSIM) is a new algorithm for batch training of three-layer feedforward artificial neural networks (ANNs), based on a partitioning of the weight space. The input-hidden weights are trained using a “multi-start downhill simplex” global search algorithm, and the hidden-output weights are estimated using “conditional linear least squares”. Monte Carlo testing shows that LLSSIM provides globally superior weight estimates with significantly fewer function evaluations than the conventional backpropagation, adaptive backpropagation, and conjugate gradient strategies.
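    The partitioning idea in the abstract can be sketched as follows. Because the hidden-output weights of a three-layer network enter the output linearly, they can be solved in closed form by linear least squares once the input-hidden weights are fixed, so the global search only has to explore the input-hidden weights. This is a minimal illustration, not the paper's implementation: it assumes sigmoid hidden units and a linear output, and uses plain random restarts as a hypothetical stand-in for the multi-start downhill simplex search.

    ```python
    import numpy as np

    def conditional_sse(W_ih, X, y):
        """Given fixed input-hidden weights W_ih, compute the hidden-layer
        outputs and solve for the hidden-output weights by conditional
        linear least squares; return the sum of squared errors and the
        fitted output weights."""
        H = 1.0 / (1.0 + np.exp(-X @ W_ih))        # sigmoid hidden layer
        H = np.hstack([H, np.ones((len(H), 1))])   # bias column for the output layer
        w_ho, *_ = np.linalg.lstsq(H, y, rcond=None)
        resid = y - H @ w_ho
        return float(resid @ resid), w_ho

    def multi_start_search(X, y, n_hidden, n_starts=20, seed=0):
        """Hypothetical stand-in for the paper's multi-start downhill
        simplex: random restarts over the input-hidden weights, keeping
        whichever start gives the lowest conditional SSE."""
        rng = np.random.default_rng(seed)
        best = (np.inf, None, None)
        for _ in range(n_starts):
            W_ih = rng.normal(size=(X.shape[1], n_hidden))
            sse, w_ho = conditional_sse(W_ih, X, y)
            if sse < best[0]:
                best = (sse, W_ih, w_ho)
        return best
    ```

    The design point is that each candidate set of input-hidden weights is scored with its *optimal* hidden-output weights, so the search space the global optimizer sees is only the nonlinear half of the network.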
  • Keywords
    feedforward neural nets; learning (artificial intelligence); least squares approximations; optimisation; search problems; batch training; conditional linear least squares; feedforward neural networks; global search algorithm; input-hidden weights; linear least squares simplex; weight-space partitioning; Artificial neural networks; Backpropagation algorithms; Convergence; Joining processes; Least squares methods; Logistics; Neurons; Partitioning algorithms; Testing; Transfer functions
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Conference on Neural Networks, 1997
  • Conference_Location
    Houston, TX
  • Print_ISBN
    0-7803-4122-8
  • Type
    conf

  • DOI
    10.1109/ICNN.1997.614192
  • Filename
    614192