• DocumentCode
    61908
  • Title
    A Hybrid Constructive Algorithm for Single-Layer Feedforward Networks Learning
  • Author
    Wu, Xing ; Rozycki, Pawel ; Wilamowski, Bogdan M.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Auburn Univ., Auburn, AL, USA
  • Volume
    26
  • Issue
    8
  • fYear
    2015
  • fDate
    Aug. 2015
  • Firstpage
    1659
  • Lastpage
    1668
  • Abstract
    Single-layer feedforward networks (SLFNs) have been proven to be universal approximators when all of their parameters are adjustable, and they are widely used in classification and regression problems. SLFN learning involves two tasks: determining the network size and training the parameters. Most current algorithms do not handle both tasks satisfactorily. Some focus on construction and tune only part of the parameters, which may fail to yield a compact network. Other gradient-based optimization algorithms focus on parameter tuning, while the network size must be preset by the user, so a trial-and-error search for the optimal network size is required; because the results of one trial cannot be reused in another, this search is computationally expensive. In this paper, a hybrid constructive (HC) algorithm is proposed for SLFN learning that can train all the parameters and determine the network size simultaneously. First, by combining the Levenberg-Marquardt algorithm with the least-square method, a hybrid algorithm is presented for training an SLFN of fixed size. Then, building on this hybrid algorithm, an incremental constructive scheme is proposed: a new randomly initialized neuron is added each time the training becomes entrapped in a local minimum. Because training continues from the previous results after new neurons are added, the proposed HC algorithm works efficiently. Several practical problems were used for comparison with other popular algorithms. The experimental results demonstrate that the HC algorithm is more efficient than optimization methods combined with trial and error, and achieves a much more compact SLFN than the construction algorithms.
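    The constructive scheme sketched in the abstract can be illustrated in miniature. The sketch below is not the paper's implementation: it keeps the two ideas the abstract states (output weights solved by least squares; a new randomly initialized neuron added when training stalls), but substitutes plain gradient descent for the Levenberg-Marquardt step and a simple loss-plateau test for the local-minimum detection. All function and parameter names (`train_hc`, `max_neurons`, `tol`, etc.) are invented for illustration.

    ```python
    import numpy as np

    def train_hc(X, y, max_neurons=10, epochs=150, lr=0.05, tol=1e-5, seed=0):
        """Toy incremental-constructive trainer for a tanh SLFN.

        Each round: tune hidden weights by gradient descent (a stand-in for
        the paper's Levenberg-Marquardt step) while solving the output
        weights by least squares; when the loss stops improving, add a
        randomly initialized hidden neuron and continue from the current
        weights, up to max_neurons.
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W = 0.5 * rng.standard_normal((1, d))   # hidden weights, one row per neuron
        b = 0.5 * rng.standard_normal(1)        # hidden biases
        prev_loss = np.inf
        while True:
            for _ in range(epochs):
                H = np.tanh(X @ W.T + b)                        # hidden outputs, n x m
                beta, *_ = np.linalg.lstsq(H, y, rcond=None)    # LS output weights
                r = H @ beta - y                                # residuals
                # gradient of 0.5*||r||^2 w.r.t. hidden weights and biases
                G = (r[:, None] * beta[None, :]) * (1.0 - H**2)
                W -= lr * (G.T @ X) / n
                b -= lr * G.sum(axis=0) / n
            loss = float(np.mean(r**2))
            if prev_loss - loss < tol:          # training has stalled
                if W.shape[0] >= max_neurons:
                    return W, b, beta, loss
                # add a randomly initialized neuron; earlier results are kept
                W = np.vstack([W, 0.5 * rng.standard_normal((1, d))])
                b = np.append(b, 0.5 * rng.standard_normal())
            prev_loss = loss
    ```

    The point of the structure, mirroring the abstract, is that each growth step resumes from the already-trained weights rather than restarting, which is what makes the constructive search cheaper than retraining networks of different preset sizes from scratch.
    
    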
  • Keywords
    feedforward neural nets; learning (artificial intelligence); least squares approximations; optimisation; HC algorithm; Levenberg-Marquardt algorithm; SLFN learning; compact network; fixed network size; hybrid constructive algorithm; incremental constructive scheme; least-square method; local minima; optimal network size search; parameter training; randomly initialized neuron; single-layer feedforward network learning; trial-and-error approach; universal approximator; Approximation algorithms; Approximation methods; Jacobian matrices; Neurons; Optimization; Signal processing algorithms; Training; Hybrid constructive (HC) algorithm; Levenberg-Marquardt (LM) algorithm; least-square (LS) methods; single-layer feedforward networks (SLFNs)
  • fLanguage
    English
  • Journal_Title
    Neural Networks and Learning Systems, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    2162-237X
  • Type
    jour
  • DOI
    10.1109/TNNLS.2014.2350957
  • Filename
    6894562