• DocumentCode
    801028
  • Title
    A framework for improved training of Sigma-Pi networks
  • Author
    Heywood, Malcolm; Noakes, Peter
  • Author_Institution
    Dept. of Electron. Syst. Eng., Essex Univ., Colchester, UK
  • Volume
    6
  • Issue
    4
  • fYear
    1995
  • fDate
    7/1/1995 12:00:00 AM
  • Firstpage
    893
  • Lastpage
    903
  • Abstract
    This paper proposes and demonstrates a framework for Sigma-Pi networks in which the combinatorial increase in product terms is avoided. This is achieved by implementing only a subset of the possible product terms (sub-net Sigma-Pi). A dynamic weight pruning algorithm enables redundant weights to be removed and replaced during the learning process, permitting access to a larger weight space than that employed at network initialization. More than one learning rate is applied to ensure that the inclusion of higher-order descriptors does not result in over-description of the training set (memorization). The framework is tested on a problem requiring significant generalization ability, and the performance of the resulting sub-net Sigma-Pi network is compared to that returned by optimal multilayer perceptrons and general Sigma-Pi solutions.
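    A sub-net Sigma-Pi unit as described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the unit computes a sigmoid of a weighted sum of input products, but realizes only a chosen subset of the possible product terms rather than all of them, avoiding the combinatorial growth in higher-order weights.

    ```python
    # Hypothetical sketch of a sub-net Sigma-Pi unit (illustrative only):
    # y = sigmoid( sum_j w_j * prod_{i in S_j} x_i ),
    # where only a subset of the possible index sets S_j is implemented.
    import itertools
    import math

    def sigma_pi_output(x, terms, weights):
        """Sigmoid of a weighted sum of input products over the given term subset."""
        s = sum(w * math.prod(x[i] for i in term)
                for term, w in zip(terms, weights))
        return 1.0 / (1.0 + math.exp(-s))

    # A full second-order Sigma-Pi over 4 inputs needs C(4,1) + C(4,2) = 10
    # product terms; a sub-net implements only a subset, e.g. 4 of them.
    x = [0.5, -1.0, 0.25, 0.8]
    all_terms = [t for r in (1, 2) for t in itertools.combinations(range(4), r)]
    subset = all_terms[:4]            # restricted term set (the "sub-net")
    weights = [0.3, -0.2, 0.5, 0.1]   # illustrative weight values
    y = sigma_pi_output(x, subset, weights)
    ```

    The dynamic pruning step described in the abstract would, in such a sketch, periodically drop the term with the smallest-magnitude weight and replace it with a previously unused term from the full set, giving the network access to a larger weight space than it started with.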
  • Keywords
    feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); dynamic weight pruning algorithm; generalization ability; higher order descriptors; improved training; learning process; learning rate; optimal multi-layer perceptrons; redundant weights; sub-net Sigma-Pi network; Backpropagation algorithms; Detectors; Heuristic algorithms; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural networks; Performance gain; Testing
  • fLanguage
    English
  • Journal_Title
    Neural Networks, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.392251
  • Filename
    392251