  • DocumentCode
    768172
  • Title
    Learning without local minima in radial basis function networks
  • Author
    Bianchini, Monica; Frasconi, Paolo; Gori, Marco
  • Author_Institution
    Dipartimento di Sistemi e Inf., Univ. di Firenze, Italy
  • Volume
    6
  • Issue
    3
  • fYear
    1995
  • fDate
    5/1/1995
  • Firstpage
    749
  • Lastpage
    756
  • Abstract
    Learning from examples plays a central role in artificial neural networks. The success of many learning schemes is not guaranteed, however, since algorithms like backpropagation may get stuck in local minima, thus providing suboptimal solutions. For feedforward networks, optimal learning can be achieved provided that certain conditions on the network and the learning environment are met. This principle is investigated for the case of networks using radial basis functions (RBF). It is assumed that the patterns of the learning environment are separable by hyperspheres. In that case, we prove that the attached cost function is local minima free with respect to all the weights. This provides us with some theoretical foundations for a massive application of RBF in pattern recognition.
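    Illustrative note (not part of the original record): a minimal sketch of the kind of model the abstract refers to, assuming Gaussian radial basis units and a sum-of-squares cost; the centers, widths, and weights below are made-up toy values, not taken from the paper.

    # Sketch of a Gaussian RBF network and a quadratic cost of the kind
    # whose local minima the paper analyzes (assumed form, illustrative only).
    import numpy as np

    def rbf_output(x, centers, widths, weights):
        # Gaussian radial activations, one per center (assumed basis shape).
        acts = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2.0 * widths ** 2))
        # Linear output layer combining the activations.
        return weights @ acts

    def quadratic_cost(X, targets, centers, widths, weights):
        # Sum-of-squares error over the learning environment.
        outputs = np.array([rbf_output(x, centers, widths, weights) for x in X])
        return 0.5 * np.sum((outputs - targets) ** 2)

    # Toy usage: two centers in 2-D and two patterns.
    centers = np.array([[0.0, 0.0], [1.0, 1.0]])
    widths = np.array([0.5, 0.5])
    weights = np.array([1.0, -1.0])
    X = np.array([[0.1, 0.0], [0.9, 1.1]])
    targets = np.array([1.0, -1.0])
    print(quadratic_cost(X, targets, centers, widths, weights))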
  • Keywords
    feedforward neural nets; learning by example; pattern recognition; artificial neural networks; cost function; example-based learning; hypersphere separability; local minima; pattern recognition; radial basis function networks; Artificial neural networks; Backpropagation algorithms; Cost function; Intelligent networks; Neural networks; Pattern analysis; Pattern recognition; Radial basis function networks; Rough surfaces; Surface roughness
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.377979
  • Filename
    377979