• DocumentCode
    296009
  • Title
    An inverse model learning algorithm using the hierarchical mixtures of experts
  • Author
    Satoshi Yamaguchi; Hidekiyo Itakura; Yoshikazu Nishikawa
  • Author_Institution
    Dept. of Comput. Sci., Chiba Inst. of Technol., Narashino, Japan
  • Volume
    5
  • fYear
    1995
  • fDate
    Nov/Dec 1995
  • Firstpage
    2738
  • Abstract
    A new neural network learning algorithm is proposed for inverse modeling. In the algorithm, a hierarchical mixture of experts (HME) serves as a forward model of the system. The algorithm is fundamentally based on the backpropagation procedure, and the update values for the network's synaptic weights are computed with the help of the HME. Most conventional learning algorithms estimate the neural network error using the Jacobian matrix of the system; our algorithm does not. As a result, it avoids the local-minimum problem that often arises in inverse model learning. (An illustrative sketch of this scheme follows the record below.)
  • Keywords
    Jacobian matrices; backpropagation; inverse problems; modelling; neural nets; Jacobian matrix; back propagation; hierarchical expert mixtures; inverse model learning algorithm; inverse modeling; network synaptic weights; neural networks; Biological system modeling; Computational biology; Computer science; Neurofeedback; Supervised learning
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 1995 IEEE International Conference on Neural Networks (ICNN '95)
  • Conference_Location
    Perth, WA, Australia
  • Print_ISBN
    0-7803-2768-3
  • Type
    conf
  • DOI
    10.1109/ICNN.1995.488163
  • Filename
    488163
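
The following is a minimal, hedged sketch of the forward-model idea summarized in the abstract: an inverse network is trained by passing its output through a frozen, pre-trained forward model and backpropagating the output error, so the system's Jacobian is never computed explicitly. This is not the authors' implementation; in particular, a small MLP stands in for the HME forward model, and `plant`, `forward_model`, and `inverse_model` are illustrative names for a toy scalar system.

# Illustrative sketch (not the paper's code): inverse model learning via a
# learned forward model, avoiding explicit use of the plant's Jacobian.
# Assumption: a small MLP approximates the role of the HME forward model.
import torch
import torch.nn as nn

torch.manual_seed(0)

def plant(u):
    # Toy nonlinear system standing in for the real plant: y = tanh(2u) + 0.3u
    return torch.tanh(2.0 * u) + 0.3 * u

# 1) Pre-train a forward model (stand-in for the HME) on (u, y) pairs.
forward_model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
u_train = torch.linspace(-2.0, 2.0, 200).unsqueeze(1)
y_train = plant(u_train)
opt_f = torch.optim.Adam(forward_model.parameters(), lr=1e-2)
for _ in range(2000):
    opt_f.zero_grad()
    loss = nn.functional.mse_loss(forward_model(u_train), y_train)
    loss.backward()
    opt_f.step()

# 2) Train the inverse model (desired output y_d -> control u). The output
#    error is backpropagated through the frozen forward model into the
#    inverse network, so no system Jacobian is required.
inverse_model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
for p in forward_model.parameters():
    p.requires_grad_(False)  # freeze the forward model
opt_i = torch.optim.Adam(inverse_model.parameters(), lr=1e-2)
y_desired = y_train.clone()
for _ in range(2000):
    opt_i.zero_grad()
    u_hat = inverse_model(y_desired)
    loss = nn.functional.mse_loss(forward_model(u_hat), y_desired)
    loss.backward()
    opt_i.step()

# Check against the real plant: plant(inverse(y_d)) should approximate y_d.
with torch.no_grad():
    err = nn.functional.mse_loss(plant(inverse_model(y_desired)), y_desired)
    print(float(err))

In this sketch the forward model plays the role the abstract assigns to the HME: it supplies the gradient pathway from the output error back to the inverse network's weights, which is what lets the scheme dispense with the system's Jacobian matrix.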