• DocumentCode
    313582
  • Title
    Regularization and error bars for the mixture of experts network

  • Author
    Ramamurti, Viswanath; Ghosh, Joydeep

  • Author_Institution
    Dept. of Electr. & Comput. Eng., Texas Univ., Austin, TX, USA
  • Volume
    1
  • fYear
    1997
  • fDate
    9-12 Jun 1997
  • Firstpage
    221
  • Abstract
    The mixture of experts architecture provides a modular approach to function approximation. Since different experts become attuned to different regions of the input space during training, and the data distribution may not be uniform, some experts may become overtrained while others remain undertrained, leading to poorer overall generalization. In this paper, we show how regularization applied to the gating network improves generalization performance during the course of training. Second, we address the issue of estimating error bars for network predictions, which is useful for gauging the range of probable network outputs for a given input, especially in performance-critical applications. Equations are derived to estimate the variance of the network output for a given input. Simulation results are presented in support of the proposed methods, which substantially improve the effectiveness of mixture of experts networks.
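    The following is a minimal sketch of the architecture the abstract describes: a softmax gating network over a set of experts, a hypothetical L2 penalty on the gating weights standing in for the paper's regularizer, and the standard mixture variance Var[y|x] = sum_i g_i(x) * (sigma_i^2 + (mu_i(x) - m(x))^2), with mixture mean m(x) = sum_i g_i(x) * mu_i(x), standing in for the paper's derived error-bar equations. All names and parameters below are illustrative, not taken from the paper.

      # Minimal mixture-of-experts sketch: softmax gating, linear experts,
      # and a fixed Gaussian noise variance per expert (an assumption; the
      # paper derives its own variance equations).
      import numpy as np

      def softmax(z):
          z = z - z.max(axis=-1, keepdims=True)  # stabilize the exponentials
          e = np.exp(z)
          return e / e.sum(axis=-1, keepdims=True)

      class MixtureOfExperts:
          def __init__(self, n_experts, dim, noise_var=0.1, seed=0):
              rng = np.random.default_rng(seed)
              self.Wg = rng.normal(scale=0.1, size=(dim, n_experts))  # gating net
              self.We = rng.normal(scale=0.1, size=(n_experts, dim))  # linear experts
              self.noise_var = noise_var  # assumed per-expert noise sigma_i^2

          def predict(self, x):
              g = softmax(x @ self.Wg)      # gating probabilities g_i(x)
              mu = x @ self.We.T            # expert outputs mu_i(x)
              mean = (g * mu).sum(axis=-1)  # mixture mean m(x)
              # Mixture variance: within-expert noise plus expert disagreement.
              var = (g * (self.noise_var + (mu - mean[..., None]) ** 2)).sum(axis=-1)
              return mean, var

          def gating_penalty(self, lam=1e-3):
              # Hypothetical L2 penalty on the gating weights, added to the
              # training loss so the gate stays soft and no single expert
              # dominates (and overtrains) on a region of the input space.
              return lam * float(np.sum(self.Wg ** 2))

    For a batch x of shape (n, dim), predict(x) returns a per-input mean and variance; a two-sigma error bar is then mean +/- 2 * sqrt(var).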
  • Keywords
    function approximation; neural net architecture; data distribution; error bars; mixture of experts network; modular approach; network prediction; performance-critical applications; probable network outputs; regularization
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Conference on Neural Networks (ICNN'97), 1997
  • Conference_Location
    Houston, TX
  • Print_ISBN
    0-7803-4122-8
  • Type
    conf
  • DOI
    10.1109/ICNN.1997.611668
  • Filename
    611668