• DocumentCode
    328281
  • Title
    Max-min propagation nets: learning by delta rule for the Chebyshev norm
  • Author
    Estevez, Pablo A.; Okabe, Yoichi
  • Author_Institution
    Res. Center for Adv. Sci. & Technol., Tokyo Univ., Japan
  • Volume
    1
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    524
  • Abstract
    This paper introduces max-min propagation nets, which are composed of multiple-input maximum (max) and minimum (min) operators in addition to linear weighted-sum units. A learning algorithm based on gradient descent is derived for these networks. Two error-measurement criteria are tested: the Chebyshev norm and the least squares norm. Weight-update rules for both criteria are deduced and implemented together with acceleration techniques to speed up convergence. A comparison of results on the parity problem is presented. (An illustrative sketch of the Chebyshev-norm delta rule follows this record.)
  • Keywords
    Chebyshev approximation; convergence of numerical methods; learning (artificial intelligence); minimax techniques; neural nets; Chebyshev norm; convergence; delta rule; error measurement; gradient descent; learning algorithm; least squares norm; linear weighted sum units; max-min propagation nets; maximum operators; minimum operators; parity problem; weight-update rules; Acceleration; Boolean functions; Convergence; Hardware; Least squares methods; Multi-layer neural network; Neural networks; Neurons; Testing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.713968
  • Filename
    713968
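
The abstract describes gradient-descent learning through max and min operators under two error criteria. Below is a minimal sketch of the Chebyshev-norm case, not the authors' published algorithm: it assumes a toy max-of-mins net with a single weight layer, and the names max_min_forward and delta_rule_chebyshev are invented for illustration. Under the Chebyshev (L-infinity) norm the error is E = max_p |t_p - o_p|, so the subgradient update touches only the worst pattern and, because max and min pass the gradient only through their winning input, only the weight on that pattern's winning path.

    import numpy as np

    rng = np.random.default_rng(0)

    def max_min_forward(x, W):
        # Weighted inputs s[j, i] = W[j, i] * x[i]; each hidden unit j is a
        # multiple-input min over its weighted inputs, and the output unit
        # is a max over the hidden units.  The winning indices are returned
        # because the gradient of max/min flows only through the winner.
        s = W * x
        h = s.min(axis=1)
        j = int(h.argmax())
        i = int(s[j].argmin())
        return h[j], j, i

    def delta_rule_chebyshev(X, T, W, lr=0.1, epochs=500):
        # Chebyshev-norm delta rule: each epoch, pick the pattern with the
        # largest absolute error (the L-infinity worst case) and update
        # only the weight on its max/min winning path.
        for _ in range(epochs):
            outs = [max_min_forward(x, W) for x in X]
            errs = np.array([t - o for (o, _, _), t in zip(outs, T)])
            p = int(np.abs(errs).argmax())
            if abs(errs[p]) < 1e-6:
                break
            _, j, i = outs[p]
            W[j, i] += lr * errs[p] * X[p, i]  # do/dW[j,i] = x[i] on the winning path
        return W

    # Toy usage: 2-bit parity (XOR) with bipolar coding, four min units.
    X = np.array([[-1., -1.], [-1., 1.], [1., -1.], [1., 1.]])
    T = np.array([-1., 1., 1., -1.])
    W = delta_rule_chebyshev(X, T, rng.normal(scale=0.5, size=(4, 2)))

Updating only the single worst pattern per step is the defining feature of minimizing the Chebyshev norm, in contrast to the least-squares rule, which would accumulate updates over all patterns.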