• DocumentCode
    288493
  • Title
    Some results on L1 convergence rate of RBF networks and kernel regression estimators
  • Author
    Krzyzak, Adam; Xu, Lei
  • Author_Institution
    Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
  • Volume
    2
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    1209
  • Abstract
    Rather than studying the L2 convergence rates of kernel regression estimators (KRE) and radial basis function (RBF) nets given in Xu-Krzyzak-Yuille (1992 & 1993), we study convergence properties of the mean integrated absolute error (MIAE) for KRE and RBF nets. It has been shown that the MIAE of KRE and RBF nets can converge to zero as the size of the networks and the size of the training sequence tend to ∞, and that the upper bound for the convergence rate of MIAE is O(n^{-αs/((2+s)(2α+d))}) for approximating Lipschitz functions.
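    As a rough illustration of the kernel regression estimators the abstract refers to, here is a minimal Nadaraya-Watson sketch in pure Python; the Gaussian kernel choice, the bandwidth value, and the function names are illustrative assumptions, not the construction analyzed in the paper:

    ```python
    import math

    def kernel_regression(x_train, y_train, x, bandwidth=0.5):
        """Nadaraya-Watson kernel regression estimate at a point x.

        m_n(x) = sum_i y_i K((x - x_i)/h) / sum_i K((x - x_i)/h),
        here with an (assumed) Gaussian kernel K(u) = exp(-u^2 / 2).
        """
        weights = [math.exp(-((x - xi) / bandwidth) ** 2 / 2) for xi in x_train]
        total = sum(weights)
        if total == 0:
            return 0.0  # no kernel mass near x; fall back to 0
        return sum(w * yi for w, yi in zip(weights, y_train)) / total

    # Noisy-free samples of a Lipschitz target f(x) = |x| on [-1, 1].
    xs = [i / 10 - 1 for i in range(21)]
    ys = [abs(x) for x in xs]
    est = kernel_regression(xs, ys, 0.0, bandwidth=0.1)
    print(round(est, 3))
    ```

    As the sample size n grows and the bandwidth shrinks appropriately, estimates of this form approach the target; the paper's contribution is bounding the rate of that convergence in the L1 (MIAE) sense rather than the more commonly studied L2 sense.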
  • Keywords
    convergence of numerical methods; estimation theory; feedforward neural nets; learning (artificial intelligence); statistical analysis; L1 convergence rate; approximating Lipschitz functions; kernel regression estimators; mean integrated absolute error; radial basis function nets; training sequence; upper bound; Convergence; Estimation error; Kernel; Neural networks; Probability; Radial basis function networks; Symmetric matrices; Upper bound;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374356
  • Filename
    374356