• DocumentCode
    2970967
  • Title
    The learning convergence of CMAC in cyclic learning
  • Author
    Shu Yao; Bo Zhang

  • Author_Institution
    Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
  • Volume
    3
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    2583
  • Abstract
    Discusses the learning convergence of the cerebellar model articulation controller (CMAC) in cyclic learning. The authors prove the following results. First, if the training samples are noiseless, the learning algorithm converges if and only if the learning rate is chosen from (0, 2). Second, when the training samples are noisy, the learning algorithm converges with probability one if the learning rate is dynamically decreased. Third, in the noisy case, with a small but fixed learning rate ε, the mean square error of the weight sequences generated by the CMAC learning algorithm is bounded by O(ε). Simulation experiments are carried out to verify these results.
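    A minimal sketch of the abstract's first result, assuming a hypothetical 1-D CMAC (not the paper's exact construction) in which each input activates c = 3 adjacent memory cells, the output is the sum of the active weights, and cyclic learning presents every noiseless sample once per epoch with an LMS-style update. With a learning rate inside (0, 2), the residual errors on the training samples should shrink toward zero:

    ```python
    import numpy as np

    def train_cmac(targets, active_idx, n_cells, lr=1.0, epochs=200):
        """Cyclic CMAC learning: each epoch presents all samples in order.

        active_idx[i] lists the memory cells activated by sample i; the
        correction for a sample is spread evenly over its active cells.
        """
        w = np.zeros(n_cells)
        for _ in range(epochs):
            for y, idx in zip(targets, active_idx):
                out = w[idx].sum()                    # CMAC output for this input
                w[idx] += lr * (y - out) / len(idx)   # LMS update on active cells only
        return w

    # Noiseless samples of f(x) = sin(x); inputs 0, 2, 4, 6 each activate
    # cells x .. x+2, so neighbouring inputs share one cell (generalization).
    inputs = [0, 2, 4, 6]
    active = [np.arange(x, x + 3) for x in inputs]
    targets = np.sin(inputs)

    w = train_cmac(targets, active, n_cells=9, lr=1.0)
    errors = [abs(w[idx].sum() - y) for y, idx in zip(targets, active)]
    ```

    With lr = 1 this update is exactly a Kaczmarz projection onto the constraint of the current sample, which converges for consistent (noiseless) systems; choosing lr outside (0, 2) overshoots each projection and the iteration no longer converges, matching the paper's condition.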
  • Keywords
    cerebellar model arithmetic computers; convergence; learning (artificial intelligence); probability; CMAC; cerebellar model articulation controller; cyclic learning; learning convergence; mean square error; training samples; weight sequences; Associative memory; Backpropagation algorithms; Computer science; Convergence; Mean square error methods; Neural networks; Noise generators; Testing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.714252
  • Filename
    714252