• DocumentCode
    423632
  • Title
    Active training on the CMAC neural network

  • Author
    Weruaga, Luis

  • Author_Institution
    Comm. for Sci. Visualisation, Austrian Acad. of Sci., Vienna, Austria
  • Volume
    2
  • fYear
    2004
  • fDate
    25-29 July 2004
  • Firstpage
    855
  • Abstract
    The CMAC neural network presents a rigid architecture for learning and generalizing simultaneously, a limitation that is accentuated with sparse or non-dense training datasets and is hardly solved by current training algorithms. This paper proposes a novel training algorithm that overcomes this tradeoff. The training mechanism is based on minimizing the stiffness energy of the output, a solution drawn from active deformable model theory. These ideas lead to a cell-interaction-based internal update mechanism that preserves the potential learning capabilities of the CMAC and delivers a higher degree of generalization than that embedded a priori in the CMAC architecture. The training mechanism is derived entirely from a rigorous theoretical study. This analysis is supported by comparative results on the inverse kinematics of a robotic arm, which demonstrate the excellent performance of the proposed active training.
  • Keywords
    cerebellar model arithmetic computers; generalisation (artificial intelligence); learning (artificial intelligence); manipulator kinematics; minimisation; CMAC architecture; CMAC learning; CMAC neural network; active deformable model theory; active training algorithms; cell interaction; internal update mechanism; inverse kinematics; learning architecture; nondense training datasets; robotic arm; sparse training datasets; stiffness energy minimization; training mechanism; Adaptive systems; Deformable models; Iterative algorithms; Kinematics; Least squares approximation; Neural networks; Nonlinear control systems; Performance analysis; Robots; Visualization
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2004 IEEE International Joint Conference on Neural Networks
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-8359-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2004.1380041
  • Filename
    1380041