• DocumentCode
    671599
  • Title
    Maximal margin learning vector quantisation
  • Author
    Trung Le ; Dat Tran ; Van Nguyen ; Wanli Ma
  • Author_Institution
    Fac. of Inf. Technol., HCMc Univ. of Pedagogy, Ho Chi Minh City, Vietnam
  • fYear
    2013
  • fDate
    4-9 Aug. 2013
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    Kernel Generalised Learning Vector Quantisation (KGLVQ) was proposed to extend Generalised Learning Vector Quantisation into the kernel feature space to deal with complex class boundaries, and thus yielded promising performance on complex classification tasks in pattern recognition. However, KGLVQ does not follow the maximal margin principle, which is crucial for kernel-based learning methods. In this paper we propose a maximal margin approach (MLVQ) to the KGLVQ algorithm. MLVQ inherits the merits of KGLVQ and also follows the maximal margin principle to improve generalisation capability. Experiments performed on well-known data sets available in the UCI repository show promising classification results for the proposed method.
  • Keywords
    learning (artificial intelligence); pattern classification; vector quantisation; KGLVQ algorithm; UCI repository; complex classification tasks; kernel feature space; kernel generalised learning vector quantisation; maximal margin learning vector quantisation; pattern recognition; Kernel; Linear programming; Prototypes; Support vector machines; Training; Vector quantization; Vectors;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    The 2013 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    Dallas, TX
  • ISSN
    2161-4393
  • Print_ISBN
    978-1-4673-6128-6
  • Type
    conf
  • DOI
    10.1109/IJCNN.2013.6706940
  • Filename
    6706940
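
The abstract builds on Generalised Learning Vector Quantisation, which updates prototype vectors using the relative distance difference between the closest correct and closest incorrect prototypes. The paper's actual MLVQ/KGLVQ formulation is not given in this record, so the following is only an illustrative sketch of a plain (non-kernel) GLVQ-style update step; the function name, learning rate, and squared-Euclidean distance choice are assumptions for illustration.

```python
import numpy as np

def glvq_update(x, y, prototypes, proto_labels, lr=0.05):
    """One illustrative GLVQ step (not the paper's MLVQ formulation):
    pull the nearest same-class prototype toward x and push the nearest
    other-class prototype away, weighted by gradients of the relative
    distance difference mu = (d1 - d2) / (d1 + d2)."""
    d = np.sum((prototypes - x) ** 2, axis=1)      # squared distances to all prototypes
    same = np.where(proto_labels == y)[0]
    diff = np.where(proto_labels != y)[0]
    i = same[np.argmin(d[same])]                   # closest correct prototype
    j = diff[np.argmin(d[diff])]                   # closest incorrect prototype
    d1, d2 = d[i], d[j]
    denom = (d1 + d2) ** 2
    # Moving along the negative gradient of mu attracts prototype i and repels j.
    prototypes[i] += lr * (d2 / denom) * (x - prototypes[i])
    prototypes[j] -= lr * (d1 / denom) * (x - prototypes[j])
    return prototypes
```

Applied repeatedly over a training set, updates of this form sharpen the class boundary between prototypes; KGLVQ performs the analogous update implicitly in a kernel feature space, and MLVQ, per the abstract, additionally enforces the maximal margin principle.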