• DocumentCode
    3500367
  • Title
    Accelerated learning of Generalized Sammon Mappings
  • Author
    Huang, Yinjie; Georgiopoulos, Michael; Anagnostopoulos, Georgios C.
  • Author_Institution
    Dept. of EE & CS, Univ. of Central Florida, Orlando, FL, USA
  • fYear
    2011
  • fDate
    July 31 2011-Aug. 5 2011
  • Firstpage
    2952
  • Lastpage
    2960
  • Abstract
    The Sammon Mapping (SM) has established itself as a valuable tool in dimensionality reduction, manifold learning, exploratory data analysis and, particularly, in data visualization. The SM is capable of projecting high-dimensional data into a low-dimensional space, so that the data can be visualized and interpreted. This is accomplished by representing inter-sample dissimilarities in the original space by Euclidean inter-sample distances in the projection space. Recently, Kernel Sammon Mapping (KSM) has been shown to subsume the SM and a few other related extensions of the SM. Both of the aforementioned models feature a set of linear weights that are estimated via Iterative Majorization (IM). While IM is significantly faster than other standard gradient-based methods, tackling data sets larger than moderate size becomes a challenging learning task, as IM's convergence slows down significantly with increasing data set cardinality. In this paper we derive two improved training algorithms based on Successive Over-Relaxation (SOR) and Parallel Tangents (PARTAN) acceleration that, while still being first-order methods, exhibit faster convergence than IM. Both algorithms are relatively easy to understand, straightforward to implement, and as robust as IM in terms of performance. We also present comparative results that illustrate their computational advantages on a set of benchmark problems.
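  • Note
    The SM objective sketched in the abstract is commonly written as the Sammon stress; as a minimal reference formulation in standard notation (not necessarily the paper's generalized KSM objective), with \delta_{ij} the dissimilarity between samples i and j in the original space and d_{ij} the Euclidean distance between their projections:

        E = \frac{1}{\sum_{i<j} \delta_{ij}} \sum_{i<j} \frac{(\delta_{ij} - d_{ij})^2}{\delta_{ij}}

    Minimizing this stress, with respect to the projected coordinates or, in the KSM setting, the linear weights, is the optimization that IM, SOR, and PARTAN iterations address.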
  • Keywords
    learning (artificial intelligence); Euclidean inter-sample distances; accelerated learning; data visualization; dimensionality reduction; exploratory data analysis; generalized Sammon mappings; inter-sample dissimilarities; iterative majorization; kernel Sammon mapping; manifold learning; parallel tangents acceleration; projection space; standard gradient-based methods; successive over-relaxation; Acceleration; Convergence; Data visualization; Kernel; Prototypes; Stress; Training
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    The 2011 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    San Jose, CA
  • ISSN
    2161-4393
  • Print_ISBN
    978-1-4244-9635-8
  • Type
    conf
  • DOI
    10.1109/IJCNN.2011.6033609
  • Filename
    6033609