Title :
Convergence of SOM and NG as GNC algorithms
Author :
Gonzalez, Ana I. ; Anjou, Alicia D. ; Graña, Manuel
Author_Institution :
Univ. del Pais Vasco, San Sebastian
Abstract :
Convergence of the self-organizing map (SOM) and neural gas (NG) is usually analyzed from the point of view of stochastic gradient descent (SGD) on an energy function. SGD algorithms are characterized by very restrictive convergence conditions that impose a slow convergence rate. They are also local minimization algorithms, highly dependent on the initial conditions. However, some empirical results show that one-pass on-line training realizations of SOM and NG may perform comparably to more careful (slower) realizations. Moreover, other empirical works suggest that SOM is quite robust against initial conditions. In both cases the performance measure is the quantization distortion. This empirical evidence leads us to propose that the appropriate setting for the convergence analysis of SOM, NG, and similar competitive artificial neural network clustering algorithms is the theory of graduated non-convexity (GNC) algorithms.
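The one-pass on-line training and the quantization-distortion measure referred to in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the 1-D chain topology, the linear decay schedules, and the function names are assumptions chosen for brevity.

```python
import numpy as np

def one_pass_som(data, n_units, sigma0=2.0, lr0=0.5, seed=0):
    """One-pass on-line SOM on a 1-D chain of units (illustrative sketch).

    Each sample is presented exactly once; the winning unit and its chain
    neighbours are pulled toward the sample with a decaying learning rate
    and a shrinking neighbourhood radius.
    """
    rng = np.random.default_rng(seed)
    n, d = data.shape
    # Codebook initialised from randomly chosen samples (one common choice).
    w = data[rng.choice(n, n_units, replace=False)].astype(float)
    idx = np.arange(n_units)
    for t, x in enumerate(data):
        frac = t / n
        lr = lr0 * (1.0 - frac)               # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5   # shrinking neighbourhood radius
        winner = np.argmin(np.linalg.norm(w - x, axis=1))
        h = np.exp(-((idx - winner) ** 2) / (2.0 * sigma ** 2))
        w += lr * h[:, None] * (x - w)        # neighbourhood-weighted update
    return w

def quantization_distortion(data, w):
    """Mean squared distance from each sample to its nearest codebook vector."""
    dist = np.linalg.norm(data[:, None, :] - w[None, :, :], axis=2)
    return float(np.mean(dist.min(axis=1) ** 2))
```

Replacing the chain-distance neighbourhood `h` with a ranking of all units by distance to the sample would turn the same loop into an NG-style update; in both cases `quantization_distortion` is the performance measure the abstract refers to.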
Keywords :
convergence; gradient methods; minimisation; pattern clustering; self-organising feature maps; stochastic processes; vector quantisation; competitive artificial neural network clustering algorithm; convergence analysis; graduated nonconvexity algorithm; minimization algorithm; neural gas; one-pass on-line training realization; quantization distortion; self-organizing map; stochastic gradient descent algorithm; Algorithm design and analysis; Artificial neural networks; Clustering algorithms; Convergence; Distortion measurement; Minimization methods; Robustness; Scheduling; Signal processing algorithms; Stochastic processes;
Conference_Titel :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.247363