DocumentCode :
463704
Title :
Smoothness Maximization via Gradient Descents
Author :
Bin Zhao ; Fei Wang ; Changshui Zhang
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing, China
Volume :
2
fYear :
2007
fDate :
15-20 April 2007
Abstract :
Recent years have witnessed a surge of interest in graph-based semi-supervised learning. Despite extensive research, however, little work has addressed graph construction itself. In this study, employing the idea of gradient descent, we propose a novel method, iterative smoothness maximization (ISM), to automatically learn an optimal graph for a semi-supervised learning task. The main procedure of ISM is to minimize an upper bound on the semi-supervised classification error through an iterative gradient-descent approach. We also prove the convergence of ISM theoretically, and experimental results on two real-world data sets demonstrate its effectiveness.
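The mechanic described in the abstract, learning a Gaussian graph by gradient descent on a smoothness-based objective, can be sketched roughly as follows. This is a minimal illustration, not the authors' ISM algorithm: it assumes a single Gaussian width parameter `sigma`, uses the raw graph smoothness f^T L f as the objective in place of the paper's upper bound on classification error, and estimates the gradient numerically. The function names and hyperparameters are illustrative.

```python
import numpy as np

def gaussian_graph(X, sigma):
    # Gaussian affinity matrix: w_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)  # no self-loops
    return W

def smoothness(X, f, sigma):
    # Graph smoothness f^T L f with the unnormalized Laplacian L = D - W;
    # smaller values mean the labeling f varies less across strong edges.
    W = gaussian_graph(X, sigma)
    L = np.diag(W.sum(axis=1)) - W
    return float(f @ L @ f)

def learn_sigma(X, f, sigma0=1.0, lr=0.01, steps=50, eps=1e-4):
    # Iteratively update sigma by gradient descent on the smoothness
    # objective, with a central-difference gradient estimate.
    sigma = sigma0
    for _ in range(steps):
        g = (smoothness(X, f, sigma + eps)
             - smoothness(X, f, sigma - eps)) / (2.0 * eps)
        sigma = max(sigma - lr * g, 1e-3)  # keep the width positive
    return sigma
```

Note that minimizing raw smoothness alone has a trivial solution (sigma shrinking toward zero disconnects the graph), which is one reason the paper instead minimizes an error bound; the sketch only shows the iterative gradient-descent loop over the graph hyperparameter.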
Keywords :
gradient methods; graph theory; learning (artificial intelligence); pattern classification; graph based semi-supervised learning; iterative gradient descents; iterative smoothness maximization; semi-supervised classification error; Automation; Convergence; Cost function; Intelligent systems; Iterative methods; Laboratories; Pattern classification; Semisupervised learning; Surges; Upper bound; Cluster Assumption; Gaussian Function; Gradient Descent; Semi-Supervised Learning (SSL);
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2007 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007)
Conference_Location :
Honolulu, HI
ISSN :
1520-6149
Print_ISBN :
1-4244-0727-3
Type :
conf
DOI :
10.1109/ICASSP.2007.366309
Filename :
4217482