Title :
Maximum Margin Clustering Made Practical
Author :
Zhang, Kai ; Tsang, Ivor W. ; Kwok, James T.
Date :
4/1/2009
Abstract :
Motivated by the success of large margin methods in supervised learning, maximum margin clustering (MMC) is a recent approach that aims to extend large margin methods to unsupervised learning. However, its optimization problem is nonconvex, and existing MMC methods all rely on reformulating and relaxing this nonconvex problem as a semidefinite program (SDP). Although SDPs are convex and standard solvers are available, they are computationally very expensive, so only small data sets can be handled. To make MMC more practical, we avoid SDP relaxations and propose in this paper an efficient approach that performs alternating optimization directly on the original nonconvex problem. A key step in avoiding premature convergence of the resultant iterative procedure is to change the loss function from the hinge loss to the Laplacian/square loss, so that overconfident predictions are penalized. Experiments on a number of synthetic and real-world data sets demonstrate that the proposed approach is more accurate, much faster (by hundreds to tens of thousands of times), and can handle data sets hundreds of times larger than the largest reported in the MMC literature.
Keywords :
learning (artificial intelligence); optimisation; pattern clustering; Laplacian/square loss; alternating optimization; large margin methods; maximum margin clustering (MMC); nonconvex optimization; scalability; semidefinite programs; supervised learning; unsupervised learning
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2008.2010620
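Example (editor's illustrative sketch) :
The alternating optimization described in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name iter_mmc, the random initialization, and the min_frac class-balance safeguard are introduced here for clarity, and scikit-learn's SVR with epsilon set to 0 serves as a stand-in because its epsilon-insensitive loss then coincides with the Laplacian loss mentioned in the abstract.

import numpy as np
from sklearn.svm import SVR

def iter_mmc(X, n_iter=30, min_frac=0.3, seed=0):
    """Alternating optimization for maximum margin clustering (sketch only).

    Alternates between (a) fixing the +/-1 cluster labels and fitting a
    large-margin function to them, and (b) fixing the function and
    relabeling each point by thresholding it, with a crude class-balance
    safeguard against the trivial single-cluster solution.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    y = rng.choice([-1.0, 1.0], size=n)    # random initial labels (an assumption)
    lo = max(1, int(min_frac * n))         # minimum points allowed per cluster
    for _ in range(n_iter):
        # Step (a): labels fixed, fit the decision function; epsilon=0 turns
        # SVR's epsilon-insensitive loss into the Laplacian loss |y - f(x)|.
        f = SVR(kernel="rbf", epsilon=0.0).fit(X, y).predict(X)
        # Step (b): function fixed, reassign labels by sign(f - t); clip the
        # threshold t so that each cluster keeps at least `lo` points.
        order = np.sort(f)
        t = np.clip(0.0, order[lo - 1], order[n - lo - 1])
        y_new = np.where(f > t, 1.0, -1.0)
        if np.array_equal(y_new, y):       # labels stopped changing: converged
            break
        y = y_new
    return y

Usage: for a data matrix X of shape (n_samples, n_features), labels = iter_mmc(X) returns a +/-1 cluster assignment. The paper itself uses a more careful initialization and balance constraint; this sketch only conveys the overall alternating structure and the role of the loss change.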