DocumentCode :
42688
Title :
Large-Margin Multi-View Information Bottleneck
Author :
Chang Xu ; Dacheng Tao ; Chao Xu
Author_Institution :
Key Lab. of Machine Perception (Minist. of Educ.), Peking Univ., Beijing, China
Volume :
36
Issue :
8
fYear :
2014
fDate :
Aug. 2014
Firstpage :
1559
Lastpage :
1572
Abstract :
In this paper, we extend the theory of the information bottleneck (IB) to learning from examples represented by multi-view features. We formulate the problem as one of encoding a communication system with multiple senders, each of which represents one view of the data. Based on the precise components filtered out from the multiple information sources through a “bottleneck”, a margin-maximization approach is then used to strengthen the discriminative power of the encoder by improving the code distance within the framework of coding theory. The resulting algorithm therefore inherits the merits of both the IB principle and coding theory. It has two distinct advantages over existing algorithms: our method finds a tradeoff between the accuracy and complexity of the multi-view model, and the encoded multi-view data retain sufficient discriminative information for classification. We also derive the robustness and generalization error bounds of the proposed algorithm and reveal specific properties of multi-view learning. First, the complementarity of multi-view features guarantees the robustness of the algorithm. Second, the consensus of multi-view features reduces the empirical Rademacher complexity of the objective function, enhances the accuracy of the solution, and improves the generalization error bound of the algorithm. The resulting objective function is solved efficiently using the alternating direction method. Experimental results on annotation, classification, and recognition tasks demonstrate that the proposed algorithm is promising for practical applications.
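For orientation only, a minimal sketch of the kind of objective the abstract describes; the notation (views $X^{(v)}$, code $T$, labels $Y$, decision function $f$, weights $\beta$, $\lambda$) is assumed here for illustration and is not taken from the paper. The classical IB trades off compression of the input against relevance to the label,
$$
\mathcal{L}_{\mathrm{IB}} \;=\; I(X;T) \;-\; \beta\, I(T;Y),
$$
and a multi-view, large-margin variant could combine per-view compression terms with a hinge-loss margin penalty on the encoded examples $t_i$:
$$
\mathcal{L} \;=\; \sum_{v=1}^{V} I\big(X^{(v)};T\big) \;-\; \beta\, I(T;Y) \;+\; \lambda \sum_{i} \max\big(0,\; 1 - y_i\, f(t_i)\big).
$$
The first two terms balance model complexity against accuracy, while the margin term keeps the encoded multi-view data discriminative; an alternating scheme (as in the alternating direction method mentioned above) would update the encoder and the classifier in turn.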
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); pattern classification; IB principle; alternating direction method; annotation task; classification task; code distance; coding theory; empirical Rademacher complexity; generalization error bound; large-margin multiview information bottleneck; margin maximization approach; multiview features representation; multiview learning; objective function; recognition task; Accuracy; Complexity theory; Kernel; Linear programming; Optimization; Support vector machines; Vectors; Multi-view learning; information bottleneck; large-margin learning;
fLanguage :
English
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/TPAMI.2013.2296528
Filename :
6697863