Title :
Learning the Structure of Deep Convolutional Networks
Author :
Jiashi Feng;Trevor Darrell
Author_Institution :
Dept. of EECS, UC Berkeley, Berkeley, CA, USA
Abstract :
In this work, we develop a novel method for automatically learning aspects of the structure of a deep model, in order to improve its performance, especially when labeled training data are scarce. We propose a new convolutional neural network model with an Indian Buffet Process (IBP) prior, termed ibpCNN. The ibpCNN automatically adapts its structure to the provided training data, achieving an optimal balance among model complexity, data fidelity and training loss, and thus offers better generalization performance. The proposed ibpCNN captures complicated data distributions in an unsupervised, generative way. Therefore, ibpCNN can exploit unlabeled data -- which can be collected at low cost -- to learn its structure. After determining the structure, ibpCNN further learns its parameters according to the specified task, in an end-to-end fashion, and produces discriminative yet compact representations. We evaluate ibpCNN on fully- and semi-supervised image classification tasks; it surpasses standard CNN models on benchmark datasets, with much smaller size and higher efficiency.
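The key property of the IBP prior mentioned in the abstract is that the number of latent features (here, loosely, filters in a layer) is not fixed in advance but grows with the data. The sketch below is not the paper's method; it is a minimal, assumption-laden illustration of sampling a binary feature-assignment matrix from the IBP via the standard "Indian buffet" culinary metaphor, to show how the column count adapts to the data.

```python
import numpy as np

def sample_ibp(num_objects, alpha, rng=None):
    """Draw a binary feature-assignment matrix Z from the Indian Buffet Process.

    Rows index objects (e.g. training images); columns index latent features
    (e.g. filters). The number of columns K is data-dependent: customer n
    takes an existing dish k with probability m_k / (n + 1), where m_k is the
    number of previous customers who took it, then samples
    Poisson(alpha / (n + 1)) brand-new dishes.
    """
    rng = np.random.default_rng() if rng is None else rng
    Z = np.zeros((num_objects, 0), dtype=int)
    for n in range(num_objects):
        # Revisit existing features in proportion to their popularity.
        if Z.shape[1] > 0:
            m = Z[:n].sum(axis=0)
            new_row = (rng.random(Z.shape[1]) < m / (n + 1)).astype(int)
        else:
            new_row = np.zeros(0, dtype=int)
        # Introduce new features; their expected count decays as 1 / (n + 1).
        k_new = rng.poisson(alpha / (n + 1))
        Z = np.hstack([Z, np.zeros((num_objects, k_new), dtype=int)])
        row = np.concatenate([new_row, np.ones(k_new, dtype=int)])
        Z[n, :row.size] = row
    return Z

Z = sample_ibp(num_objects=10, alpha=3.0, rng=np.random.default_rng(0))
print(Z.shape)  # (10, K): K is determined by the draw, not fixed in advance
```

A larger `alpha` yields more active features on average (the expected total is roughly `alpha` times the harmonic number of the dataset size), which is how such a prior lets model capacity scale with the data rather than being hand-set.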
Keywords :
"Data models","Adaptation models","Training data","Complexity theory","Training","Neural networks","Convolutional codes"
Conference_Title :
Computer Vision (ICCV), 2015 IEEE International Conference on
Electronic_ISSN :
2380-7504
DOI :
10.1109/ICCV.2015.315