Title :
A novel sparse auto-encoder for deep unsupervised learning
Author :
Xiaojuan Jiang ; Yinghua Zhang ; Wensheng Zhang ; Xian Xiao
Author_Institution :
State Key Lab. of Intell. Control & Manage. of Complex Syst., Inst. of Autom., Beijing, China
Abstract :
This paper proposes a novel sparse variant of auto-encoders as a building block for pre-training deep neural networks. Compared with sparse auto-encoders that enforce sparsity through a KL-divergence penalty, our method requires fewer hyper-parameters, and the sparsity level of the hidden units is learnt automatically. We compare our method with several other unsupervised learning algorithms on benchmark databases. Satisfactory classification accuracy (97.92% on MNIST and 87.29% on NORB) is achieved by a 2-hidden-layer neural network pre-trained with our algorithm, and the whole training procedure (including pre-training and fine-tuning) takes far less time than state-of-the-art methods.
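For context, the sketch below shows the kind of KL-divergence sparse auto-encoder the abstract uses as its point of comparison, not the authors' novel variant (whose formulation is not given in this record). The class name, hyper-parameters (rho, beta, learning rate), network sizes, and random data are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class KLSparseAutoencoder:
    """Tied-weight sigmoid auto-encoder with a KL-divergence sparsity penalty."""

    def __init__(self, n_visible, n_hidden, rho=0.05, beta=3.0, lr=0.1):
        # rho: hand-set target average activation of each hidden unit (the
        #      sparsity level that the paper's method learns automatically)
        # beta: weight of the sparsity penalty; lr: gradient-descent step size
        limit = np.sqrt(6.0 / (n_visible + n_hidden))
        self.W = rng.uniform(-limit, limit, size=(n_visible, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_v = np.zeros(n_visible)
        self.rho, self.beta, self.lr = rho, beta, lr

    def encode(self, x):
        return sigmoid(x @ self.W + self.b_h)

    def decode(self, h):
        return sigmoid(h @ self.W.T + self.b_v)   # tied weights

    def train_step(self, x):
        n = x.shape[0]
        h = self.encode(x)                        # hidden activations
        x_hat = self.decode(h)                    # reconstruction
        rho_hat = h.mean(axis=0)                  # mean activation per hidden unit

        recon = np.mean(np.sum((x_hat - x) ** 2, axis=1))
        kl = np.sum(self.rho * np.log(self.rho / rho_hat)
                    + (1 - self.rho) * np.log((1 - self.rho) / (1 - rho_hat)))

        # back-propagation through decoder, KL penalty and encoder
        d_zv = 2 * (x_hat - x) / n * x_hat * (1 - x_hat)
        d_kl = self.beta / n * (-self.rho / rho_hat
                                + (1 - self.rho) / (1 - rho_hat))
        d_zh = (d_zv @ self.W + d_kl) * h * (1 - h)

        grad_W = x.T @ d_zh + d_zv.T @ h          # encoder and decoder paths
        self.W -= self.lr * grad_W
        self.b_h -= self.lr * d_zh.sum(axis=0)
        self.b_v -= self.lr * d_zv.sum(axis=0)
        return recon + self.beta * kl

# Illustrative usage on random data standing in for MNIST-sized inputs.
X = rng.random((256, 784))
ae = KLSparseAutoencoder(n_visible=784, n_hidden=200)
for epoch in range(50):
    loss = ae.train_step(X)
# ae.encode(X) would then supply the features for the next layer in greedy
# layer-wise pre-training, followed by supervised fine-tuning of the stack.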
Keywords :
image classification; image coding; neural nets; unsupervised learning; 2-hidden-layer neural network pretraining; KL-divergence; MNIST dataset; NORB dataset; benchmark databases; classification accuracy; deep neural network pretraining; deep-unsupervised learning; fine-tuning procedure; hidden unit sparsity level; hyper-parameters; sparse auto-encoders; Classification algorithms; Data models; Lead; Principal component analysis; Training;
Conference_Title :
Advanced Computational Intelligence (ICACI), 2013 Sixth International Conference on
Conference_Location :
Hangzhou
Print_ISBN :
978-1-4673-6341-9
DOI :
10.1109/ICACI.2013.6748512