DocumentCode :
2714303
Title :
A Neural Network pruning approach based on Compressive Sampling
Author :
Yang, Jie ; Bouzerdoum, Abdesselam ; Phung, Son Lam
Author_Institution :
Sch. of Electr., Comput. & Telecommun. Eng., Univ. of Wollongong, Wollongong, NSW, Australia
fYear :
2009
fDate :
14-19 June 2009
Firstpage :
3428
Lastpage :
3435
Abstract :
The trade-off between computational complexity and architecture is a bottleneck in the development of neural networks (NNs). An architecture that is too large or too small strongly affects performance, in terms of both generalization and computational cost. In the past, saliency analysis has been employed to determine the most suitable structure; however, it is time-consuming and its performance is not robust. In this paper, a family of new algorithms for pruning elements (weights and hidden neurons) of neural networks is presented, based on compressive sampling (CS) theory. The proposed framework makes it possible to locate the significant elements, and hence find a sparse structure, without computing their saliency. Experimental results are presented that demonstrate the effectiveness of the proposed approach.
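The core idea in the abstract — locating significant elements via sparse recovery rather than saliency analysis — can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes a simplified setting in which the outputs of a linear layer are expressed as `y = H w`, and a standard CS solver (Orthogonal Matching Pursuit, implemented here from scratch) recovers a sparse weight vector whose support tells us which connections to keep:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: find a k-sparse x with y ~= A @ x."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Greedily pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit on the current support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - A @ x
    return x

rng = np.random.default_rng(0)
n_samples, n_hidden = 60, 20
H = rng.standard_normal((n_samples, n_hidden))   # hypothetical hidden activations
w_true = np.zeros(n_hidden)
w_true[[2, 7, 11]] = [1.5, -2.0, 0.8]            # only 3 significant weights
y = H @ w_true                                   # observed layer output
w_hat = omp(H, y, k=3)
kept = np.flatnonzero(np.abs(w_hat) > 1e-8)      # indices of weights to retain
```

In this toy setup the recovered support identifies the significant weights directly from input/output data, without per-weight saliency scores — the property the paper exploits at the level of whole networks.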
Keywords :
computational complexity; neural net architecture; signal representation; compressed sensing theory; compressive sampling theory; neural network pruning approach; saliency analysis; sparse signal representation; Computational efficiency; Computer architecture; Iterative algorithms; Network topology; Neural networks; Neurons; Performance analysis; Robustness; Sampling methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2009. IJCNN 2009. International Joint Conference on
Conference_Location :
Atlanta, GA
ISSN :
1098-7576
Print_ISBN :
978-1-4244-3548-7
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2009.5179045
Filename :
5179045