Title :
Comparative study on various pruning algorithms for RNN. I. Complexity analysis
Author_Institution :
Dept. of Comput., Hong Kong Polytech. Univ., China
Abstract :
Owing to its computational complexity, pruning a fully connected recurrent neural network (RNN) becomes ineffective for large RNNs. In this paper, several non-heuristic pruning algorithms for fully connected RNNs are investigated: some are extended from approaches based on the extended Kalman filter, and some are based on weight magnitude, together with techniques for the pruning procedure. Their effectiveness, in terms of computational complexity, network size, and generalization ability, is evaluated and presented. This paper addresses the issue of computational complexity.
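As a point of reference for the weight-magnitude family of methods the abstract mentions, a minimal sketch of magnitude-based pruning is shown below. This is an illustration only, not the paper's algorithm; the fraction-based threshold selection and the function name are assumptions.

```python
import numpy as np

def magnitude_prune(weights, fraction):
    """Zero out the given fraction of weights with smallest |w|.

    Returns the pruned weight matrix and a boolean mask of kept weights.
    This is a generic sketch of magnitude pruning, not the specific
    procedure evaluated in the paper.
    """
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Example: prune half of a small weight matrix by magnitude
W = np.array([[0.1, -2.0],
              [0.05, 3.0]])
pruned, mask = magnitude_prune(W, 0.5)
```

Magnitude pruning is cheap (a sort or partial sort over the weights), which is why the abstract contrasts it with the costlier extended-Kalman-filter-based approaches, whose per-step cost grows with the size of the error covariance matrix.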
Keywords :
Kalman filters; computational complexity; covariance matrices; generalisation (artificial intelligence); probability; recurrent neural nets; computational complexity; covariance matrix; extended Kalman filter; generalization; network sizes; nonheuristic pruning algorithms; probability; recurrent neural network; Algorithm design and analysis; Computational complexity; Computational efficiency; Computer networks; High performance computing; Internet; Laboratories; Neural networks; Recurrent neural networks; Weight measurement;
Conference_Title :
Proceedings of the 2002 International Conference on Machine Learning and Cybernetics
Print_ISBN :
0-7803-7508-4
DOI :
10.1109/ICMLC.2002.1175435