Title :
Comparative study on various pruning algorithms for RNN. II. Experimental results and comparative analysis
Author :
Sum, John ; Tjiang, Jung Seng
Author_Institution :
Dept. of Comput., Hong Kong Polytech. Univ., China
Abstract :
For Part I, see ibid., pp. 2225-2230 (2002). Owing to its computational complexity requirement, pruning a fully connected recurrent neural network (RNN) becomes ineffective for large RNNs. In Part I, several non-heuristic pruning algorithms for fully connected RNNs were proposed, some extended from extended Kalman filter (EKF) based approaches and some based on weight magnitude, together with techniques for the pruning procedure. Their computational complexities were analyzed there. In this paper, their effectiveness in reducing network size and their generalization abilities are evaluated.
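To illustrate the weight-magnitude criterion mentioned in the abstract, the following is a minimal sketch of generic magnitude-based pruning applied to a recurrent weight matrix; the function name magnitude_prune and the pruning ratio are illustrative assumptions and do not reproduce the specific algorithms proposed or evaluated in the paper.

```python
import numpy as np

def magnitude_prune(W, prune_ratio=0.5):
    """Zero out the smallest-magnitude entries of a weight matrix.

    Illustrative sketch of generic magnitude-based pruning; it is not
    the EKF-based or Part-I procedures evaluated in this paper.
    """
    flat = np.abs(W).ravel()
    k = int(prune_ratio * flat.size)              # number of weights to remove
    if k == 0:
        return W.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(W) > threshold                  # keep weights above the threshold
    return W * mask

# Example: prune half of the hidden-to-hidden weights of a small fully connected RNN.
rng = np.random.default_rng(0)
W_recurrent = rng.normal(size=(8, 8))
W_pruned = magnitude_prune(W_recurrent, prune_ratio=0.5)
print("remaining nonzero weights:", np.count_nonzero(W_pruned))
```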
Keywords :
Kalman filters; generalisation (artificial intelligence); identification; learning (artificial intelligence); nonlinear systems; recurrent neural nets; computational complexities; extended Kalman filter; generalization; identification; network sizes; nonheuristic pruning algorithms; nonlinear system; recurrent neural network; skipping procedure; weight magnitude; Algorithm design and analysis; Computational complexity; Computational efficiency; Computer networks; High performance computing; IP networks; Laboratories; Linear systems; Neural networks; Recurrent neural networks;
Conference_Title :
Proceedings of the 2002 International Conference on Machine Learning and Cybernetics
Print_ISBN :
0-7803-7508-4
DOI :
10.1109/ICMLC.2002.1175436