Title :
Neuron and dendrite pruning by synaptic weight shifting in polynomial time
Author :
Tanprasert, Chularat ; Tanprasert, Thitipong ; Lursinsap, Chidchanok
Author_Institution :
Software Technol. Lab., Ministry of Sci., Technol. & Environ., Bangkok, Thailand
Abstract :
Artificial neural networks (ANNs) typically contain a large amount of redundant information, so pruning algorithms are needed to reduce computational cost and system complexity. A weight shifting technique, originally proposed for increasing the fault tolerance of neural networks, is applied to prune links and/or neurons. After training converges, each hidden neuron with little effect on performance is removed and its weights are shifted to other links by the weight shifting technique, so the information carried by the removed links is retained in the network through other links of the same neuron. Experimental results show that 5%-45% of the links can be removed while the pruned network performs at the same level as the unpruned network. The technique requires no retraining, no modification of the error cost function, and no computational overhead. The time complexities of link and neuron pruning are O(n^2) and O(m), respectively, where n is the number of links and m is the number of neurons in the network.
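The abstract does not reproduce the paper's exact shifting formula; the following is a minimal illustrative sketch of the general idea, assuming a hypothetical compensation rule in which a pruned link's contribution on a reference input is absorbed by a surviving link of the same neuron, preserving that neuron's pre-activation.

```python
import numpy as np

def shift_and_prune(weights, x_ref, prune_idx):
    """Remove incoming link `prune_idx` of a single neuron and shift its
    contribution onto another incoming link of the same neuron, so that the
    neuron's pre-activation on the reference input x_ref is unchanged.
    (Hypothetical compensation rule for illustration; not the paper's formula.)"""
    w = weights.copy()
    removed = w[prune_idx] * x_ref[prune_idx]          # contribution being lost
    # pick a surviving link with the largest |input| to absorb the shift
    survivors = [i for i in range(len(w)) if i != prune_idx and x_ref[i] != 0]
    k = max(survivors, key=lambda i: abs(x_ref[i]))
    w[k] += removed / x_ref[k]                         # compensate on link k
    w[prune_idx] = 0.0                                 # link is now pruned
    return w

x = np.array([0.5, 1.0, -2.0])   # reference input to the neuron
w = np.array([0.8, 0.1, 0.3])    # incoming weights
w2 = shift_and_prune(w, x, prune_idx=1)
# the pre-activation on the reference input is preserved after pruning
assert np.isclose(w @ x, w2 @ x)
```

Because no retraining is involved, each shift is a constant-time local update per link, which is consistent with the polynomial-time bounds stated above.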
Keywords :
computational complexity; feedforward neural nets; learning (artificial intelligence); dendrite pruning; fault tolerance; hidden neuron; neuron pruning; polynomial time; redundant information; synaptic weight shifting; time complexities; Computational efficiency; Computer science; Cost function; Fault detection; Fault tolerance; Feedforward systems; Mathematics; Neural networks; Neurons; Polynomials;
Conference_Title :
IEEE International Conference on Neural Networks, 1996
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-3210-5
DOI :
10.1109/ICNN.1996.549003