Title of article :
Structure Learning for Deep Neural Networks with Competitive Synaptic Pruning
Author/Authors :
Ahmadi, A., Department of Electrical Engineering, Sahand University of Technology; Mahboobi Esfanjani, R., Department of Electrical Engineering, Sahand University of Technology
From page :
189
To page :
196
Abstract :
Background and Objectives: Deep neural networks are usually built on a predefined structure, which can result in over- or underfitting, a heavy processing load, and storage overhead. Training combined with pruning can reduce redundancy in deep neural networks; however, it may also degrade accuracy. Methods: In this note, we propose a novel approach for structure optimization of deep neural networks based on competition among connections merged with brain-inspired synaptic pruning. In the proposed scheme, the efficiency of each network connection is continuously assessed using the global gradient magnitude criterion, assigning positive scores to strong, more effective connections and negative scores to weak ones. A weakly scored connection is not removed immediately; instead, it is eliminated only when its net score reaches a predetermined threshold. Moreover, the pruning rate is determined separately for each layer of the network. Results: Applying the suggested algorithm to a neural network model of a distillation column in a noisy environment demonstrates its effectiveness and applicability. Conclusion: The proposed method, inspired by connection competition and synaptic pruning in the human brain, enhances learning speed, preserves accuracy, and reduces costs thanks to the smaller network size. It also handles noisy data more effectively because network connections are assessed continuously.
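The abstract outlines a score-based competition: each connection accumulates positive or negative scores from a gradient-magnitude criterion and is pruned only once its net score crosses a threshold, with a per-layer pruning rate. The following is a minimal sketch of that idea only; the function names, the +1/-1 score update, the saliency proxy, and the threshold value are all illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of score-accumulating competitive pruning for one layer.
# The paper's criterion is the global gradient magnitude; here |weight * grad|
# with a per-layer cutoff is used as a stand-in.
import numpy as np

def update_scores(scores, grads, weights, mask, top_fraction=0.3):
    """Reward connections whose saliency is among the strongest in the layer,
    penalize the rest (assumed +1 / -1 update)."""
    saliency = np.abs(weights * grads) * mask          # effectiveness proxy
    active = saliency[mask.astype(bool)]
    if active.size == 0:
        return scores
    cutoff = np.quantile(active, 1.0 - top_fraction)   # per-layer cutoff
    scores += np.where(saliency >= cutoff, 1.0, -1.0) * mask
    return scores

def prune(mask, scores, threshold=-5.0):
    """Remove a connection only when its accumulated net score reaches the
    (assumed) threshold, not at the first weak step."""
    mask[scores <= threshold] = 0.0
    return mask

# Toy usage on a single 8x8 dense layer with random stand-in gradients.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
mask = np.ones_like(W)
scores = np.zeros_like(W)
for step in range(50):
    G = rng.normal(size=W.shape)                       # placeholder gradients
    scores = update_scores(scores, G, W, mask)
    mask = prune(mask, scores)
print("remaining connections:", int(mask.sum()), "of", mask.size)
```

Because scores accumulate over many updates, a connection that is weak in a single noisy step is not discarded outright, which mirrors the abstract's claim of more robust behavior on noisy data.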
Keywords :
Deep Neural Networks , Synaptic Pruning , Distillation Column , PID Tuning
Journal title :
Journal of Electrical and Computer Engineering Innovations (JECEI)
Record number :
2779365