DocumentCode :
3661289
Title :
Resource-constrained classification using a cascade of neural network layers
Author :
Sam Leroux;Steven Bohez;Tim Verbelen;Bert Vankeirsbilck;Pieter Simoens;Bart Dhoedt
Author_Institution :
Ghent University - iMinds, Gaston Crommenlaan 8/201, B-9050, Belgium
fYear :
2015
fDate :
7/1/2015
Firstpage :
1
Lastpage :
7
Abstract :
Deep neural networks are the state-of-the-art technique for a wide variety of classification problems. Although deeper networks make more accurate classifications, the value added by each additional hidden layer diminishes rapidly, and even shallow networks achieve relatively good results on many classification problems. Only for a small subset of the samples do the deeper layers make a significant difference. We describe an architecture in which only the samples that cannot be classified with sufficient confidence by a shallow network have to be processed by the deeper layers. Instead of training a network with a single output layer at the end, we train several output layers, one for each hidden layer. When an output layer is sufficiently confident in its result, propagation stops at that layer and the deeper layers need not be evaluated. The choice of a confidence threshold allows us to trade off accuracy against speed.
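The following is a minimal sketch (not the authors' code) of the early-exit inference loop the abstract describes, assuming a trunk of fully connected ReLU layers, one softmax output head per hidden layer, and the maximum class probability as the confidence measure; all weights, shapes, and the threshold value are illustrative assumptions.

# Hypothetical sketch of cascade inference with per-layer output heads.
# Forward propagation stops at the first head whose confidence (max softmax
# probability) reaches the threshold; deeper layers are then skipped.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cascade_predict(x, hidden_layers, output_heads, threshold=0.9):
    """Return (predicted_class, layers_used) for a single input vector x.

    hidden_layers: list of (W, b) pairs for the shared trunk
    output_heads:  list of (W, b) pairs, one head per hidden layer
    threshold:     confidence needed to stop early (trades accuracy for speed)
    """
    h = x
    for depth, ((W, b), (Wo, bo)) in enumerate(zip(hidden_layers, output_heads), 1):
        h = np.maximum(0.0, W @ h + b)        # ReLU hidden layer
        p = softmax(Wo @ h + bo)              # this layer's output head
        if p.max() >= threshold or depth == len(hidden_layers):
            return int(p.argmax()), depth     # confident enough (or last layer): stop

# Tiny usage example with random weights (10-d input, 3 classes, 3 hidden layers)
rng = np.random.default_rng(0)
dims = [10, 16, 16, 16]
hidden = [(rng.normal(size=(dims[i + 1], dims[i])), np.zeros(dims[i + 1])) for i in range(3)]
heads = [(rng.normal(size=(3, dims[i + 1])), np.zeros(3)) for i in range(3)]
label, depth = cascade_predict(rng.normal(size=10), hidden, heads, threshold=0.9)
print(f"predicted class {label} after {depth} layer(s)")

Lowering the threshold makes more samples exit at shallow layers (faster, potentially less accurate); raising it pushes more samples through the full network.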
Keywords :
"Neurons","Biological neural networks","Training"
Publisher :
ieee
Conference_Title :
2015 International Joint Conference on Neural Networks (IJCNN)
Electronic_ISSN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280601
Filename :
7280601