DocumentCode :
3661060
Title :
Parallel training of convolutional neural networks for small sample learning
Author :
Tianliang Liu; Haihong Zheng; Wei Liang
Author_Institution :
Computer Science and Technology, Xidian University, China
fYear :
2015
fDate :
7/1/2015
Firstpage :
1
Lastpage :
6
Abstract :
We propose a parallel training framework of convolutional neural networks (CNNs) for small-sample learning. In the framework, we model the feature-filtering process and show that a Sadowsky energy distribution exists in the model. Using the Sadowsky energy distribution, the weights in the convolutional kernels can be rearranged after each update according to special cases. With this rearrangement, each CNN in the framework yields a different predicted probability, especially for easily misclassified samples, which avoids the low predicted probabilities that traditional CNNs may produce. The class with the maximum predicted probability among the CNNs is chosen as the prediction. Our CNN framework achieves better handwritten-digit classification on small samples than a one-stage CNN, and converges faster than multiple-stage CNNs.
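The prediction rule described in the abstract, choosing the class whose predicted probability is maximal across all networks in the ensemble, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the probability values and the 3-class setup are hypothetical.

```python
# Hedged sketch of the abstract's prediction rule: given per-network class
# probabilities from several parallel CNNs, pick the class that attains the
# single highest predicted probability across all networks.

def ensemble_predict(prob_rows):
    """prob_rows: list of per-CNN probability vectors (one list per network).
    Returns the index of the class with the globally maximal probability."""
    best_class, best_prob = None, -1.0
    for probs in prob_rows:
        for cls, p in enumerate(probs):
            if p > best_prob:
                best_class, best_prob = cls, p
    return best_class

# Three hypothetical CNNs scoring a 3-class problem:
cnn_outputs = [
    [0.40, 0.35, 0.25],  # network 1
    [0.20, 0.70, 0.10],  # network 2
    [0.33, 0.33, 0.34],  # network 3
]
print(ensemble_predict(cnn_outputs))  # prints 1: class 1 has the max (0.70)
```

Unlike probability averaging, this rule lets a single confident network decide the label, which matches the abstract's claim that the ensemble avoids the uniformly low probabilities a single CNN may assign to easily misclassified samples.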
Keywords :
"Training","Testing","Convolution","Silicon","Xenon","Accuracy","Computer aided software engineering"
Publisher :
ieee
Conference_Titel :
2015 International Joint Conference on Neural Networks (IJCNN)
Electronic_ISSN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280367
Filename :
7280367