Title :
Channel-Max, Channel-Drop and Stochastic Max-pooling
Author :
Yuchi Huang; Xiuyu Sun; Ming Lu; Ming Xu
Author_Institution :
NEC Labs, Beijing, China
Date :
6/1/2015 12:00:00 AM
Abstract :
We propose three regularization techniques to overcome drawbacks of local winner-take-all methods used in deep convolutional networks. Channel-Max inherits the max activation unit from Maxout networks but adopts complementary subsets of the input, filtered with different kernel sizes, as better companions to the max function. To balance training across the different pathways, Channel-Drop randomly discards half of the pathways before their inputs are convolved. Stochastic Max-pooling reduces the overfitting caused by conventional max-pooling: during training, half of the activations in each pooling region are randomly dropped, and during testing, the top largest activations are probabilistically averaged. Using Channel-Max, Channel-Drop and Stochastic Max-pooling, we demonstrate state-of-the-art performance on four benchmark datasets: CIFAR-10, CIFAR-100, STL-10 and SVHN.
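The abstract's description of Stochastic Max-pooling can be illustrated with a minimal NumPy sketch over a single pooling region. This is not the authors' implementation; the function names, the "round up when keeping half" choice, and the uniform top-k average at test time (a simplification of the paper's probabilistic averaging, whose exact weights are not given in the abstract) are all assumptions.

```python
import numpy as np

def stochastic_max_pool_train(region, rng):
    """Training-time pooling over one region (flattened activations):
    randomly drop half of the activations, then take the max of the rest.
    Sketch of the abstract's description, not the authors' code."""
    region = np.asarray(region, dtype=float)
    n = region.size
    # Keep half of the activations (rounded up) -- assumed rounding rule.
    keep = rng.choice(n, size=n - n // 2, replace=False)
    return region[keep].max()

def stochastic_max_pool_test(region, k=2):
    """Test-time pooling: average the k largest activations.
    The paper averages them 'probabilistically'; a uniform average
    is used here as a stand-in for the unspecified weights."""
    region = np.sort(np.asarray(region, dtype=float))[::-1]
    return region[:k].mean()

# Example on one 2x2 pooling region flattened to 4 activations.
rng = np.random.default_rng(0)
region = [1.0, 4.0, 2.0, 3.0]
train_out = stochastic_max_pool_train(region, rng)  # max over a random half
test_out = stochastic_max_pool_test(region, k=2)    # mean of the top 2
```

At training time the output is the max over a random half of the region, so smaller activations occasionally win, which is the source of the regularization; at test time the deterministic top-k average approximates the expectation of that stochastic max.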
Keywords :
"Training","Kernel","Stochastic processes","Convolution","Testing","Mathematical model","Color"
Conference_Titel :
Computer Vision and Pattern Recognition Workshops (CVPRW), 2015 IEEE Conference on
Electronic_ISSN :
2160-7516
DOI :
10.1109/CVPRW.2015.7301267