Title :
A pre-radial basis function with deep back propagation neural network research
Author :
Hui Wen ; Weixin Xie ; Jihong Pei
Author_Institution :
ATR Key Lab. of Shenzhen Univ., Shenzhen, China
Abstract :
In this paper, the architecture of a pre-radial basis function (RBF) network combined with a deep back-propagation (BP) neural network is proposed. The original three-layer RBF network is reduced to a two-layer RBF stage: the output of the RBF hidden layer is processed and then connected to a multilayer perceptron network. First, the input samples pass through the RBF hidden units, which are pre-trained by unsupervised learning; after the resulting data are normalized, the supervised BP learning algorithm adjusts the network weights, completing the training of the entire network. Experiments show that the improved architecture simplifies parameter selection in the original RBF network while reducing dependence on the number of hidden layers and neurons in the BP network. Moreover, the improved architecture accelerates the convergence of the BP network, effectively reduces the risk of falling into local minima, and improves classification accuracy.
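The abstract outlines a two-stage pipeline: unsupervised pre-training of RBF hidden units, normalization of their outputs, and supervised BP training of a multilayer perceptron on those features. The sketch below illustrates one plausible reading of that pipeline; it is not the authors' code. It assumes k-means clustering for the unsupervised RBF-center selection, Gaussian RBF activations with a fixed width parameter gamma, per-sample normalization, and scikit-learn's MLPClassifier as the BP-trained perceptron (class and function names here are hypothetical).

```python
# Minimal sketch of the pre-RBF + deep BP architecture described in the abstract.
# Assumptions (not from the paper): k-means for center selection, Gaussian RBF
# units with fixed gamma, sum-to-one normalization, scikit-learn MLP for the BP stage.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier


def rbf_features(X, centers, gamma):
    """Gaussian RBF hidden-layer outputs for each sample/center pair."""
    # Squared Euclidean distances between every sample and every center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)


class PreRBFDeepBP:
    """RBF hidden layer (unsupervised pre-training) feeding a BP-trained MLP."""

    def __init__(self, n_centers=20, gamma=1.0, hidden_layer_sizes=(50,)):
        self.n_centers = n_centers
        self.gamma = gamma
        self.mlp = MLPClassifier(hidden_layer_sizes=hidden_layer_sizes,
                                 max_iter=1000, random_state=0)

    def _hidden(self, X):
        # RBF hidden-layer outputs, normalized per sample.
        H = rbf_features(X, self.centers_, self.gamma)
        return H / (H.sum(axis=1, keepdims=True) + 1e-12)

    def fit(self, X, y):
        # Step 1: unsupervised pre-training of RBF hidden units (centers via k-means).
        km = KMeans(n_clusters=self.n_centers, n_init=10, random_state=0).fit(X)
        self.centers_ = km.cluster_centers_
        # Step 2: normalize the RBF hidden-layer outputs.
        # Step 3: supervised BP training of the multilayer perceptron on those features.
        self.mlp.fit(self._hidden(X), y)
        return self

    def predict(self, X):
        return self.mlp.predict(self._hidden(X))
```

A usage example would be `PreRBFDeepBP(n_centers=30, gamma=0.5).fit(X_train, y_train).predict(X_test)`. Passing clustered, normalized RBF features to the perceptron is one way to realize the claimed benefit of reduced sensitivity to the number of BP hidden layers and neurons.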
Keywords :
backpropagation; multilayer perceptrons; radial basis function networks; BP network; RBF network; deep back propagation neural network; multilayer perceptron network; pre-radial basis function; supervised BP learning algorithm; unsupervised learning; Accuracy; Clustering algorithms; Computer architecture; Neurons; Radial basis function networks; Training; Vectors; back propagation; deep learning; hybrid architecture; neural network; radial basis function;
Conference_Title :
2014 12th International Conference on Signal Processing (ICSP)
Conference_Location :
Hangzhou
Print_ISBN :
978-1-4799-2188-1
DOI :
10.1109/ICOSP.2014.7015247