DocumentCode :
2895900
Title :
Weight Initialization of Feedforward Neural Networks by Means of Partial Least Squares
Author :
Liu, Yan ; Zhou, Chang-feng ; Chen, Ying-wu
Author_Institution :
Coll. of Inf. Syst. & Manage., Nat. Univ. of Defense Technol., Changsha
fYear :
2006
fDate :
13-16 Aug. 2006
Firstpage :
3119
Lastpage :
3122
Abstract :
A method is developed to set the initial weights and the optimal number of hidden nodes of feedforward neural networks (FNN) based on the partial least squares (PLS) algorithm. Combining PLS with the FNN ensures that the neuron outputs lie in the active region of the activation function and increases the rate of convergence. The performance of FNN, PLS, and PLS-FNN is compared on a customer satisfaction measurement example with an unknown relationship between the input and output data. The results show that the hybrid PLS-FNN has the smallest root mean square error and the highest fitting precision: it provides good initial weights, improves training performance, and efficiently reaches an optimal solution.
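The abstract does not give implementation details, but the idea it describes can be sketched as follows: extract PLS weight vectors (directions of maximal input-output covariance), use one per hidden node as its input weights, and rescale so pre-activations land in the active region of the sigmoid/tanh. The function names, the NIPALS variant, and the unit-variance scaling rule below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def pls_weights(X, y, n_components):
    """NIPALS-style PLS for a single response: returns the weight
    vectors w_1..w_a as columns (n_features x n_components)."""
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W = []
    for _ in range(n_components):
        w = Xk.T @ yk                       # direction of maximal covariance with y
        w /= np.linalg.norm(w)
        t = Xk @ w                          # score vector
        p = Xk.T @ t / (t @ t)              # X loading
        Xk = Xk - np.outer(t, p)            # deflate X
        yk = yk - (yk @ t / (t @ t)) * t    # deflate y
        W.append(w)
    return np.column_stack(W)

def init_hidden_layer(X, y, n_hidden):
    """Initialize input-to-hidden weights from PLS directions; rescaling to
    unit-variance pre-activations keeps tanh outputs away from saturation
    (an assumed realization of 'outputs in the active region')."""
    W = pls_weights(X, y, n_hidden)          # (n_features, n_hidden)
    scores = (X - X.mean(axis=0)) @ W
    return W / scores.std(axis=0)            # column j = weights of hidden node j

# Illustrative data with an unknown-to-the-network linear relationship
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=50)
W_init = init_hidden_layer(X, y, n_hidden=3)
Z = np.tanh((X - X.mean(axis=0)) @ W_init)   # hidden activations stay inside (-1, 1)
```

Here the number of PLS components doubles as the number of hidden nodes, matching the abstract's claim that the method sets both the initial weights and the hidden-layer size.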
Keywords :
convergence; feedforward neural nets; learning (artificial intelligence); mean square error methods; convergence; feedforward neural network; partial least square algorithm; root mean square error; weight initialization; Convergence; Covariance matrix; Cybernetics; Educational institutions; Feedforward neural networks; Least squares methods; Linear regression; Machine learning; Management information systems; Matrices; Matrix decomposition; Neural networks; Vectors; Feedforward neural networks; PLS-FNN; partial least squares; weight initialization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2006 International Conference on Machine Learning and Cybernetics
Conference_Location :
Dalian, China
Print_ISBN :
1-4244-0061-9
Type :
conf
DOI :
10.1109/ICMLC.2006.258402
Filename :
4028601