DocumentCode :
2288058
Title :
A theoretically sound learning algorithm for constructive neural networks
Author :
Kwok, Tin-Yau ; Yeung, Dit-Yan
Author_Institution :
Dept. of Comput. Sci., Hong Kong Univ. of Sci. & Technol., Kowloon, Hong Kong
fYear :
1994
fDate :
13-16 Apr 1994
Firstpage :
389
Abstract :
In this paper, we analyse the problem of learning in constructive neural networks from a Hilbert space point of view. A novel objective function for training new hidden units with a greedy approach is derived. More importantly, we prove that a network constructed incrementally in this way still possesses the universal approximation property with respect to the L2 performance criterion. Whereas theoretical results obtained so far on the universal approximation capabilities of multilayer feedforward networks provide only existence proofs, our results go one step further by giving a theoretically sound procedure for constructive approximation that preserves the universal approximation property.
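The abstract does not reproduce the derived objective function, so the following Python sketch only illustrates the general greedy constructive idea it describes: hidden units are added one at a time, each chosen to match the current residual as closely as possible in the L2 sense and attached with the L2-optimal output weight, so the training error is non-increasing. The candidate-pool search, the residual-correlation score, and all names below are illustrative assumptions, not the paper's actual algorithm.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_unit(X, residual, n_candidates=500):
    # From a random candidate pool, keep the hidden unit whose output best
    # matches the current residual (largest normalised squared correlation).
    # This is a stand-in criterion, not the objective derived in the paper.
    n, d = X.shape
    best = None
    for _ in range(n_candidates):
        w = rng.normal(scale=2.0, size=d)
        b = rng.normal(scale=2.0)
        h = sigmoid(X @ w + b)
        score = (h @ residual) ** 2 / (h @ h + 1e-12)
        if best is None or score > best[0]:
            beta = (h @ residual) / (h @ h + 1e-12)  # L2-optimal output weight
            best = (score, w, b, beta)
    return best[1], best[2], best[3]

def greedy_fit(X, y, n_units=15):
    # Grow the network one unit at a time; each new term is an L2 projection
    # of the residual, so the training error cannot increase.
    units, residual = [], y.copy()
    for _ in range(n_units):
        w, b, beta = add_unit(X, residual)
        units.append((w, b, beta))
        residual = residual - beta * sigmoid(X @ w + b)
        print(f"units={len(units):2d}  L2 error={np.sqrt(np.mean(residual**2)):.4f}")
    return units

# Toy usage: approximate a 1-D target function.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 0]
greedy_fit(X, y)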
Keywords :
approximation theory; feedforward neural nets; function approximation; learning (artificial intelligence); Hilbert space; L2 performance criteria; constructive neural networks; greedy approach; learning algorithm; multilayer feedforward networks; objective function; universal approximation; Computer science; Councils; Feedforward neural networks; Feedforward systems; Hilbert space; Multi-layer neural network; Neural networks; Testing; Thumb;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 1994 International Symposium on Speech, Image Processing and Neural Networks (ISSIPNN '94)
Print_ISBN :
0-7803-1865-X
Type :
conf
DOI :
10.1109/SIPNN.1994.344886
Filename :
344886