DocumentCode
866345
Title
Can threshold networks be trained directly?
Author
Huang, Guang-Bin ; Zhu, Qin-Yu ; Mao, K.Z. ; Siew, Chee-Kheong ; Saratchandran, P. ; Sundararajan, N.
Author_Institution
Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore
Volume
53
Issue
3
fYear
2006
fDate
3/1/2006 12:00:00 AM
Firstpage
187
Lastpage
191
Abstract
Neural networks with threshold activation functions are highly desirable because of the ease of hardware implementation. However, popular gradient-based learning algorithms cannot be used to train these networks directly, since threshold functions are nondifferentiable. Methods available in the literature mainly approximate the threshold activation functions with sigmoid functions. In this paper, we show theoretically that the recently developed extreme learning machine (ELM) algorithm can be used to train neural networks with threshold functions directly, instead of approximating them with sigmoid functions. Experimental results on real-world benchmark regression problems demonstrate that the generalization performance obtained by ELM is better than that of other algorithms used in threshold networks. In addition, the ELM method needs no control variables (manually tuned parameters) and is much faster.
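The direct-training idea in the abstract can be illustrated with a minimal NumPy sketch: hidden weights and biases are assigned randomly and never tuned, the hidden layer uses a hard-threshold (unit-step) activation, and the output weights are solved analytically with the Moore-Penrose pseudoinverse, so no gradient through the nondifferentiable activation is ever needed. The toy data, neuron count, and variable names here are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()

L = 50  # number of hidden threshold neurons (illustrative choice)

# Step 1: random hidden weights and biases, fixed once (never tuned).
W = rng.normal(size=(X.shape[1], L))
b = rng.normal(size=L)

# Step 2: hidden-layer output matrix with a hard-threshold activation.
# The step function is nondifferentiable, but that is irrelevant here
# because no gradient is ever propagated through it.
H = (X @ W + b > 0).astype(float)

# Step 3: output weights in one analytic step via the pseudoinverse.
beta = np.linalg.pinv(H) @ y

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

Because the only learned parameters (`beta`) sit after the threshold layer, the network is trained in a single least-squares solve, which is also why the method needs no manually tuned control variables such as a learning rate.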
Keywords
gradient methods; learning (artificial intelligence); neural nets; regression analysis; threshold elements; ELM algorithm; extreme learning machine; gradient descent method; neural networks; real-world benchmark regression problems; threshold networks; Analog computers; Computational efficiency; Computer networks; Learning systems; Machine learning; Neural network hardware; Neural networks; Neurons; Process control; System-on-a-chip; Extreme learning machine (ELM); gradient descent method; threshold neural networks
fLanguage
English
Journal_Title
Circuits and Systems II: Express Briefs, IEEE Transactions on
Publisher
IEEE
ISSN
1549-7747
Type
jour
DOI
10.1109/TCSII.2005.857540
Filename
1605431
Link To Document