Title :
Some experiments on training multilayer feedforward neural networks of hard-limiting units using random weights
Author :
Jin, Fu ; Chai, Zhen-Ming
Author_Institution :
Inst. of Electron., Acad. Sinica, Beijing, China
Abstract :
Training multilayer feedforward neural networks of hard-limiting units remains a problem because the output functions are nondifferentiable. However, if zero-mean Gaussian noise is added to each weight, the weights become random variables with smooth distribution functions, so gradient methods can be used to optimize them. This method was first presented by P.L. Bartlett and T. Downs (1992). To evaluate its effectiveness, the authors performed experiments on training one-, two- and three-layer feedforward neural networks. The experimental results show that the method of Bartlett and Downs is effective for training one- and two-layer networks, but is very time-consuming for training networks with three (or more) layers.
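To make the smoothing idea concrete, here is a minimal sketch (not the authors' code) for a single hard-limiting unit, where the expectation over the weight noise has a closed form: with zero-mean Gaussian noise of standard deviation sigma added to the weights, the expected sign output becomes an erf-shaped, differentiable function of the mean weights. The multilayer case treated in the paper requires further derivations from Bartlett and Downs; the toy data, noise level, and learning rate below are illustrative assumptions.

```python
# Sketch (assumption, not the paper's implementation): train one hard-limiting
# unit via weight-noise smoothing. With e ~ N(0, sigma^2 I) added to weights w,
#   E[sign((w + e) . x)] = erf( (w . x) / (sigma * ||x|| * sqrt(2)) ),
# which is smooth in w, so ordinary gradient descent applies.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)

# Toy linearly separable data: labels from a fixed random hyperplane (assumption).
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true)

w = 0.1 * rng.normal(size=d)   # mean weights to be learned
sigma = 0.5                    # weight-noise standard deviation (hyperparameter)
lr = 0.1

for epoch in range(200):
    norms = np.linalg.norm(X, axis=1)
    z = (X @ w) / (sigma * norms * np.sqrt(2.0))
    out = erf(z)                       # smoothed (expected) output in (-1, 1)
    err = out - y                      # gradient of 0.5*(out - y)^2 w.r.t. out
    # d erf(z)/dz = 2/sqrt(pi) * exp(-z^2); chain rule back to the mean weights
    dz_dw = X / (sigma * norms * np.sqrt(2.0))[:, None]
    grad = ((err * 2.0 / np.sqrt(np.pi) * np.exp(-z**2))[:, None] * dz_dw).mean(axis=0)
    w -= lr * grad

# Evaluate the trained mean weights with the actual hard-limiting unit (no noise).
acc = np.mean(np.sign(X @ w) == y)
print(f"training accuracy with hard-limit output: {acc:.3f}")
```

Training is done on the smoothed expectation, but the network is evaluated with the hard-limiting output itself; for deeper networks the corresponding expectations must be propagated through every layer, which is where the reported computational cost grows.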
Keywords :
computational complexity; feedforward neural nets; learning (artificial intelligence); random noise; Gaussian noise; gradients; hard-limiting units; multilayer feedforward neural networks; nondifferentiable output functions; optimization; random variables; random weights; smooth distribution functions; training; Density functional theory; Distribution functions; Feedforward neural networks; Gaussian distribution; Gaussian noise; Logic; Multi-layer neural network; Neural networks; Radio access networks; Random variables;
Conference_Titel :
Proceedings of the 1994 International Symposium on Speech, Image Processing and Neural Networks (ISSIPNN '94)
Print_ISBN :
0-7803-1865-X
DOI :
10.1109/SIPNN.1994.344879