DocumentCode :
2693300
Title :
Synthesis of neural networks with linear programs
Author :
Ursic, Silvio
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
379
Abstract :
The two critical steps of choosing an initial neural network configuration and then choosing a collection of network weights so that the network best approximates a given training set can be formulated as a linear program. The inequalities necessary to construct the linear program are subsets of Boolean symmetric functions, naturally implementable with threshold logic devices. The construction bypasses the local-minima problems encountered by current training algorithms. The training process becomes a linear programming problem whose solution provides the sought approximation. The construction also provides clear methods of trading off the final approximation precision provided by the network against the computing time needed to obtain and use it.
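A minimal sketch of the general idea (not the paper's specific construction, which builds its inequalities from Boolean symmetric functions): choosing the weights of a single linear threshold unit so that it best fits a training set can be posed as a linear program by minimizing the worst-case deviation over the training examples. The data and variable names below are illustrative assumptions only.

import numpy as np
from scipy.optimize import linprog

# Chebyshev-approximation LP: minimize t subject to
#   -t <= (w . x_i + b) - y_i <= t   for every training example i.
# Toy training set (assumed, for illustration): inputs X, targets y (Boolean OR).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 1.0])

n, d = X.shape
# Decision variables: [w_1 .. w_d, b, t]; the objective minimizes t.
c = np.zeros(d + 2)
c[-1] = 1.0

# (w . x_i + b) - y_i <= t   ->   w.x_i + b - t <= y_i
A_upper = np.hstack([X, np.ones((n, 1)), -np.ones((n, 1))])
# y_i - (w . x_i + b) <= t   ->  -w.x_i - b - t <= -y_i
A_lower = np.hstack([-X, -np.ones((n, 1)), -np.ones((n, 1))])
A_ub = np.vstack([A_upper, A_lower])
b_ub = np.concatenate([y, -y])

# Weights and bias are free; the deviation t is nonnegative.
bounds = [(None, None)] * (d + 1) + [(0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

w, b, t = res.x[:d], res.x[d], res.x[-1]
print("weights:", w, "bias:", b, "max deviation:", t)

Because the problem is a linear program, the solver returns a global optimum directly, which is the sense in which such a formulation avoids the local minima of gradient-based training.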
Keywords :
computational complexity; linear programming; neural nets; Boolean symmetric functions; approximation; inequalities; linear programs; network weights; neural networks; threshold logic devices; training algorithms; training set
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137597
Filename :
5726557