Title :
Synthesis of neural networks with linear programs
Abstract :
The two critical steps of choosing an initial neural network configuration and then choosing a collection of network weights so that the network best approximates a given training set can be formulated as a linear program. The inequalities necessary to construct the linear program are subsets of Boolean symmetric functions, naturally implementable with threshold logic devices. The construction bypasses the local-minima problems that afflict current training algorithms: the training process becomes a linear programming problem whose solution provides the sought approximation. The construction also provides clear methods of trading off the final approximation precision provided by the network against the computing time needed to obtain and use it.
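As an illustration of the abstract's central claim, here is a minimal sketch of posing weight selection as a linear program. It fits a single linear (threshold-style) unit by minimising the worst-case training error, a standard Chebyshev-approximation LP; the variable names, the use of `scipy.optimize.linprog`, and the min-max objective are assumptions for illustration, not the paper's exact construction.

```python
# Hedged sketch: choose the weights of one linear unit by solving an LP
# that minimises the maximum absolute error over the training set:
#     minimise t  subject to  |w.x_i + b - y_i| <= t  for every sample i.
# This is an illustrative formulation, not the paper's specific method.
import numpy as np
from scipy.optimize import linprog

def fit_unit_lp(X, y):
    """Return (w, b, t) minimising max_i |w.x_i + b - y_i| via an LP."""
    n, d = X.shape
    # Decision vector z = [w (d entries), b, t]; objective: minimise t.
    c = np.zeros(d + 2)
    c[-1] = 1.0
    # Each |.| <= t constraint splits into two linear inequalities:
    #   w.x_i + b - t <= y_i    and    -w.x_i - b - t <= -y_i
    A_ub = np.vstack([
        np.hstack([X, np.ones((n, 1)), -np.ones((n, 1))]),
        np.hstack([-X, -np.ones((n, 1)), -np.ones((n, 1))]),
    ])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * (d + 1) + [(0, None)]  # w, b free; t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:d], res.x[d], res.x[d + 1]

# Toy training set generated by y = 2*x1 - x2 + 1, so the LP can
# drive the worst-case error t to (numerically) zero.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = 2 * X[:, 0] - X[:, 1] + 1
w, b, t = fit_unit_lp(X, y)
```

Because every constraint and the objective are linear in the unknowns, an LP solver returns a global optimum, which is the sense in which the construction avoids local minima.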
Keywords :
computational complexity; linear programming; neural nets; Boolean symmetric functions; approximation; inequalities; linear programs; network weights; neural networks; threshold logic devices; training algorithms; training set
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137597