Title :
Minimum-seeking properties of analog neural networks with multilinear objective functions
Author_Institution :
Centre for AI & Robotics, Bangalore, India
Abstract :
Studies the problem of minimizing an objective function over the discrete set {0,1}^n. It is shown that one can assume without loss of generality that the objective function is a multilinear polynomial. A gradient-type neural network is proposed to perform the optimization. A novel feature of the network is the introduction of a bias vector. The network is operated in the high-gain region of the sigmoidal nonlinearities. The following comprehensive theorem is proved: for all sufficiently small bias vectors except those belonging to a set of measure zero, for all sufficiently large sigmoidal gains, and for all initial conditions except those belonging to a set of measure zero, the state of the network converges to a local minimum of the objective function. This is a considerable generalization of earlier results for quadratic objective functions.
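To make the abstract's setup concrete, the following is a minimal sketch of the kind of gradient-type analog network it describes: neuron outputs x_i = sigma(lambda * u_i) evolve under the assumed dynamics du/dt = -grad f(x) + b, where f is a multilinear polynomial on [0,1]^n and b is a small bias vector. The example objective, the gain and bias values, and the function names (run_network, etc.) are illustrative assumptions, not the paper's exact equations or parameters.

```python
import numpy as np

# Illustrative multilinear objective on [0,1]^3 (assumed example, not from the paper):
# f(x) = 3*x1*x2 - 2*x2*x3 + x1*x3 - x1
def f(x):
    x1, x2, x3 = x
    return 3*x1*x2 - 2*x2*x3 + x1*x3 - x1

def grad_f(x):
    x1, x2, x3 = x
    return np.array([3*x2 + x3 - 1.0,   # df/dx1
                     3*x1 - 2*x3,       # df/dx2
                     x1 - 2*x2])        # df/dx3

def sigmoid(z, gain):
    a = np.clip(gain * z, -60.0, 60.0)  # clip to avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-a))

def run_network(gain=50.0, bias_scale=1e-3, dt=1e-3, steps=100_000, seed=0):
    """Euler-integrate the assumed gradient-flow dynamics u' = -grad f(x) + b."""
    rng = np.random.default_rng(seed)
    b = bias_scale * rng.standard_normal(3)   # small "generic" bias vector
    u = 0.1 * rng.standard_normal(3)          # generic initial internal state
    for _ in range(steps):
        x = sigmoid(u, gain)                  # neuron outputs in (0,1)
        u += dt * (-grad_f(x) + b)            # assumed high-gain gradient dynamics
    return sigmoid(u, gain)

if __name__ == "__main__":
    x_final = run_network()
    vertex = np.round(x_final).astype(int)    # nearest vertex of {0,1}^3
    print("final state:", x_final, "-> vertex", vertex, "f(vertex) =", f(vertex))
```

Under this sketch, the network state typically saturates near a vertex of the hypercube; the paper's theorem asserts that, for generic small bias and large enough gain, that vertex is a local minimum of f over {0,1}^n for almost every initial condition.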
Keywords :
minimisation; neural nets; optimisation; analog neural networks; bias vector; gradient type neural network; local minimum; minimum-seeking properties; multilinear objective functions; multilinear polynomial; optimization; Artificial intelligence; Ear; Electrical capacitance tomography; Electronic mail; Gain measurement; Neural networks; Neurofeedback; Neurons; Polynomials; Robots;
Conference_Title :
Proceedings of the 32nd IEEE Conference on Decision and Control, 1993
Conference_Location :
San Antonio, TX
Print_ISBN :
0-7803-1298-8
DOI :
10.1109/CDC.1993.325800