Title :
The controlled conjugate gradient type trajectory-following neural net for minimization of nonconvex functions.
Author :
Bhaya, Amit ; Pazos, Fernando ; Kaszkurewicz, Eugenius
Author_Institution :
Dept. of Electr. Eng., Fed. Univ. of Rio de Janeiro, Rio de Janeiro, Brazil
Abstract :
This paper presents a unified way to design neural networks, characterized as second-order ordinary differential equations (ODEs), whose trajectories can converge to the global minimum of nonconvex scalar functions. These neural networks, sometimes also called continuous-time algorithms, are interpreted as closed-loop control systems, and the state feedback design is based on control Lyapunov functions. The focus is on a new family of continuous-time versions of the conjugate gradient method, named controlled conjugate gradient (CCG) nets, which generalize heavy ball with friction (HBF) nets. For nonconvex functions, the goal of these nets is to produce trajectories that start from an arbitrary initial point and can escape from local minima, thereby increasing the chances of converging to the global minimum. Several numerical examples on benchmark problems show that this escape, and subsequent convergence to the global minimum, does occur.
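As a rough illustration of the second-order dynamics described in the abstract, the following minimal Python sketch integrates the classical heavy ball with friction ODE x'' + gamma*x' + grad f(x) = 0 on the one-dimensional Rastrigin benchmark. This is an assumption-based sketch, not the authors' CCG formulation (whose state-feedback control term is not given in this record); the test function, damping coefficient gamma, step size h, and horizon are all illustrative choices.

    # Heavy ball with friction (HBF) dynamics, x'' + gamma*x' + grad_f(x) = 0,
    # integrated with semi-implicit Euler on the 1-D Rastrigin function.
    # NOTE: the test function, gamma, h, and steps are illustrative assumptions;
    # the paper's CCG nets add a state-feedback control term not reproduced here.
    import numpy as np

    def grad_rastrigin(x):
        # gradient of f(x) = x**2 + 10*(1 - cos(2*pi*x)); global minimum at x = 0
        return 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)

    def hbf_trajectory(x0, v0=0.0, gamma=1.0, h=1e-3, steps=200_000):
        # Rewrite the second-order ODE as the first-order system
        # x' = v, v' = -gamma*v - grad_f(x), and step it forward in time.
        x, v = x0, v0
        for _ in range(steps):
            v += h * (-gamma * v - grad_rastrigin(x))
            x += h * v
        return x

    # Released with enough initial velocity, the inertial term can carry the
    # trajectory over barriers that plain gradient descent cannot cross; the
    # basin it finally settles in depends on gamma and the initial energy.
    print(hbf_trajectory(x0=3.0, v0=-10.0))

Semi-implicit Euler (updating the velocity before the position) is used here only because it is more stable than explicit Euler for oscillatory dynamics at this step size; any standard ODE integrator would serve the same illustrative purpose.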
Keywords :
concave programming; conjugate gradient methods; continuous time systems; differential equations; minimisation; neural nets; Lyapunov functions; closed-loop control systems; conjugate gradient method; continuous-time algorithms; controlled conjugate gradient nets; heavy ball with friction nets; nonconvex function minimization; nonconvex scalar functions; second-order ordinary differential equations; state feedback design; trajectory-following neural net
Conference_Titel :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona, Spain
Print_ISBN :
978-1-4244-6916-1
DOI :
10.1109/IJCNN.2010.5596365