Title :
Generalized Levenberg-Marquardt neural nets for minimization of quasiconvex scalar functions
Author :
Pazos, Fernando A. ; Bhaya, Amit ; Kaszkurewicz, Eugenius
Author_Institution :
Dept. of Electr. Eng., Fed. Univ. of Rio de Janeiro, Rio de Janeiro, Brazil
Abstract :
Neural nets that minimize quasiconvex scalar functions are designed as dynamical systems (ordinary differential equations) corresponding to well-known discrete-time algorithms such as steepest descent, Newton, and Levenberg-Marquardt. The main contribution is a generalization of the Levenberg-Marquardt algorithm, including an adaptive version, that combines desirable features of the Newton and Levenberg-Marquardt algorithms and yields trajectories that converge faster to the minimum of a quasiconvex objective function whose gradient and Hessian are assumed known.
Keywords :
Hessian matrices; Newton method; differential equations; gradient methods; minimisation; neural nets; nonlinear dynamical systems; Newton algorithm; discrete time algorithm; dynamical systems; generalized Levenberg-Marquardt neural nets; ordinary differential equation; quasiconvex scalar function minimization; Adaptation models; Convergence; Eigenvalues and eigenfunctions; Neural networks; Trajectory; Vectors; Neural network; nonlinear systems; optimization;
Conference_Title :
2011 Third World Congress on Nature and Biologically Inspired Computing (NaBIC)
Conference_Location :
Salamanca
Print_ISBN :
978-1-4577-1122-0
DOI :
10.1109/NaBIC.2011.6089602
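Note : 
The abstract describes continuous-time (ODE) analogues of discrete optimization methods. A minimal sketch of one such flow, the damped-Newton/Levenberg-Marquardt form x' = -(H(x) + mu*I)^{-1} grad f(x), integrated by forward Euler, is shown below. The test function, step size, and damping value are illustrative assumptions, not taken from the paper, and this is not the authors' generalized or adaptive scheme.

```python
import numpy as np

def lm_flow(grad, hess, x0, mu=1e-2, dt=0.1, steps=500):
    """Forward-Euler integration of the Levenberg-Marquardt flow
    x' = -(H(x) + mu*I)^{-1} grad f(x).
    mu > 0 damps the Newton direction so the linear solve stays
    well-posed even where the Hessian is singular or indefinite.
    All parameter values here are illustrative choices."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(steps):
        g = grad(x)
        H = hess(x)
        # One Euler step of the ODE; solve instead of inverting.
        x = x - dt * np.linalg.solve(H + mu * np.eye(n), g)
    return x

# Example: quadratic f(x) = 0.5 x^T A x, whose gradient and
# Hessian are known in closed form, with minimum at the origin.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
grad_f = lambda x: A @ x
hess_f = lambda x: A
x_min = lm_flow(grad_f, hess_f, x0=[4.0, -3.0])
```

With mu small, (H + mu*I)^{-1} is close to the Newton preconditioner, so the trajectory contracts toward the minimizer at a rate nearly independent of the conditioning of A; this is the feature the continuous-time viewpoint makes easy to analyze.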