Abstract:
A method is presented to compute the inverse of the Hessian matrix in an optimization problem using the conjugate gradient algorithm of Hestenes and Stiefel [1]. It is shown how this may be used to refine the solution by a Newton-Raphson iteration for both finite- and infinite-dimensional optimization problems.
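As a rough illustration of the idea summarized above, the sketch below (not taken from the paper) uses the linear conjugate gradient method of Hestenes and Stiefel to solve the Newton system H p = -g with only Hessian-vector products, which implicitly applies the inverse Hessian, and then takes one Newton-Raphson refinement step. The names `cg_solve` and `newton_cg_step`, and the example quadratic objective, are illustrative assumptions.

```python
import numpy as np

def cg_solve(hess_vec, b, tol=1e-10, max_iter=None):
    """Solve H x = b by conjugate gradients, using only Hessian-vector products."""
    n = b.size
    if max_iter is None:
        max_iter = n
    x = np.zeros(n)
    r = b.copy()          # residual b - H x (with x = 0 initially)
    p = r.copy()          # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Hp = hess_vec(p)
        alpha = rs_old / (p @ Hp)   # exact step along p for the quadratic model
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new conjugate direction
        rs_old = rs_new
    return x

def newton_cg_step(x, grad, hess_vec):
    """One Newton-Raphson refinement: solve H p = -g by CG, then x <- x + p."""
    p = cg_solve(hess_vec, -grad(x))
    return x + p

# Usage on a convex quadratic f(x) = 0.5 x^T A x - b^T x, whose Hessian is A:
# a single Newton step recovers the exact minimizer A^{-1} b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b
hess_vec = lambda v: A @ v
x = newton_cg_step(np.zeros(2), grad, hess_vec)
print(x, np.allclose(A @ x, b))
```

For a general (non-quadratic) objective the same pattern would be repeated at each iterate, with `hess_vec` supplying products with the current Hessian; in infinite-dimensional settings the vectors are replaced by elements of the underlying function space.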