
2.5.3 Quasi-Newton

QN methods form an interesting class of algorithms that are theoretically closely related to nonlinear CG methods and perform well in practice. Surprising recent discoveries have further increased their appeal.

As extensions of nonlinear CG methods, QN methods use additional curvature information to accelerate convergence, while keeping memory and computational requirements as low as possible. Essentially, curvature information is built up progressively: at each step of the algorithm, the current approximation to the Hessian (or inverse Hessian) is updated using new gradient information. (The updated matrix itself is not necessarily stored explicitly, since the updating procedure may be defined compactly in terms of a small set of stored vectors.)
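As an illustration (not part of the original text), the following minimal Python sketch shows the classical BFGS update of an inverse-Hessian approximation; the names H, s, y, and bfgs_update are assumptions chosen to match standard notation, and the final check anticipates the secant equation derived below.

    import numpy as np

    def bfgs_update(H, s, y):
        # BFGS update of the inverse-Hessian approximation H, where
        #   s = x_{k+1} - x_k                 (step taken)
        #   y = grad f(x_{k+1}) - grad f(x_k) (change in gradient)
        # Only the vectors s and y carry the new gradient information;
        # the update is a rank-two correction of H.
        rho = 1.0 / (y @ s)        # requires the curvature condition y @ s > 0
        I = np.eye(len(s))
        V = I - rho * np.outer(s, y)
        return V @ H @ V.T + rho * np.outer(s, s)

    # Usage on made-up vectors (illustrative only):
    H = np.eye(2)
    s = np.array([1.0, 0.5])
    y = np.array([2.0, 1.5])
    H_new = bfgs_update(H, s, y)
    print(np.allclose(H_new @ y, s))   # True: H_new maps y back to s

In limited-memory variants such as L-BFGS, the matrix H is never formed explicitly; the same update is applied implicitly from the last few stored (s, y) pairs, which is what the parenthetical remark about a small set of stored vectors refers to.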

For the Taylor expansion of the gradient,

    \nabla f(x_{k+1}) = \nabla f(x_k) + \nabla^2 f(x_k)\,(x_{k+1} - x_k) + O(\|x_{k+1} - x_k\|^2),

we obtain the following relation if \nabla^2 f were a constant matrix (equivalently, if f were a quadratic), equal to B:

    y_k = B\,s_k,

where s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k). This is the secant equation that the updated Hessian approximation is required to satisfy.
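As a quick numerical confirmation (added here as an illustration; the matrix B, the vector c, and the points below are arbitrary made-up data), the gradient differences of a quadratic f(x) = (1/2) x^T B x + c^T x satisfy y_k = B s_k exactly, for any pair of points:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = A @ A.T + 3 * np.eye(3)      # symmetric positive-definite Hessian
    c = rng.standard_normal(3)

    grad = lambda x: B @ x + c       # gradient of f(x) = 0.5 x^T B x + c^T x

    x0 = rng.standard_normal(3)
    x1 = rng.standard_normal(3)
    s = x1 - x0                      # step s_k
    y = grad(x1) - grad(x0)          # gradient change y_k

    # The Hessian of a quadratic is constant, so y = B s holds exactly:
    print(np.allclose(y, B @ s))     # True

For a general nonlinear f the relation holds only approximately, which is precisely why QN methods impose it on the approximation rather than on the true Hessian.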