1.4 Numerical Example and Programming Notes




Rosenbrock's function is often used as a minimization test problem for nonlinear continuous problems, since its minimum lies at the base of a ``banana-shaped valley'' and can be difficult to locate. This function is defined for even integers $n$ as the following sum:

$$ f(x) \;=\; \sum_{i=1}^{n/2} \left[\, 100\,\bigl(x_{2i} - x_{2i-1}^2\bigr)^2 \;+\; \bigl(1 - x_{2i-1}\bigr)^2 \,\right]. $$

The contour plot of Rosenbrock's function for $n = 2$ is shown in Figure 3. In general, contour maps show surfaces in the $n$-dimensional space defined by $f(x) = c$, where $c$ is a constant. For $n = 2$, plane curves correspond to various values of $c$. We see from the figure that the minimum point (dark circle) is at $x^* = (1, 1)$, where $f(x^*) = 0$. The gradient components of this function are given by

$$ \frac{\partial f}{\partial x_{2i-1}} \;=\; -400\,x_{2i-1}\bigl(x_{2i} - x_{2i-1}^2\bigr) \;-\; 2\bigl(1 - x_{2i-1}\bigr), \qquad \frac{\partial f}{\partial x_{2i}} \;=\; 200\,\bigl(x_{2i} - x_{2i-1}^2\bigr), \qquad i = 1, \ldots, n/2, $$

and the Hessian is the block diagonal matrix with $2 \times 2$ blocks whose entries are

$$ H_{2i-1,\,2i-1} \;=\; 1200\,x_{2i-1}^2 - 400\,x_{2i} + 2, \qquad H_{2i-1,\,2i} \;=\; H_{2i,\,2i-1} \;=\; -400\,x_{2i-1}, \qquad H_{2i,\,2i} \;=\; 200. $$

(These formulas are given in a form most efficient for programming.) For $n = 2$, the two eigenvalues of the Hessian at the minimum are $\lambda_{\max} \approx 1001.6$ and $\lambda_{\min} \approx 0.4$, and thus the condition number $\kappa = \lambda_{\max}/\lambda_{\min} \approx 2508$. The function contours, whose axis lengths are proportional to the inverse square roots of the eigenvalues, are thus quite elongated near the minimum (see Figure 3).
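These values are easy to verify. The short program below is only an illustrative sketch (the program name CONDNO and the output layout are not taken from the original text): it forms the $2 \times 2$ Hessian at the minimizer $(1, 1)$ from the block entries above and computes its eigenvalues and condition number from the characteristic polynomial of a symmetric $2 \times 2$ matrix.

      PROGRAM CONDNO
C     Illustrative sketch: build the 2x2 Hessian of Rosenbrock's
C     function at the minimizer (1,1) and compute its eigenvalues
C     and condition number via the characteristic polynomial.
      DOUBLE PRECISION H11, H12, H22, TR, DET, DISC, EV1, EV2
C     Block entries evaluated at x(1) = x(2) = 1.
      H11 = 1200.0D0 - 400.0D0 + 2.0D0
      H12 = -400.0D0
      H22 = 200.0D0
      TR   = H11 + H22
      DET  = H11*H22 - H12*H12
      DISC = SQRT(TR*TR - 4.0D0*DET)
      EV1  = (TR + DISC)/2.0D0
      EV2  = (TR - DISC)/2.0D0
      WRITE(*,*) 'eigenvalues:      ', EV1, EV2
      WRITE(*,*) 'condition number: ', EV1/EV2
      END

The computed values agree with the eigenvalues and condition number quoted above.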


Figure 3: Contours of the Two-Dimensional Rosenbrock Function.

In minimization applications, the user is often required to write subroutines that compute the objective function and its first and second derivatives (the latter optional) at each given point.
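As a concrete illustration, here is a minimal sketch of such a routine for Rosenbrock's function, returning the function value and gradient together from the formulas above; the routine name FUNGRD and its argument list are hypothetical, not the calling interface of any particular package.

      SUBROUTINE FUNGRD(N, X, F, G)
C     Illustrative sketch (hypothetical name and interface) of a
C     user-supplied routine that evaluates Rosenbrock's function F
C     and its gradient G at the point X; N is assumed even.
      INTEGER N, I
      DOUBLE PRECISION X(N), F, G(N), T1, T2
      F = 0.0D0
      DO 10 I = 1, N, 2
C        Reuse the common factors, as suggested by the formulas above.
         T1 = X(I+1) - X(I)*X(I)
         T2 = 1.0D0 - X(I)
         F  = F + 100.0D0*T1*T1 + T2*T2
         G(I)   = -400.0D0*X(I)*T1 - 2.0D0*T2
         G(I+1) = 200.0D0*T1
   10 CONTINUE
      RETURN
      END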

Note that the Hessian is stored in two one-dimensional arrays that reside in a COMMON block; one natural split for this block diagonal Hessian is sketched below. A storage format is often not imposed on the Hessian for large-scale problems, so that the user can exploit problem structure (e.g., sparsity) to save storage space.
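A minimal sketch of one such storage scheme follows; the array names HDIAG and HOFF, the COMMON block /HESCOM/, the assumed maximum dimension NMAX, and the choice to keep the diagonal and off-diagonal block entries in separate arrays are all illustrative assumptions, not requirements of any particular code.

      SUBROUTINE HESS(N, X)
C     Illustrative sketch: store the block diagonal Hessian of
C     Rosenbrock's function in two one-dimensional arrays kept in a
C     COMMON block.  HDIAG holds the N diagonal entries and HOFF the
C     N/2 off-diagonal entries (one per 2x2 block); all names and the
C     bound NMAX are hypothetical.
      INTEGER NMAX
      PARAMETER (NMAX = 1000)
      INTEGER N, I
      DOUBLE PRECISION X(N)
      DOUBLE PRECISION HDIAG(NMAX), HOFF(NMAX/2)
      COMMON /HESCOM/ HDIAG, HOFF
      DO 10 I = 1, N, 2
         HDIAG(I)   = 1200.0D0*X(I)*X(I) - 400.0D0*X(I+1) + 2.0D0
         HDIAG(I+1) = 200.0D0
         HOFF((I+1)/2) = -400.0D0*X(I)
   10 CONTINUE
      RETURN
      END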