Two common gradient methods are Steepest Descent (SD) [45][7] and Conjugate Gradient (CG) [36][32]. Both are fundamental techniques that serve as building blocks for iterative algorithms across many areas of scientific computing. For a theoretical treatment and notes on parallel implementations, see the linear algebra chapter.

- 2.4.1 Derivative Programming
- 2.4.2 Steepest Descent
- 2.4.3 Conjugate Gradient
- 2.4.4 Preconditioning
- 2.4.5 Nonlinear Conjugate Gradient
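To make the two methods concrete before the subsections above treat them in detail, here is a minimal sketch of SD and CG for solving a symmetric positive definite system $Ax = b$. The function names, tolerances, and the small test matrix are illustrative choices, not part of the original text; the update formulas are the textbook versions of each method.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    # Minimize f(x) = (1/2) x^T A x - b^T x by stepping along the
    # residual r = b - A x, which is the negative gradient of f.
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))  # exact line search for SPD A
        x = x + alpha * r
    return x

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    # CG keeps successive search directions A-conjugate, so in exact
    # arithmetic it converges in at most n steps for an n x n SPD system.
    x = x0.astype(float).copy()
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs_old) * p  # new direction, A-conjugate to p
        rs_old = rs_new
    return x

if __name__ == "__main__":
    # Small SPD example system, chosen for illustration only.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x0 = np.zeros(2)
    print(steepest_descent(A, b, x0))
    print(conjugate_gradient(A, b, x0))
```

Both routines return an approximation to $A^{-1}b$; CG typically needs far fewer iterations than SD on ill-conditioned systems, which is what motivates the preconditioning discussion in 2.4.4.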