5.1.3 Annealing Schedule




 

Through equation (63) or (69), the annealing schedule determines the degree of uphill movement permitted during the search and is thus critical to the algorithm's performance. The principle underlying the choice of a suitable annealing schedule is easily stated: the initial temperature should be high enough to 'melt' the system completely, and should be reduced towards its 'freezing point' as the search progresses. In practice, however, "choosing an annealing schedule for practical purposes is still something of a black art" (Bounds [6]).

The standard implementation of the SA algorithm is one in which homogeneous Markov chains of finite length are generated at decreasing temperatures. The following parameters should therefore be specified:

Initial Temperature

Kirkpatrick [41] suggested that a suitable initial temperature $T_0$ is one that results in an average acceptance probability $\chi_0$ for uphill moves of about 0.8. The value of $T_0$ will clearly depend on the scaling of the objective function $f$ and, hence, be problem-specific. It can be estimated by conducting an initial search in which all increases are accepted and calculating the average observed objective increase $\overline{\delta f}^{(+)}$; $T_0$ is then given by:

$$ T_0 = -\,\frac{\overline{\delta f}^{(+)}}{\ln(\chi_0)} $$
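As a rough illustration of this estimate, the sketch below (in Python, assuming hypothetical user-supplied routines f for the objective and random_neighbour for the move generator) performs a short pilot run in which every move is accepted, averages the observed increases, and solves $\chi_0 = \exp(-\overline{\delta f}^{(+)}/T_0)$ for $T_0$:

import math

def estimate_initial_temperature(f, random_neighbour, x0, n_trials=100, chi0=0.8):
    """Estimate T0 so that uphill moves are initially accepted with probability ~chi0.

    f and random_neighbour are hypothetical problem-specific routines:
    f(x) returns the objective value, random_neighbour(x) a random neighbouring solution.
    """
    x, fx = x0, f(x0)
    increases = []
    for _ in range(n_trials):
        y = random_neighbour(x)
        fy = f(y)
        if fy > fx:                      # record uphill (worsening) moves only
            increases.append(fy - fx)
        x, fx = y, fy                    # pilot run: accept every move
    if not increases:                    # degenerate case: no uphill moves observed
        return 1.0
    mean_increase = sum(increases) / len(increases)
    # chi0 = exp(-mean_increase / T0)  =>  T0 = -mean_increase / ln(chi0)
    return -mean_increase / math.log(chi0)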

Final Temperature

In some simple implementations of the SA algorithm the final temperature is determined by fixing the number of temperature values to be used, or the total number of solutions to be generated.

Alternatively, the search can be halted when it ceases to make progress. Lack of progress can be defined in a number of ways, but a useful basic definition is no improvement (i.e., no new best solution) being found in an entire Markov chain at one temperature, combined with the acceptance ratio falling below a given small value. The sample code supplied with this section uses this definition of convergence (lack of progress).
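A minimal sketch of such a stopping test, assuming the main loop tracks whether the last chain produced a new best solution and what fraction of its proposed moves were accepted (both names below are hypothetical, and the 0.01 threshold is illustrative rather than taken from the text):

def converged(improved_in_last_chain, acceptance_ratio, min_acceptance_ratio=0.01):
    """Lack of progress: the last Markov chain found no new best solution
    and the acceptance ratio has fallen below a small threshold."""
    return (not improved_in_last_chain) and (acceptance_ratio < min_acceptance_ratio)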

Length of Markov Chains

An obvious choice for $L_k$, the length of the $k$-th Markov chain, is a value that depends on the size of the problem, so that $L_k$ is independent of $k$. Alternatively, it can be argued that a minimum number of transitions $\eta_{\min}$ should be accepted at each temperature. However, as $T_k$ approaches 0, transitions are accepted with decreasing probability, so the number of trials required to achieve $\eta_{\min}$ acceptances approaches infinity. Thus, in practice, an algorithm in which each Markov chain is terminated after $L_k$ transitions or $\eta_{\min}$ acceptances, whichever comes first, is a suitable compromise.
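The inner loop under this compromise might look like the following sketch, where accept is a hypothetical helper implementing the acceptance criterion of equation (63), and f and random_neighbour are as before:

def markov_chain(f, random_neighbour, accept, x, fx, T, max_trials, min_acceptances):
    """Run one homogeneous Markov chain at temperature T, stopping after
    max_trials proposals or min_acceptances accepted moves, whichever comes first."""
    accepted = 0
    for _ in range(max_trials):
        y = random_neighbour(x)
        fy = f(y)
        if accept(fy - fx, T):           # acceptance test, e.g. the Metropolis criterion
            x, fx = y, fy
            accepted += 1
            if accepted >= min_acceptances:
                break
    return x, fx, accepted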

Decrementing the Temperature

The simplest and most common temperature decrement rule is:

$$ T_{k+1} = \alpha\, T_k $$

where $\alpha$ is a constant close to, but smaller than, 1. This exponential cooling scheme (ECS) was first proposed by Kirkpatrick et al. [39] with $\alpha = 0.95$. Randelman and Grest [54] compared this strategy with a linear cooling scheme (LCS) in which $T$ is reduced every $L$ trials:

$$ T_{k+1} = T_k - \Delta T $$

They found the reductions achieved using the two schemes to be comparable, and also noted that the final value of $f$ was, in general, improved with slower cooling rates, at the expense, of course, of greater computational effort. Finally, they observed that the algorithm's performance depended more on the cooling rate $\Delta T / L$ than on the individual values of $\Delta T$ and $L$. Obviously, care must be taken to avoid negative temperatures when using the LCS.
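Both decrement rules are one-liners; the sketch below shows them side by side (the default alpha and the clamp at zero are illustrative choices, not prescriptions from the text):

def ecs_update(T, alpha=0.95):
    """Exponential cooling scheme: T_{k+1} = alpha * T_k, with alpha slightly below 1."""
    return alpha * T

def lcs_update(T, delta_T):
    """Linear cooling scheme: T_{k+1} = T_k - delta_T, applied once every L trials.
    Clamp at zero to avoid the negative temperatures warned about in the text."""
    return max(T - delta_T, 0.0)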

Many researchers have proposed more elaborate annealing schedules, most of which are in some respect adaptive, using statistical measures of the algorithm's current performance to modify its control parameters. These are well reviewed by van Laarhoven and Aarts [42].
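By contrast, a minimal non-adaptive driver tying together the four components discussed above might look like the following sketch. It reuses the hypothetical helpers defined in the earlier sketches (estimate_initial_temperature, markov_chain, converged, ecs_update), and all numeric defaults are illustrative:

import math
import random

def metropolis_accept(delta_f, T):
    """Metropolis criterion (cf. equation (63)): always accept improvements,
    accept an increase delta_f > 0 with probability exp(-delta_f / T)."""
    return delta_f <= 0 or random.random() < math.exp(-delta_f / T)

def simulated_annealing(f, random_neighbour, x0, chi0=0.8, alpha=0.95,
                        chain_length=100, min_acceptances=60,
                        min_acceptance_ratio=0.01):
    """Minimal non-adaptive SA driver built from the schedule components above.
    For brevity it tests for improvement using only the end-of-chain solution."""
    T = estimate_initial_temperature(f, random_neighbour, x0, chi0=chi0)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    while True:
        x, fx, accepted = markov_chain(f, random_neighbour, metropolis_accept,
                                       x, fx, T, chain_length, min_acceptances)
        improved = fx < best_f
        if improved:
            best_x, best_f = x, fx
        if converged(improved, accepted / chain_length, min_acceptance_ratio):
            break
        T = ecs_update(T, alpha)          # exponential cooling between chains
    return best_x, best_f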


