About Simulated Annealing

Simulated Annealing (SA) is commonly said to be the oldest among the metaheuristics and surely one of the first algorithms with an explicit strategy to escape local minima. The origins of the algorithm are in statistical mechanics (the Metropolis algorithm), and it was first presented as a search algorithm for combinatorial optimization problems in [Kirkpatrick et al. 1983] and [Cerny 1985]. The fundamental idea is to allow moves to solutions of worse quality than the current solution (uphill moves) in order to escape from local minima. The probability of making such a move is decreased during the search.
The algorithm starts by generating an initial solution (either randomly or heuristically constructed) and by initializing the so-called temperature parameter T. Then the following is repeated until the termination condition is satisfied: a solution s' is randomly sampled from the neighborhood N(s) of the current solution s, and it is accepted as the new current solution depending on f(s), f(s') and T. s' replaces s if f(s') < f(s) or, in case f(s') >= f(s), with a probability that is a function of T and f(s') - f(s). This probability is generally computed following the Boltzmann distribution, exp(-(f(s') - f(s))/T).
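The acceptance rule described above can be sketched as follows (a minimal Python sketch for a minimization problem; the function name `accept` and its signature are our own, not from the original text):

```python
import math
import random

def accept(f_s, f_s_new, T):
    """Metropolis acceptance rule: always accept an improving move;
    accept an uphill move with probability exp(-(f(s') - f(s)) / T)."""
    if f_s_new < f_s:
        return True
    return random.random() < math.exp(-(f_s_new - f_s) / T)
```

Note how the two controlling factors appear in the exponent: a larger difference f(s') - f(s) or a smaller T both shrink the acceptance probability of an uphill move.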
The temperature T is decreased during the search process, so the probability of accepting uphill moves is high at the beginning of the search and gradually decreases, until the method converges to a simple iterative improvement algorithm. This process is analogous to the annealing of metals and glass, which assume a low-energy configuration when cooled with an appropriate cooling schedule. Regarding the search process, this means that the algorithm combines two strategies: random walk and iterative improvement. In the first phase of the search, the bias toward improvements is low, which permits the exploration of the search space; this erratic component is slowly decreased, leading the search to converge to a (local) minimum. The probability of accepting uphill moves is controlled by two factors: the difference in objective function values and the temperature. On the one hand, at fixed temperature, the higher the difference f(s') - f(s), the lower the probability of accepting a move from s to s'. On the other hand, the higher T, the higher the probability of accepting uphill moves.
Advantages

Simulated annealing is proven to converge to the optimal solution of a problem, given some assumptions on the cooling schedule of the temperature.
The simplicity of the algorithm makes it very easy to adapt a local search method (e.g. best improvement local search) into a simulated annealing algorithm, usually yielding much better results than the local search alone.

Disadvantages
Although it is proven to converge to the optimum, it converges only in infinite time. For this reason, and also because good results require a slow cooling of the temperature, the algorithm is usually not faster than its contemporaries.
Network Coordinator: Marco Dorigo (e-mail: email@example.com). Web site responsibles: Christian Blum and Max Manfrin.