Simulated annealing

For NP-hard optimization problems, the use of exact algorithms for computing an optimal solution is computationally intensive, requiring an effort that grows exponentially with the size of the problem (cf. also Algorithm, computational complexity of an). In practice, exact algorithms are used only for solving problem instances of moderate size. This has motivated the development of heuristic optimization techniques, which provide good-quality solutions in a reasonable amount of computational time. One such popular technique is simulated annealing, which has been widely applied to both discrete and continuous optimization problems [a1], [a6], [a11]. Simulated annealing is a stochastic search method modeled on the physical annealing process studied in thermodynamics. Annealing refers to the process in which a thermal system, initially melted at a high temperature, is cooled slowly by lowering the temperature until it reaches a stable state (the ground state), in which the system has its lowest energy. S. Kirkpatrick, C.D. Gelatt and M.P. Vecchi [a7] first proposed the connection between simulated annealing and combinatorial optimization, building on the original work of N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, and E. Teller [a8].

The main advantage of the simulated annealing algorithm is its ability to escape from local optima by using a mechanism that allows deterioration in the objective function value. That is, in the optimization process simulated annealing accepts not only better-than-previous solutions, but also worse-quality solutions, the latter controlled probabilistically through the temperature parameter T. More particularly, in the early stages of simulated annealing, where T is relatively high, the solution space is widely "explored", so that different solution directions are identified and "bad" solutions are often accepted with high probability. During the course of the algorithm the temperature T decreases in order to steadily reduce the probability of accepting solutions that lead to worse objective function values. By allowing such controlled "uphill" moves one can avoid entrapment in local optima and, eventually, obtain solutions of higher quality.
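This acceptance mechanism (the Metropolis criterion) can be sketched in Python; the function name here is illustrative:

```python
import math
import random

def accept(delta, temperature):
    """Metropolis acceptance rule: always accept an improving move
    (delta <= 0); accept a deteriorating move (delta > 0) with
    probability exp(-delta / temperature)."""
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```

Note how a high temperature makes exp(-delta / temperature) close to 1, so even poor moves are usually accepted, while a low temperature drives the acceptance probability for uphill moves toward 0.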

There are many factors that have a strong impact on the performance of the simulated annealing algorithm:

The initial temperature T_0. A high value of T_0 means that the probability of accepting inferior solutions is also high, leading to a diversified search in the first iterations of the algorithm. With low values of the initial temperature the search becomes more localized.

The thermal equilibrium. This is the condition in which further improvement in the solution cannot be expected with high probability.

The annealing schedule, which determines at what point in the algorithm and by how much the temperature is to be reduced.
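As an illustration of the last factor, a commonly used geometric schedule multiplies the temperature by a fixed factor alpha, 0 < alpha < 1, at each stage (the function and parameter names below are illustrative, not part of the original formulation):

```python
def geometric_schedule(t0, alpha, n_stages):
    """Geometric cooling: T_{k+1} = alpha * T_k for a fixed 0 < alpha < 1.
    Returns the list of temperatures T_1, ..., T_m used stage by stage."""
    temps = []
    t = t0
    for _ in range(n_stages):
        temps.append(t)
        t *= alpha
    return temps
```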

Now, consider a minimization process. Let ΔE denote the change in the objective function value between the current state and the candidate state under consideration; this change corresponds to the change in the energy level in the analogy with physical annealing. Then the probability of accepting a worse-quality solution is equal to exp(−ΔE/(k_B T)), where k_B is the Boltzmann constant (cf. Boltzmann equation). Simulated annealing is presented below in pseudo-code:

PROCEDURE simulated annealing()
 InputInstance();
 Generate randomly an initial solution;
 Initialize the temperature T;
 DO
  DO WHILE thermal equilibrium not reached
   Generate a neighbour state randomly;
   Evaluate the change ΔE;
   IF ΔE ≤ 0 THEN update current state with new state;
   IF ΔE > 0 THEN update current state with new state
    with probability exp(−ΔE/T);
  OD;
  Decrease T using the annealing schedule;
 OD;
 RETURN(solution with the lowest energy)
END simulated annealing;

A pseudo-code for a simulated annealing procedure.
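The procedure above can be rendered as a runnable Python sketch; here a one-dimensional function serves as the illustrative energy, and the neighbour move, cooling factor, and equilibrium length are assumptions chosen for the example:

```python
import math
import random

def simulated_annealing(energy, initial, neighbour,
                        t0=10.0, alpha=0.95, t_min=1e-3, moves_per_stage=50):
    """Generic simulated annealing minimizer following the pseudo-code above."""
    current = best = initial
    t = t0
    while t > t_min:                       # outer loop over temperature stages
        for _ in range(moves_per_stage):   # inner loop until "equilibrium"
            candidate = neighbour(current)
            delta = energy(candidate) - energy(current)
            # Metropolis criterion: accept downhill moves always,
            # uphill moves with probability exp(-delta / t).
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = candidate
                if energy(current) < energy(best):
                    best = current
        t *= alpha                         # annealing schedule
    return best

# Usage: minimize f(x) = (x - 3)^2 starting from x = 0.
random.seed(0)
x_star = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0,
                             lambda x: x + random.uniform(-0.5, 0.5))
```

Because the best solution seen so far is tracked separately, the returned state is the lowest-energy one visited, even if the walk later moves uphill.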


The following example (the quadratic assignment problem) illustrates the basic principles of simulated annealing in combinatorial optimization. The quadratic assignment problem is defined as follows: Given a set N = {1, ..., n} and two (n×n)-matrices F = (f_{ij}) and D = (d_{kl}), find a permutation p of the set N that minimizes the following function:

f(p) = Σ_{i=1}^{n} Σ_{j=1}^{n} f_{ij} d_{p(i)p(j)}.

In the context of location theory one uses the quadratic assignment problem formulation to model the problem of allocating n facilities to n locations, with the objective of minimizing the cost associated not only with the distance between locations but also with the flow between facilities. F and D correspond to the flow and distance matrices, respectively [a10].
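The objective can be evaluated directly from the two matrices; the small flow and distance matrices below are hypothetical data for illustration:

```python
def qap_cost(F, D, p):
    """Quadratic assignment objective: sum over all i, j of
    F[i][j] * D[p[i]][p[j]], for a permutation p given as a list
    (0-based indices)."""
    n = len(p)
    return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

F = [[0, 3], [2, 0]]  # flow between facilities (hypothetical)
D = [[0, 5], [5, 0]]  # distance between locations (hypothetical)
```

For example, with p = [0, 1] the cost is F[0][1]*D[0][1] + F[1][0]*D[1][0] = 3*5 + 2*5 = 25.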

Let T_r represent the temperature at stage r of the procedure and let T_1 > T_2 > ... > T_m represent the annealing schedule. Then the application of simulated annealing to the quadratic assignment problem [a2], [a12] can be described with the following steps:

Start with a feasible solution (a permutation p). Make an exchange between two randomly selected permutation elements (a 2-exchange). Evaluate the resulting change ΔE in the objective function.

While ΔE ≤ 0, repeat the above step. If ΔE > 0, then randomly select a variable x from the uniform distribution U(0, 1). Accept the pair exchange if x < exp(−ΔE/T_r), and repeat the process.

The system remains at stage r until a fixed number of pair exchanges (equilibrium) has taken place before moving to the next stage.

The procedure stops when all the temperatures in the annealing schedule have been used, i.e. when r = m.
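These steps can be sketched as a Python routine; the schedule, the number of exchanges per stage, and the 3×3 instance data are assumptions made for the example:

```python
import math
import random

def sa_qap(F, D, schedule, moves_per_stage=100, seed=0):
    """Simulated annealing for the QAP using random 2-exchanges.
    `schedule` is the list of temperatures T_1 > T_2 > ... > T_m."""
    rng = random.Random(seed)
    n = len(F)

    def cost(p):
        return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

    p = list(range(n))
    rng.shuffle(p)                           # random feasible permutation
    best, best_cost = p[:], cost(p)
    for t in schedule:                       # stage r uses temperature T_r
        for _ in range(moves_per_stage):     # fixed number of exchanges per stage
            i, j = rng.sample(range(n), 2)   # random 2-exchange
            q = p[:]
            q[i], q[j] = q[j], q[i]
            delta = cost(q) - cost(p)
            # accept if delta <= 0, or with probability exp(-delta / t)
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                p = q
                if cost(p) < best_cost:
                    best, best_cost = p[:], cost(p)
    return best, best_cost

# Usage on a tiny symmetric instance (hypothetical data):
F3 = [[0, 1, 2], [1, 0, 3], [2, 3, 0]]
D3 = [[0, 6, 5], [6, 0, 4], [5, 4, 0]]
best_p, best_c = sa_qap(F3, D3, schedule=[10, 5, 2, 1, 0.5, 0.1])
```

Evaluating `cost(q)` from scratch keeps the sketch short; efficient implementations compute ΔE of a 2-exchange incrementally in O(n) time instead.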

Simulated annealing has been used to solve a wide variety of combinatorial optimization problems, such as graph partitioning and graph colouring [a3], [a4], VLSI design [a7], quadratic assignment problems [a2], image processing [a5] and many others. In addition, implementations of simulated annealing in parallel environments have recently appeared, with applications in cell placement problems, traveling-salesman problems, quadratic assignment problems, and others [a9]. General references on simulated annealing can be found in [a1] and in [a11].

References

[a1] E.H.L. Aarts, J.H.M. Korst, "Simulated annealing and Boltzmann machines", Wiley (1989)
[a2] R.E. Burkard, F. Rendl, "A thermodynamically motivated simulation procedure for combinatorial optimization problems", European J. Operational Research, 17 (1984) pp. 169–174
[a3] D.S. Johnson, C.R. Aragon, L.A. McGeoch, C. Schevon, "Optimization by simulated annealing: an experimental evaluation. Part I: graph partitioning", Operations Research, 37 (1989) pp. 865–892
[a4] D.S. Johnson, C.R. Aragon, L.A. McGeoch, C. Schevon, "Optimization by simulated annealing: an experimental evaluation. Part II: graph coloring and number partitioning", Operations Research, 39 (1991) pp. 378–395
[a5] S. Geman, D. Geman, "Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images", IEEE Trans. Pattern Analysis and Machine Intelligence, 6 (1984) pp. 721–741
[a6] R. Horst, P.M. Pardalos, "Handbook of global optimization", Kluwer Acad. Publ. (1995)
[a7] S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, "Optimization by simulated annealing", Science, 220 (1983) pp. 671–680
[a8] N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, E. Teller, "Equation of state calculations by fast computing machines", J. Chem. Phys., 21 (1953) pp. 1087–1092
[a9] P.M. Pardalos, L.S. Pitsoulis, T.D. Mavridou, M.G.C. Resende, "Parallel search for combinatorial optimization: genetic algorithms, simulated annealing, tabu search and GRASP", Lecture Notes in Computer Science, 980, Springer (1995) pp. 317–331
[a10] "Quadratic assignment and related problems", P.M. Pardalos (ed.), H. Wolkowicz (ed.), Discrete Math. and Theor. Comput. Sci., 16, Amer. Math. Soc. (1994)
[a11] P.J.M. van Laarhoven, E.H.L. Aarts, "Simulated annealing, theory and practice", Kluwer Acad. Publ. (1987)
[a12] M.R. Wilhelm, T.L. Ward, "Solving quadratic assignment problems by simulated annealing", IIE Trans., 19 (1987) pp. 107–119
How to Cite This Entry:
Simulated annealing. P.M. Pardalos, T.D. Mavridou (originators), Encyclopedia of Mathematics. URL: http://www.encyclopediaofmath.org/index.php?title=Simulated_annealing&oldid=17162
This text originally appeared in Encyclopedia of Mathematics - ISBN 1402006098