## Iterated Local Search

edited by Thomas Stützle and Luis Paquete, INTELLEKTIK

## About Iterated Local Search

The essence of the Iterated Local Search (ILS) metaheuristic can be given in a nutshell: one iteratively builds a sequence of solutions generated by the
embedded heuristic, leading to far better solutions than if one were
to use repeated random trials of that heuristic. This simple
idea has a long history, and its rediscovery by many
authors has led to many different names for ILS, such as iterated
descent, large-step Markov chains, iterated
Lin-Kernighan, chained local optimization, or combinations of
these.
For us, there
are two main points that make an algorithm an ILS:
- there must be a single chain that is being followed (this then excludes population-based algorithms);
- the search for better solutions occurs in a reduced space defined by the output of a black-box heuristic.
ILS explores the set S* of local minima (each element of which is
denoted s*): the sequence of solutions it builds is a walk that steps
from one local optimum to a nearby one.
The overall ILS procedure is pictorially illustrated in
the following figure. To be complete, let us note that
generally the ILS walk will not be reversible; in particular, one may
sometimes be able to step from s(1)* to s(2)* but not back from s(2)* to s(1)*.
Since deterministic perturbations may lead to short cycles (for
instance of length 2), one should randomize the perturbations or
have them be adaptive so as to avoid this kind of cycling. If the
perturbations depend on any of the previously visited solutions, the walk in S* is no longer Markovian.
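To see why a deterministic perturbation can produce a cycle of length 2, note that a fixed kick applied twice can simply undo itself, so perturbation and local search may alternate between the same pair of local optima. A randomized kick of variable strength avoids this. The sketch below is a generic illustration on bit-strings; the function names and strength parameters are assumptions for illustration only:

```python
import random

def deterministic_kick(bits):
    # Always flips bits 0 and 1: applying it twice restores the
    # original solution, which is exactly the length-2 cycle risk.
    out = list(bits)
    out[0] ^= 1
    out[1] ^= 1
    return out

def randomized_kick(bits, min_strength=2, max_strength=5):
    # Flips a random number of randomly chosen bits, so successive
    # kicks rarely cancel each other out.
    out = list(bits)
    k = random.randint(min_strength, max_strength)
    for i in random.sample(range(len(out)), k):
        out[i] ^= 1
    return out

s = [0, 1, 0, 1, 1, 0, 0, 1]
assert deterministic_kick(deterministic_kick(s)) == s  # the cycle of length 2
```

Making the kick strength itself adaptive (e.g. growing after repeated rejections) is one common refinement of this idea.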
Written in a procedural way, ILS algorithms as metaheuristics have the
following high-level architecture: an initial solution is improved by
the embedded local search; then the algorithm repeatedly perturbs the
current local optimum, reapplies the local search, and decides via an
acceptance criterion whether to keep the new local optimum.
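This loop can be sketched directly in Python. The names `local_search`, `perturb`, and `accept` stand for the problem-specific modules and are placeholders, not part of any library; the toy one-dimensional problem below is likewise only an assumption to make the sketch runnable:

```python
import random

def iterated_local_search(initial, local_search, perturb, accept, n_iters=100):
    """Generic ILS skeleton: a single chain of local optima."""
    s_star = local_search(initial)             # first local optimum
    for _ in range(n_iters):
        s_prime = perturb(s_star)              # kick the current optimum
        s_star_prime = local_search(s_prime)   # descend to a new optimum
        s_star = accept(s_star, s_star_prime)  # keep one of the two
    return s_star

# Tiny demo on a toy integer problem: minimize f(x) = (x - 42)**2.
f = lambda x: (x - 42) ** 2

def demo_local_search(x):
    # Greedy descent in steps of 1 until no neighbor improves.
    while f(x - 1) < f(x) or f(x + 1) < f(x):
        x = x - 1 if f(x - 1) < f(x + 1) else x + 1
    return x

demo_perturb = lambda x: x + random.randint(-10, 10)
better = lambda a, b: b if f(b) < f(a) else a   # accept only improvements

best = iterated_local_search(0, demo_local_search, demo_perturb, better)
```

The single chain of `s_star` values is exactly the walk in S* described above; swapping in a different `accept` function changes the dynamics of that walk without touching the rest of the loop.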
In the simplest case, the acceptance criterion replaces the current
s* by the new local optimum s*' with a fixed rule. This leads to random-walk
dynamics on S* that are ``Markovian'': the
probability of making a particular step from s(1)* to s(2)*
depends only on s(1)* and s(2)*.
Staying within Markovian walks, the most basic acceptance criteria
will use only the difference in the costs of s* and s*'.
The potential power of ILS lies in its biased sampling of the set of local optima. The efficiency of this sampling depends both on the kinds of perturbations and on the acceptance criteria. Interestingly, even with the most naive implementations of these parts, ILS is much better than random restart. Still better results can be obtained if the ILS modules are optimized. First, the acceptance criteria can be adjusted empirically, as in simulated annealing, without knowing anything about the problem being optimized. This kind of optimization will be familiar to any user of metaheuristics, though the questions of memory may become quite complex. Second, the Perturbation routine can incorporate as much problem-specific information as the developer is willing to put into it. In practice, a rule of thumb can be used as a guide: ``a good perturbation transforms one excellent solution into an excellent starting point for a local search''.
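The two Markovian extremes of the acceptance criterion, together with a simulated-annealing style compromise between them, can be written down in a few lines. This is a sketch under the assumption that solutions are compared through a user-supplied `cost` function; the `temperature` parameter is an illustrative default, not a recommended setting:

```python
import math
import random

def better(s, s_new, cost):
    # Intensification extreme: accept only improving local optima.
    return s_new if cost(s_new) < cost(s) else s

def random_walk(s, s_new, cost):
    # Diversification extreme: always accept, so the walk on S*
    # becomes a random walk.
    return s_new

def metropolis(s, s_new, cost, temperature=1.0):
    # Simulated-annealing style compromise: always accept improvements,
    # accept worsening moves with probability exp(-delta / T).
    delta = cost(s_new) - cost(s)
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        return s_new
    return s
```

All three use only the costs of s* and s*', so each keeps the walk on S* Markovian; adding memory (for example, restarts triggered by stagnation) leaves this interface but drops the Markov property.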
Together, these different aspects show that ILS algorithms can have a
wide range of complexity, but complexity may be added progressively
and in a modular way. (Recall in particular that all of the
fine-tuning that resides in the embedded LocalSearch can be ignored if
one wants, and it does not appear in the metaheuristic per se.) This
makes ILS an appealing metaheuristic for both academic and industrial
applications. The cherry on the cake is speed: one can perform k local searches embedded within an ILS much faster than k local searches run within random restart.

## Readings
- H. R. Lourenço, O. Martin, and T. Stützle. Iterated Local Search. In F. Glover and G. Kochenberger, editors, Handbook of Metaheuristics, pages 321-353. Kluwer Academic Publishers, Norwell, MA, 2002.
- E. B. Baum. Towards practical ``neural'' computation for combinatorial optimization problems. In J. Denker, editor, Neural Networks for Computing, pages 53-64, 1986. AIP Conference Proceedings.
- J. Baxter. Local optima avoidance in depot location. Journal of the Operational Research Society, 32:815-819, 1981.
- R. K. Congram, C. N. Potts, and S. L. van de Velde. An iterated dynasearch algorithm for the single-machine total weighted tardiness scheduling problem. INFORMS Journal on Computing, 14(1):52-67, 2002.
- D. S. Johnson and L. A. McGeoch. The traveling salesman problem: A case study in local optimization. In E. H. L. Aarts and J. K. Lenstra, editors, Local Search in Combinatorial Optimization, pages 215-310. John Wiley & Sons, Chichester, England, 1997.
- S. Kreipl. A large step random walk for minimizing total weighted tardiness in a job shop. Journal of Scheduling, 3(3):125-138, 2000.
- O. Martin and S. W. Otto. Combining simulated annealing with local search heuristics. Annals of Operations Research, 63:57-75, 1996.
- O. Martin, S. W. Otto, and E. W. Felten. Large-step Markov chains for the traveling salesman problem. Complex Systems, 5(3):299-326, 1991.
- O. Martin, S. W. Otto, and E. W. Felten. Large-step Markov chains for the TSP incorporating local search heuristics. Operations Research Letters, 11:219-224, 1992.
- T. Stützle. Iterated Local Search for the Quadratic Assignment Problem. Technical Report AIDA-99-03, Darmstadt University of Technology, Computer Science Department, Intellectics Group. (Submitted.)
- T. Stützle. Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications. PhD thesis, Darmstadt University of Technology, Department of Computer Science, 1998.
## Links

- A survey paper for download.

Network Coordinator: Marco Dorigo, e-mail: mdorigo@ulb.ac.be. Web site responsibles: Christian Blum and Max Manfrin.