
International Journal of Applied Metaheuristic Computing, 1(4), 57-74, October-December 2010 57

DIMMA:
A Design and Implementation Methodology
for Metaheuristic Algorithms – A
Perspective from Software Development
Masoud Yaghini, Iran University of Science and Technology, Iran
Mohammad Rahim Akhavan Kazemzadeh, Iran University of Science and Technology, Iran

ABSTRACT
Metaheuristic algorithms will gain more and more popularity in the future as optimization problems increase in size and complexity. In order to record experience and allow projects to be replicated, a standard process serving as a methodology for designing and implementing metaheuristic algorithms is necessary. To the best of the authors' knowledge, no methodology has been proposed in the literature for this purpose. This paper presents a Design and Implementation Methodology for Metaheuristic Algorithms, named DIMMA. The proposed methodology consists of three main phases, and each phase has several steps in which the activities that must be carried out are clearly defined in this paper. In addition, the design and implementation of a tabu search metaheuristic for the travelling salesman problem is presented as a case study to illustrate the applicability of DIMMA.

Keywords: Ant Colony Optimization, Design and Implementation Methodology, Evolutionary Algorithms,
Genetic Algorithm, Metaheuristic Methodology, Operations Research, Tabu Search

1. INTRODUCTION

Optimization problems, which occur in real-world applications, are sometimes NP-hard. In the case of NP-hard problems, exact algorithms need, in the worst case, exponential time to find the optimum. Metaheuristics, or modern heuristics, deal with these problems by introducing systematic rules to escape from local optima. Metaheuristics are applicable to a wide range of optimization problems (Doreo et al., 2006; Morago, DePuy, & Whitehouse, 2006). Some popular population-based metaheuristic methods are the genetic algorithm (Goldberg, 1989) and ant colony optimization (Dorigo & Stützle, 2004), in which collective intelligence plays the important role (Wang, 2010). Tabu search (Glover & Laguna, 1997) and simulated annealing (Kirkpatrick, Gelatt, & Vecchi, 1983) are two popular single-solution based metaheuristics that improve a single solution in an iterative algorithm.
With the growing scale and complexity of optimization problems, metaheuristics will become more and more popular. Given this significant growth in the use of metaheuristics as optimization tools, there must be a standard methodology for designing and implementing

DOI: 10.4018/jamc.2010100104

Copyright © 2010, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global
is prohibited.

them. Such a methodology is used for recording experience and allows projects to be replicated. Moreover, this standard methodology can be a comfort factor for new adopters with little metaheuristic experience, and can give guidelines to everyone who wants to design and implement metaheuristics.
To the best of our knowledge, no methodology has been proposed in the literature for the design and implementation of metaheuristic algorithms. There are many software frameworks in the literature for metaheuristics (Voss & Woodruff, 2002; Fink et al., 1999), in which "framework" means reusable programming code and components for metaheuristics (Talbi, 2009). Hence, the meaning of frameworks in these references is different from our proposed methodology. Although there are several tutorials and lectures on how to design metaheuristics (Thierens, 2008), they are sometimes for a special metaheuristic and do not consider this process as a whole.
The proposed methodology in this paper, a Design and Implementation Methodology for Metaheuristic Algorithms (DIMMA), gives guidelines to everyone who wants to design and implement a metaheuristic algorithm. Webster's collegiate dictionary defines methodology as "a body of methods, rules, and postulates employed by a discipline" or "the analysis of the principles or procedures of inquiry in a particular field" (Merriam-Webster, 1997). DIMMA includes several phases, steps, disciplines, and principles to design and implement a specific metaheuristic for a given optimization problem. In other words, DIMMA refers to the methodology that is used to standardize the process of designing and implementing a metaheuristic algorithm. In Sections 2-5 we explain the architecture of DIMMA and its phases and steps; Section 6 follows with a description of each step of DIMMA, using the design and implementation of a Tabu Search (TS) metaheuristic for the Travelling Salesman Problem (TSP) as a case study.

2. ARCHITECTURE OF DIMMA

The architecture of DIMMA has been inspired by the Rational Unified Process (RUP), which is a methodology for software engineering (Kroll & Krutchten, 2003). DIMMA has two dimensions: a dynamic dimension and a discipline dimension (Figure 1). The dynamic dimension is the horizontal dimension, which includes the phases of the methodology: initiation, blueprint, and construction. The discipline dimension is the vertical dimension, which shows the disciplines that logically group the steps, activities, and artifacts.
DIMMA has three sequential phases, each of which has several steps (Figure 2). In each step, we define several activities that must be done to complete the step. The phases are as follows: initiation, in which the problem at hand must be understood precisely and the goal of designing the metaheuristic must be clearly defined. The next phase is blueprint; the most important goals of this phase are selecting the metaheuristic solution method, defining performance measures, and designing the algorithm for our solution strategy. The last phase is construction, in which implementing the designed algorithm, tuning its parameters (parameter setting), analyzing its performance, and finally documenting the results must be done. In some steps, it is necessary to review previous steps to justify and improve decisions and the algorithm. For example, it is common for the algorithm to be modified after the performance evaluations. These backward movements are illustrated in Figure 2.

3. INITIATION PHASE

Step 1.1: State the Problem

Stating the problem is step 1.1 of DIMMA; it is helpful in narrowing the problem down and making it more manageable. To state the problem, one can write a simple statement that


Figure 1. Two dimensions of DIMMA and level of effort in each discipline during the phases

Figure 2. Steps in each phase of DIMMA

includes one or more objectives, inputs, outputs, and assumptions of the problem. In this step, a mathematical model can be provided for clarity. However, in some cases, it is difficult to formulate the problem with an unambiguous analytical mathematical notation.
The structure of the problem, such as multi-objective approaches, dynamic aspects, and continuous or discrete modeling, which is defined here, can have significant effects on the next steps, including defining goals, selecting the solution strategy, and defining performance measures.
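Step 1.1 can be illustrated for the TSP, which serves as this paper's case study. The sketch below is a minimal, hypothetical example: the `ProblemStatement` class, its field names, and the toy four-city instance are our own illustration, not part of DIMMA.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical container for a step 1.1 problem statement:
# objectives, inputs, outputs, and assumptions in one place.
@dataclass
class ProblemStatement:
    objective: str
    inputs: str
    outputs: str
    assumptions: List[str] = field(default_factory=list)

tsp_statement = ProblemStatement(
    objective="Minimize the total length of a tour visiting every city exactly once",
    inputs="An n x n matrix of pairwise distances between cities",
    outputs="A permutation of the n cities (the tour)",
    assumptions=["The distance matrix is symmetric", "The graph is complete"],
)

def tour_length(dist, tour):
    """The mathematical model's objective: total length of the closed tour."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# A toy 4-city instance used only to exercise the objective function.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(tour_length(dist, [0, 1, 3, 2]))  # tour 0->1->3->2->0, prints 18
```

Writing even this small amount of code forces the objectives, inputs, outputs, and assumptions to be stated unambiguously, which is the purpose of this step.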


Step 1.2: Define Goals

In step 1.2 of DIMMA, the goals of developing the metaheuristic must be defined clearly. All the experiments, performance analysis measures, and statistical analyses will depend on these goals. In addition, goal definition can be helpful in selecting the solution strategy.
Some goals of designing a metaheuristic are: (1) reducing search time in comparison with exact methods or other metaheuristics, (2) improving the quality of solutions, (3) robustness in terms of the instances, (4) solving large-scale problems, (5) ease of implementation, (6) ease of combination with other algorithms to improve performance, (7) flexibility to solve other problems or optimization models, and (8) providing a tight approximation to the problem. Selecting the instances and the solution method, and defining the performance measures, must be done according to the selected goals. For example, if you want to reduce the search time, you must select time measurements in step 2.2, and you must select instances that are comparable to other research works.

Step 1.3: Select Instances

In step 1.3 of DIMMA, we must select input instances carefully to perform the evaluation of the algorithm. The chosen instances must be representative of the problem that one is trying to solve. The selected structure of the input instances may influence the performance of metaheuristics significantly. To obtain interesting results and to allow the generalization of the conclusions, the selected instances must be diverse in terms of their size, their complexity, and their structure (Alba & Lugue, 2005; Talbi, 2009; Silberholz & Golden, 2010). Keep in mind that, according to the No Free Lunch (NFL) theorem (Wolpert & Macready, 1997), no optimization algorithm is better than any other on all possible optimization problems; this means that if algorithm A performs better than B for a given problem, there is no proof that A is always better than B, and there is always another problem where B performs better than A. Therefore, the aim of designing an algorithm must be for a class of problems (Talbi, 2009).
Instances used for implementing and analyzing metaheuristics can be divided into real-life and constructed instances. Real-life instances are taken from real-world applications. Pure real-life and random real-life instances are the two types of real-life instances. Pure real-life instances are the practical instances of real-world applications. If available, they are good tools for evaluating the performance of a metaheuristic. Obtaining pure real-life instances is difficult, because such data are not public, and collecting such data may be time-consuming or expensive (Talbi, 2009; Silberholz & Golden, 2010). Random real-life instances are an alternative to pure real-life instances. In this type of instance, the structure of the real-life problem is preserved, but details are randomly changed to produce new instances (Alba & Lugue, 2005; Fischetti, Gonzalez, & Toth, 1997).
Standard instances and pure random instances are the two types of constructed instances. Standard instances are instances that are widely used in experimentation and, because of that, have become standard in the literature. OR-Library (Beasley, 1990), MIPLIB (Achterberg, Koch, & Martin, 2003), and TSPLIB (Reinelt, 1991) are three examples of public libraries of standard instances available on the Internet. By means of standard instances, we can compare our designed metaheuristic with other methods in the literature (Alba & Lugue, 2005; Talbi, 2009). Finally, when none of the above instances are available, the remaining alternative is pure random instance generation. Although this approach can generate a wide range of instances of different size and complexity, such instances are often too far from real-life problems to reflect their structure and important characteristics; as a result, performance evaluation using only this type of instance might be controversial (Alba & Lugue, 2005; Gendreau, Laporte, & Semet, 1998).
After selecting instances, the selected instances must be divided into two subsets: one for tuning the metaheuristic parameters and


the second for evaluating the performance of the algorithm with tuned parameters. Tuning instances should be representative of the whole class of instances that the algorithm will eventually encounter. Therefore, since the tuning instances are representative of the others, the performance of the algorithm on these two subsets of instances must be similar. Also, the instances used for tuning parameters must not be applied in the performance evaluation, because this may lead to the controversial result that the parameters are suitable only for those problems (Biratteri, 2009).
In addition, after selecting instances, classifying them might be useful. Classifying instances means categorizing them into classes according to some important factors such as size and complexity. Performance analysis of the constructed metaheuristic algorithm must be done on each class of problem instances (Silberholz & Golden, 2010). This kind of approach can be found in Hartmann and Kolisch (2000), in which three factors, network complexity, resource factor, and resource strength, are used to classify instances for the project scheduling problem. Sometimes it is useful to use fitness landscape analysis for classifying the instances. This work may help to understand the difficulty and structure of the instances (see Stadler & Schnabl, 1992; Stadler, 1995).

4. BLUEPRINT PHASE

Step 2.1: Select Solution Strategy

In step 2.1 of DIMMA, after reviewing existing solution methods for the problem, the necessity of using metaheuristics must be specified. Indeed, according to the existing solution methods, it must be determined whether applying a metaheuristic to the problem is necessary or not. If a metaheuristic approach is selected as the solution method, one can go to the next step; otherwise, we must stop at this point.
According to the situation, we can select one of several solution strategies, such as an exact algorithm, a heuristic algorithm, a metaheuristic algorithm, a hybrid method, a parallel algorithm, or a cooperative algorithm.
For selecting metaheuristic or exact methods as a solution strategy, we must keep in mind four main factors. The first one is the complexity of the problem. Complexity can be seen as an indication of the hardness of the problem. The second important factor is the size of the input instances. In the case of small instances, the problem may be solved by an exact approach, even if the problem is NP-hard. The structure of the instances is another important factor. Some problems with a specific structure, of medium or even large dimensions, may be solved optimally by exact algorithms. The last factor is the search time required to solve a given problem, which must be taken into account. If we have a real-time constraint (which, depending on the problem, can range from some seconds to some months), the use of a metaheuristic can be justified even if the complexity of the problem is polynomial. If, for a given problem, there is an exact or other state-of-the-art algorithm in the literature, selecting a metaheuristic as the solution strategy is not rational (Talbi, 2009).
In addition to the above factors, the ease of use of a certain metaheuristic for a certain problem can play an important role. For example, ACO is very intuitive for the TSP, but may be difficult to adapt to continuous problems. Also, in some cases, the direct use of a generic metaheuristic would not lead to good results, and sometimes there is a need to adapt it to the problem with the use of specialized heuristics.
Hybrid algorithms are one of the approaches to using metaheuristics as optimization tools. A hybrid algorithm is a combination of complete (exact) or approximate algorithms (or both) used to solve the problem at hand (El-Abd & Kamel, 2005). These methods are used when we want the specific advantages of different approaches.
The central goal of parallel computing is to speed up computation by dividing the workload among several processors. From the viewpoint of algorithm design, pure parallel computing strategies exploit the partial order of algorithms (i.e., the sets of operations that may be executed


concurrently in time without modifying the solution method and the final solution obtained) and thus correspond to the natural parallelism present in the algorithm (Crainic & Toulouse, 2003). Cooperative algorithms are a category of parallel algorithms in which several search algorithms are run in parallel in order to solve the optimization problem at hand. The search algorithms run in parallel may be different, which is why a cooperative search technique may also be viewed as a hybrid algorithm (El-Abd & Kamel, 2005).
After selecting the solution strategy, we must specify the algorithm components for our problem. This specification can be helpful in the steps of selecting the data structure and designing the algorithm.

Step 2.2: Define Performance Measures

In this step, the performance measures are selected for the performance analysis step. We must select the appropriate measures according to the selected goals of using metaheuristics. For exact solution methods, in which global optimality is guaranteed, search time is the main indicator for evaluating the performance of the algorithms. But in the case of metaheuristics, which try to find near-optimal solutions in reasonable time, solution quality and computational time are the two main indicators of performance (Alba & Lugue, 2005; Talbi, 2009). In metaheuristic search methods, the indicators for evaluating effectiveness include quality of solutions, computational effort, and robustness (Barr et al., 1995).
Quality of solutions is based on measuring the distance to one of the following solutions (Figure 3): the global optimal solution, a lower (upper) bound solution, the best known solution, or the requirements of the actual implemented solution. For constructed instances, the global optimum is known. In this case, the percentage of runs in which the result is equivalent to the global optimum (the success rate) can be the best indicator. However, usually the global optimum is not available, so a lower or upper bound can be used for minimization and maximization problems, respectively. Several relaxation methods, such as Lagrangian relaxation, can be used to find lower and upper bound solutions. For some standard and popular problems, the best known solution is available in the literature, which can be used as an indicator to evaluate the quality of solutions. For a real-world application, there might be a predefined goal that can serve as a quality measurement (Talbi, 2009).

Figure 3. Performance assessment of the quality of the solutions in a minimization problem (adapted from Talbi, 2009)

Computational effort comprises the worst-case or average-case complexity and the CPU time, which can be used for theoretical and empirical analysis of algorithm efficiency, respectively. For theoretical analysis, one can use complexity theory to calculate the complexity of an algorithm. However, it is not always sufficient to analyze the computational effort theoretically. CPU time is a measurement for analyzing the computational effort empirically. One of the disadvantages of CPU time is that it depends on the computer characteristics and the compiler with which the algorithm is compiled (Talbi, 2009; Silberholz & Golden, 2010). Therefore, many researchers use the number of objective function evaluations as a measurement of the computational effort, since it eliminates the effects of these characteristics (Talbi, 2009). However, the number of objective function evaluations is not the best metric


for computation time either, because the objective function may not be the part that takes most of the computation time.
As we mentioned, another metric for performance evaluation is robustness. If the results of the algorithm show no or only small changes under deviations in the input instances, the algorithm is robust (Montgomery, 2005). Once the experimental results are obtained, the standard deviation of the solution quality must be considered as a measurement of robustness.
Performance measures for multiobjective optimization (measures for the convergence of metaheuristics towards the Pareto frontier) can be found in Collette and Siarry (2005). Moreover, performance measures for parallel optimization can be found in Crainic and Toulouse (1998). In addition, performance metrics for the comparison of metaheuristics, such as run time and quality of solution, are well discussed in Silberholz and Golden (2010). It must be stated here that the statistical analysis which must be done to evaluate the performance of the algorithm is discussed in section 3.3.

Step 2.3: Select Data Structure

Before designing the algorithm, it is necessary to select a proper data structure in step 2.3 of DIMMA. A data structure is a scheme for organizing information in memory for better algorithm efficiency (MacAllister, 2009; Puntambekar, 2009). Arrays, files, lists, trees, and tables are the important types of data structures. For instance, for the TSP, as we will mention in section 6.2, one can represent a solution with an array whose length is the number of cities.
Selecting the right data structure can make an enormous difference in the complexity of the resulting implementation. Picking the right data representation can make the programming easier. If we select the wrong data structure for an algorithm, implementing it might be too time-consuming (Cormen et al., 2002; Skiena & Revilla, 2003).

Step 2.4: Design Algorithm

In step 2.4 of DIMMA, the overall structure of the algorithm must be specified. There are various ways of specifying an algorithm (Puntambekar, 2009). Using natural language is the simplest way to specify an algorithm. In this case, we specify the algorithm simply in natural language. Although such a specification is very simple, it is not usually clear enough to implement. Pseudocode is another way of specifying an algorithm; it is a combination of natural and programming language. Using pseudocode, one can specify an algorithm more precisely than with natural language. Instead of the two previous ways of specifying an algorithm, one can use a flowchart. A flowchart is a graphical specification of an algorithm. After specifying the overall structure of the algorithm, its correctness must be checked. This can be done by using one small instance of valid input.
In addition to the above tools for describing an algorithm, the dynamic behaviour of metaheuristics can be described by means of some concepts from RUP, such as the system sequence diagram (SSD) in the UML (Unified Modeling Language) (Siau & Halpin, 2001). A sequence diagram in UML is a kind of interaction diagram that shows how processes operate with one another and in what order. It is a construct of a Message Sequence Chart.
The algorithm must also be analyzed according to four factors: complexity, space efficiency, simplicity, and generality. The number of pseudocode steps needed to specify the algorithm can be a metric for complexity. However, this metric depends on the programming language and the style of the pseudocode (Silberholz & Golden, 2010). Space efficiency is the space or memory usage of an algorithm. Simplicity is an important factor, because the simpler the algorithm is, the more easily it can be programmed and debugged. Finally, generality is the applicability of an algorithm to a wide range of inputs (Puntambekar, 2009).

5. CONSTRUCTION PHASE

Step 3.1: Implement Algorithm

The implementation of an algorithm is done with a suitable programming language in step 3.1 of


DIMMA. For example, if an algorithm consists of objects and related methods, then it will be better to implement such an algorithm using one of the object-oriented programming languages, such as C++, C#, or Java.
In this step, in order to speed up programming and reduce development costs, one can use one of many free software frameworks. Software frameworks are reusable programming code and components for metaheuristics. One can use frameworks to implement a metaheuristic without in-depth knowledge of programming. Some of the software frameworks which have been proposed for single-objective problems are EasyLocal++ (Gaspero & Schaerf, 2001), Localizer++ (Michel & Van Hentenryck, 2001), GAlib (Wall, 1996), MAFRA (Krasnogor & Smith, 2000), Hotframe (Fink, Voss, & Woodruff, 1999), iOpt (Voudouris et al., 2001), DREAM (Arenas et al., 2002), MALLBA (Alba et al., 2002), ECJ (Wilson, 2004), and CIlib (Cloete, Engelbrecht, & Pampar, 2008). EasyLocal++ and Localizer++ have been designed for local search algorithms, while GAlib is a framework for genetic algorithms. MAFRA is a framework for genetic local search algorithms (memetic algorithms). Hotframe is for evolutionary algorithms and metaheuristics which use a single solution instead of a population of solutions. Some of these frameworks, including MALLBA and ECJ, are just for evolutionary algorithms. iOpt is a framework for genetic algorithms that can handle hybrid approaches. CIlib is a framework for swarm intelligence and evolutionary computing algorithms. Many frameworks and reusable codes for Particle Swarm Optimization (PSO) can be found at Particle Swarm Central1. There are also frameworks for several metaheuristics in MetaYourHeuristic (Yin, 2010) that can be used by new adopters in the field of metaheuristics. For multi-objective optimization, software frameworks including PISA (Bleuler et al., 2003) and ParadiseEO2 have been developed; however, ParadiseEO can also handle single-objective problems. There are also several genetic algorithm source codes for single- and multi-objective problems on the Kanpur Genetic Algorithms Laboratory website3.

Step 3.2: Tune Parameters

Parameters are the configurable components of a metaheuristic algorithm. Metaheuristics are sensitive to the values of their parameters. Therefore, in step 3.2 of DIMMA, parameter tuning, also known as parameter setting, should be done. To do this, a number of numerical and/or categorical parameters have to be tuned, and for this the scientific method and statistical analysis could and should be employed (Biratteri, 2009; Talbi, 2009; Barr et al., 1995).
In most research papers in the field of metaheuristics, parameters are tuned by hand in a trial-and-error procedure. This approach has several disadvantages: it is time-consuming and labor-intensive, and it needs a practitioner with special skills (Biratteri, 2009).
The proper values for the parameters depend on three main factors: the problem at hand, the instance of the problem, and the required search time. There are two different strategies for parameter tuning: off-line and on-line strategies.
In the off-line tuning strategy, the parameters are set before the execution of the metaheuristic and are not updated during the execution. In this strategy, one-by-one parameter tuning does not guarantee the optimality of the parameter values, because there is no interaction between parameters. To overcome this problem, Design of Experiments (DoE) can be used (Fisher, 1935; Montgomery, 2005; Frigon, 1997; Antony, 2003; Box, 2005). In DoE approaches, there are some factors (parameters), each of which has different levels (potential values of the parameters). With a full factorial design, the best level of each factor can be obtained, but this takes high computational time. However, there are several DoE approaches which reduce the number of experiments (Montgomery, 2005). DoE refers to the process of planning the experiment so that appropriate data that can be analyzed by statistical methods will be collected, resulting in valid and objective conclusions (Biratteri, 2009).
The three basic principles of DoE are replication, randomization, and blocking. Replication means a repetition of the basic experiment.
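As a concrete illustration of a full factorial design with replication, the sketch below tunes two hypothetical tabu search parameters (tabu tenure and iteration budget). The factor levels, the number of replications, and the `run_metaheuristic` surrogate, a toy cost function standing in for a real tabu search run, are all our own assumptions for illustration, not values prescribed by DIMMA.

```python
import itertools
import random
import statistics

def run_metaheuristic(tenure, iterations, seed):
    """Stand-in for one tabu search run; returns a solution cost.
    Toy surrogate: cost falls with more iterations, is best near tenure 10,
    and carries seeded random noise in place of real run-to-run variation."""
    rng = random.Random(seed)
    return 100 / iterations + abs(tenure - 10) + rng.random()

tenures = [5, 10, 20]            # levels of factor 1 (tabu tenure)
iterations_levels = [100, 200]   # levels of factor 2 (iteration budget)
replications = 5                 # repeat each treatment to estimate error

# Full factorial design: every combination of factor levels is a treatment.
results = {}
for tenure, iters in itertools.product(tenures, iterations_levels):
    costs = [run_metaheuristic(tenure, iters, seed) for seed in range(replications)]
    # Mean response (solution quality) and its spread across replications.
    results[(tenure, iters)] = (statistics.mean(costs), statistics.stdev(costs))

best = min(results, key=lambda treatment: results[treatment][0])
print("best treatment (tenure, iterations):", best)
```

Each (tenure, iterations) pair is a treatment; replicating it over several seeds yields both a mean response and a spread, the robustness indicator discussed in step 2.2. When a full factorial design is too expensive, the fractional designs cited above reduce the number of treatments.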


Replication has two important properties. First, paper for the proposed algorithm in the case
it allows the experimenter to obtain an estimate study section (Montgomery, 2005).
of the experimental error. Second, if the sample Another approach in off-line parameter
mean is used to estimate the effect of a factor tuning, is to formulate tuning of parameters
in the experiment, replication permits the ex- of a metaheuristic as an optimization problem
perimenter to obtain a more precise estimate (Talbi, 2009). This problem can be seen as an
of this effect. Randomization means that both independent problem that by optimizing it, the
the allocation of the experimental material and best values of parameters are obtained. In the
the order in which the individual runs or trials case of off-line parameter tuning, Coy et al.
of the experiment are determined randomly. (2000) proposed a procedure, based on statisti-
Randomization usually makes this assumption cal design of experiments and gradient descent
valid. Blocking is a design technique used to that finds effective settings for parameters found
improve the precision with which comparisons among the factors of interest are made. Blocking is often used to reduce or eliminate the variability transmitted from nuisance factors, that is, factors that may influence the experimental response but in which we are not directly interested (Montgomery, 2005). The important concepts in the DoE approach are the response variable, factor, level, treatment, and effect. The response variable is the measured variable of interest. In the analysis of metaheuristics, the typical measures are solution quality and computation time (Adenso-Díaz & Laguna, 2006). A factor is an independent variable manipulated in an experiment because it is thought to affect one or more of the response variables. The various values at which a factor is set are known as its levels. In metaheuristic performance analysis, the factors include both the metaheuristic tuning parameters and the most important problem characteristics (Birattari, 2009). A treatment is a specific combination of factor levels. The particular treatments depend on the particular experimental design and on the ranges over which the factors are varied. An effect is a change in the response variable due to a change in one or more factors (Ridge, 2007). Design of experiments is a tool that can be used to determine the important parameters and the interactions between them. The four stages of DoE consist of screening and diagnosis of important factors, modeling, optimization, and assessment. This methodology is called sequential experimentation; it is used to set the parameters in the DoE approach and has been applied to parameter setting in heuristics. Adenso-Díaz and Laguna (2006) proposed a procedure for parameter tuning by means of fractional design of experiments and local search. Hutter et al. (2009) proposed an automatic algorithm configuration framework (ParamILS) as a procedure for automatic parameter tuning. Another approach to off-line parameter tuning is machine learning, as can be found in Birattari (2009).
At different times during the search, different parameter values are optimal. Because of this, an important drawback of off-line strategies is that the parameter values are fixed and cannot change during the search. Therefore, online approaches must be designed that update parameters during the search either dynamically (a random or deterministic update of the parameter values that does not take the search process into account) or adaptively (changing the values according to the search progress, using the memory of the search). In adaptive approaches, the parameters are encoded into the representation of solutions; as a result, changing a solution also changes the parameter values. Recent work on parameter tuning for metaheuristic algorithms can be found in Birattari et al. (2002), Bartz-Beielstein (2006), Fukunaga (2008), and Hutter, Hoos, and Stützle (2007).

Step 3.3 Analyze the Performance of the Algorithm

In step 3.3 of DIMMA, we must obtain the experimental results for different indicators to analyze the performance of metaheuristic algo-
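To make the DoE vocabulary above concrete, the sketch below (an illustration of ours, not part of DIMMA; the class and method names are hypothetical) enumerates every treatment of a full factorial design, i.e., every combination of factor levels:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: a treatment is one combination of factor levels,
// so a full factorial design enumerates the cross product of all levels.
public class FactorialDesign {

    // Return every treatment of a full factorial design; levels[f] lists
    // the levels of factor f.
    public static List<int[]> treatments(int[][] levels) {
        List<int[]> result = new ArrayList<>();
        enumerate(levels, 0, new int[levels.length], result);
        return result;
    }

    private static void enumerate(int[][] levels, int factor, int[] current,
                                  List<int[]> out) {
        if (factor == levels.length) {
            out.add(current.clone());
            return;
        }
        for (int level : levels[factor]) {
            current[factor] = level;
            enumerate(levels, factor + 1, current, out);
        }
    }

    public static void main(String[] args) {
        // Two factors with two levels each (e.g., a low and a high value of
        // tabu tenure and of the termination condition) give 2 x 2 = 4 treatments.
        int[][] levels = { { 5, 15 }, { 5000, 40000 } };
        System.out.println(treatments(levels).size()); // prints 4
    }
}
```

A fractional design, such as the one used by Adenso-Díaz and Laguna (2006), would run only a chosen subset of these treatments.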

Copyright © 2010, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global
is prohibited.
66 International Journal of Applied Metaheuristic Computing, 1(4), 57-74, October-December 2010

rithms with statistical tests, such as the t-test, and ANOVA models for comparing more than two algorithms (Cohen, 1995). To use these tests, we must obtain aggregate numbers that summarize the central tendency and dispersion of the results. These statistical tests are used to determine whether an obtained conclusion is due to sampling error or not. The selection of a given statistical hypothesis-testing tool is performed according to the characteristics of the data (Montgomery, 2005). Generally, it is not sufficient to analyze an algorithm on a theoretical basis alone; empirical performance analysis is a necessary task and must be done on a fair basis (Bartz-Beielstein, 2006; Rardin & Uzsoy, 2001; Dean & Voss, 1999). Many trials (at least 10, and more than 100 if possible) must be carried out to derive statistically significant results. From this set of trials, many measures may be computed (Talbi, 2009), such as the mean, median, minimum, maximum, standard deviation, the success rate with which a reference solution (e.g., the global optimum, the best known solution, or a given goal) has been attained, and so on.

Step 3.4 Document the Results

In this step of DIMMA, the results must be documented. The interpretation of the results must be explicit and driven by the defined goals and the chosen performance measures. Generally, graphical tools are more understandable than presenting large amounts of data in tables. Some popular graphical tools include interaction plots, scatter plots, and box plots (Talbi, 2009).
Interaction plots represent the interaction between different factors and their effect on the obtained response (performance measure). A scatter plot illustrates the compromise between various performance indicators; for instance, such plots display quality of solutions versus time, time versus robustness, or robustness versus quality. A box plot illustrates the distribution of the results through their five-number summary: the smallest value, lower quartile (Q1), median (Q2), upper quartile (Q3), and largest value. Box plots are useful for detecting outliers and indicating the dispersion of the output data without any assumptions on the statistical distribution of the data. In addition, scatter plots are useful to illustrate the compromise between different performance indicators (Tufte, 2001).

6. A CASE STUDY

In this section, we design and implement a tabu search metaheuristic for the travelling salesman problem as a case study to illustrate the applicability of DIMMA. The reason for choosing TS and the TSP is that they are well known in the literature.

Initiation Phase

In Step 1.1, we state the TSP. Given a list of m cities and their pairwise distances (or costs), the problem is to find a minimum-length tour that visits each city exactly once. In other words, we want to find a minimum Hamiltonian tour over a set of cities. According to the direction of the arcs between cities, the TSP is divided into two categories: symmetric and asymmetric. In the symmetric TSP, the arcs are undirected, while in the asymmetric TSP, the arcs are directed and the costs depend on the directions. In this section, the symmetric TSP is chosen as the case study.
For clarity, a mathematical formulation of the symmetric TSP is presented here. Let x_j ∈ {0, 1} be the decision variable, where j runs through all arcs A of the undirected graph and c_j is the cost of traveling arc j. To find a tour in this graph, one must select a subset of arcs such that every city is contained in exactly two of the selected arcs. The problem can therefore be formulated as:

min (1/2) Σ_{j=1}^{m} Σ_{k∈J(j)} c_k x_k   (1)

Subject to:

Σ_{k∈J(j)} x_k = 2   ∀ j = 1, ..., m   (2)

Σ_{j∈A(K)} x_j ≤ |K| − 1   ∀ K ⊂ {1, ..., m}   (3)

x_j ∈ {0, 1}   ∀ j ∈ A   (4)

where J(j) is the set of all undirected arcs connected to city j and A(K) is the subset of all undirected arcs connecting the cities in any proper, nonempty subset K of all cities. The objective function (1) minimizes the tour length. Constraint (2) ensures that every city is contained in exactly two of the selected arcs; indeed, this constraint ensures that the selected arcs construct a tour. Constraint (3) is a sub-tour elimination constraint that prevents the construction of sub-tours.
In the next step, the goal of solving the TSP should be defined. Finding a minimum tour in acceptable time is the goal of this case study.
In step 1.3, the TSP instances are selected. We choose instances from TSPLIB (Reinelt, 1991), which is one of the popular sources of TSP instances (Table 1). The selected instances must have enough diversity in terms of size; therefore, we select 12 instances from TSPLIB with different sizes.

Blueprint Phase

In the first step of the blueprint phase, the solution strategy should be selected. Because the TSP is an NP-hard problem (Garey & Johnson, 1979), we use an approximate, metaheuristic algorithm to solve it. In this example, to illustrate DIMMA, we use tabu search (Glover & Laguna, 1997; Gendreau, 2003) as the solution method for the TSP. It should be noted that, given the structure of the TSP, other metaheuristics could also be used to solve it.
The main components of TS are the representation of a solution, the neighborhood structure (move), the tabu list, the tabu tenure, the aspiration criteria, the termination condition, and the intensification and diversification strategies.
In step 2.2, the performance measures for solving the TSP with TS are defined. The stated goal is to find a good solution in a reasonable time. Therefore, the performance measures used for this example are solution quality and CPU time. The global optima are known for the selected instances (Table 1), so for the solution quality measure we use the error rate relative to the global optimum.
In step 2.3, the data structure is selected. To solve the TSP with tabu search, one can use an integer array to represent individuals, which shows the permutation of the cities (Figure 4). To store this array and its length, currentTour[] and currentTourLength are included in the data structure, respectively.
To construct the initial solution, we use the nearest neighbor heuristic (Dorigo & Stützle, 2004). Therefore, we need an array and a variable to store its result: the integer array nearestNeighborArray[] and the integer variable nearest-
NeighborTourLength are used for this purpose.
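The data structures above can be sketched as follows. This is our illustration, not the paper's TSPByTS code: only the array names currentTour[] and nearestNeighborArray[] come from the text, and the class and helper method names are hypothetical.

```java
// Sketch of the tour representation and the nearest-neighbor construction
// described in the text; a tour is a permutation of the city indices.
public class TourUtils {

    // Length of a closed tour over a symmetric distance matrix.
    public static int tourLength(int[] tour, int[][] dist) {
        int length = 0;
        for (int i = 0; i < tour.length; i++) {
            length += dist[tour[i]][tour[(i + 1) % tour.length]];
        }
        return length;
    }

    // Nearest-neighbor construction: start at city 0 and always move to the
    // closest unvisited city.
    public static int[] nearestNeighborTour(int[][] dist) {
        int m = dist.length;
        boolean[] visited = new boolean[m];
        int[] tour = new int[m];
        visited[0] = true;
        for (int i = 1; i < m; i++) {
            int prev = tour[i - 1], next = -1;
            for (int c = 0; c < m; c++) {
                if (!visited[c] && (next == -1 || dist[prev][c] < dist[prev][next])) {
                    next = c;
                }
            }
            tour[i] = next;
            visited[next] = true;
        }
        return tour;
    }
}
```

The nearest-neighbor tour gives the starting point that TS then improves with swap moves.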

Table 1. Problem instances for TSP from TSPLIB

#    Problem name   Number of cities   Global optimum
1    ulysses16      16                 6859
2    ulysses22      22                 7013
3    eil51          51                 426
4    berlin52       52                 7542
5    kroA100        100                21282
6    rd100          100                7910
7    gr202          202                40160
8    a280           280                2579
9    pcb442         442                50778
10   gr666          666                294358
11   pr1002         1002               259045
12   u1060          1060               224094
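For the Euclidean (EUC_2D) instances in Table 1, TSPLIB defines the inter-city distance as the Euclidean distance rounded to the nearest integer (an assumption to check against Reinelt, 1991). A sketch of filling a distance matrix from city coordinates, with names of our own choosing:

```java
// Sketch: build a symmetric integer distance matrix from 2D coordinates,
// rounding each distance to the nearest integer as TSPLIB's EUC_2D norm does.
public class DistanceMatrix {
    public static int[][] build(double[] x, double[] y) {
        int m = x.length;
        int[][] dist = new int[m][m];
        for (int i = 0; i < m; i++) {
            for (int j = 0; j < m; j++) {
                double dx = x[i] - x[j], dy = y[i] - y[j];
                dist[i][j] = (int) Math.round(Math.sqrt(dx * dx + dy * dy));
            }
        }
        return dist;
    }
}
```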


Figure 4. Neighborhood structure for TS to solve TSP

For calculating the tour length, we need the distances between cities, so the two-dimensional integer array distancesMatrix[][] is used to store the distances.
A neighborhood solution can be obtained by swapping the positions of two cities in the representation array (Figure 4). Therefore, the neighborhood of a solution is the set of all solutions that can be obtained by one swap.
The characteristics that must be saved in the tabu list are the pairs of cities that were recently swapped. In this problem, the tabu list is a two-dimensional array, tabuList[][]. In this array, the tabu tenure of swapping two cities i and j is stored in the ith row and jth column (Figure 5a). For example, the value in the 6th row and 9th column of Figure 5a shows the tabu tenure for swapping the 6th and 9th cities; it means that cities 6 and 9 cannot be swapped for the next 2 iterations.
In this example, the aspiration criterion is that if the result of a tabu move is better than the current best-known solution, then the move is allowed. For diversification, we use frequency-based memory (Glover & Laguna, 1997), in which information about swapping frequencies is stored (Figure 5b). For example, the value in the 6th row and 9th column of Figure 5b shows the frequency of swapping cities 6 and 9 over 70 iterations. According to these frequencies, one can assign a penalty to swapping a pair of cities: the higher the frequency, the larger the penalty. In this way, the search is steered toward swaps that are rarely performed. Finally, a number of iterations is chosen as the termination condition; to this end, the integer variable notImproveBestSoFar is included in the data structure.
In step 2.4 of DIMMA, the TS algorithm is designed. We use pseudocode to specify TS (Figure 6).
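The two memories described above can be sketched as one small class. The array names tabuList[][] and the frequency matrix follow the text; the class and method names are ours.

```java
// Sketch of TS short-term and long-term memory for the swap neighborhood:
// tabuList[i][j] holds the iteration until which swapping cities i and j
// stays forbidden; frequency[i][j] counts how often the pair was swapped.
public class TabuMemory {
    private final int[][] tabuList;
    private final int[][] frequency;

    public TabuMemory(int numberOfCities) {
        tabuList = new int[numberOfCities][numberOfCities];
        frequency = new int[numberOfCities][numberOfCities];
    }

    public boolean isTabu(int i, int j, int iteration) {
        return tabuList[i][j] > iteration;
    }

    // Record a swap at the given iteration: forbid reversing it for
    // `tenure` iterations and update the frequency-based memory used
    // for diversification penalties.
    public void recordSwap(int i, int j, int iteration, int tenure) {
        tabuList[i][j] = tabuList[j][i] = iteration + tenure;
        frequency[i][j]++;
        frequency[j][i]++;
    }

    public int swapFrequency(int i, int j) {
        return frequency[i][j];
    }
}
```

With a tenure of 2, a pair swapped at iteration t stays tabu at iterations t + 1 and t + 2, matching the Figure 5a example, unless the aspiration criterion (a move beating the best-known solution) overrides the ban.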

Figure 5. (a) Tabu list for TS to solve TSP, (b) frequency based memory for diversification


Construction Phase

In step 3.1 of DIMMA, the TS algorithm is implemented. For implementing the algorithm, we select Java as the programming language, and we use the NetBeans integrated development environment for coding. We construct 2 classes (TSPByTS and Execution) and 20 methods for our algorithm.
In step 3.2, the parameters of the TS algorithm are tuned. In this example, we use a DoE approach and the Design-Expert statistical software (Vaughn et al., 2000) to obtain optimal values for the parameters. To tune the parameters of the TS algorithm, we classified the instances into two classes according to the number of cities: instances with 100 cities or fewer are placed in class 1, and the other instances in class 2. For the first class, the instances kroA100 and eil51, and for the second class, a280 and pr1002, are chosen as representatives for parameter tuning. In the TS algorithm, solution quality and CPU time are considered as the response variables. The factors, their levels, and the final obtained values for the class 1 and class 2 instances are summarized in Table 2 and Table 3. Each block is considered with 16 treatments and main effects.

Figure 6. The TS pseudocode for TSP
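The pseudocode of Figure 6 is not reproduced in this text. The skeleton below is our hedged reconstruction of the loop the preceding steps describe (best swap move, tabu check with aspiration, termination after a number of non-improving iterations); it is a sketch, not the authors' TSPByTS code, and it omits the frequency-based diversification.

```java
// Reconstruction sketch of a tabu search loop for the TSP with a swap
// neighborhood, an aspiration criterion, and iteration-based termination.
public class TabuSearchTsp {

    static int tourLength(int[] tour, int[][] d) {
        int len = 0;
        for (int i = 0; i < tour.length; i++)
            len += d[tour[i]][tour[(i + 1) % tour.length]];
        return len;
    }

    public static int[] solve(int[][] d, int tenure, int maxNoImprove) {
        int m = d.length;
        int[] current = new int[m];
        for (int i = 0; i < m; i++) current[i] = i;   // trivial initial tour
        int[] best = current.clone();
        int bestLen = tourLength(best, d);
        int[][] tabu = new int[m][m];                 // expiry iteration per pair

        int iter = 0, noImprove = 0;
        while (noImprove < maxNoImprove) {
            iter++;
            int bi = -1, bj = -1, bestMoveLen = Integer.MAX_VALUE;
            // evaluate every swap of two tour positions
            for (int i = 0; i < m - 1; i++) {
                for (int j = i + 1; j < m; j++) {
                    swap(current, i, j);
                    int len = tourLength(current, d);
                    boolean isTabu = tabu[current[i]][current[j]] > iter;
                    // aspiration: accept a tabu move that beats the best so far
                    if ((!isTabu || len < bestLen) && len < bestMoveLen) {
                        bestMoveLen = len; bi = i; bj = j;
                    }
                    swap(current, i, j);              // undo trial move
                }
            }
            if (bi < 0) break;                        // every move tabu, none aspirates
            swap(current, bi, bj);
            tabu[current[bi]][current[bj]] = tabu[current[bj]][current[bi]] = iter + tenure;
            if (bestMoveLen < bestLen) {
                bestLen = bestMoveLen;
                best = current.clone();
                noImprove = 0;
            } else {
                noImprove++;
            }
        }
        return best;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}
```

On a small symmetric instance, solve(distances, tenure, maxNoImprove) returns the best tour found.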

Table 2. Factors, their levels, and the final obtained values for instances of class 1

Parameter                    Lowest value   Highest value   Final value
Tabu tenure                  5              15              10
Termination condition*       5000           40000           30000
Diversification condition*   1000           10000           3000

* The number of iterations for which the best-so-far solution is not improved.

Table 3. Initial, ranges, and final values for instances of class 2

Parameter                    Initial value   Range of change   Final value
Tabu tenure                  5               15                14
Termination condition*       1000            5000              3000
Diversification condition*   1000            5000              2000

* The number of iterations for which the best-so-far solution is not improved.


Table 4. Results for instances of class 1

Test problem   Global optimum   Avg. tour length   Avg. relative gap   Avg. CPU time (ms)   Best solution   Relative gap (best)
ulysses16      6859             6859.00            0.0000              191.5                6859.0          0.0000
ulysses22      7013             7013.00            0.0000              449.6                7013.0          0.0000
berlin52       7542             7744.05            0.0267              5039.2               7544.3          0.0003
kroA100        21282            22170.47           0.0417              40847.1              22104.1         0.0492
rd100          7910             8232.51            0.0408              9111.1               8217.6          0.0389
Average                                            0.0218                                                   0.0177

Table 5. Results for instances of class 2

Test problem   Global optimum   Avg. tour length   Avg. relative gap   Avg. CPU time (ms)   Best solution   Relative gap (best)
gr202          40160            43304.29           0.0783              9419.4               41787.7         0.0405
pcb442         50778            58197.84           0.1461              60771.5              57386.2         0.1301
gr666          294358           338786.10          0.1509              99886.7              336833.6        0.1443
u1060          224094           256958.92          0.1467              159526.2             252441.6        0.1265
Average                                            0.1305                                                   0.1104
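The relative-gap columns in Tables 4 and 5 are computed as (obtained solution − global optimum) / global optimum. As a one-line sketch (the class and method names are ours):

```java
// Direct transcription of the relative-gap formula behind Tables 4 and 5.
public class Gap {
    public static double relativeGap(double obtained, double globalOptimum) {
        return (obtained - globalOptimum) / globalOptimum;
    }

    public static void main(String[] args) {
        // berlin52: best found 7544.3 against the global optimum 7542
        System.out.println(relativeGap(7544.3, 7542.0)); // about 3.0e-4
    }
}
```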

Figure 8. Global optimum and obtained solutions in each instances of class 2


In step 3.3, the performance of the TS algorithm is analyzed. We run the algorithm ten times with the tuned parameters. The results are summarized in Table 4 and Table 5. The relative gap is calculated as follows:

relative gap = (obtained solution − global optimum) / global optimum

Step 3.4 of DIMMA is documentation, which can be done by means of graphical tools. Figure 7 shows a sample illustrating the performance of TS for the TSP over time for problem eil51. Figure 8 compares the obtained solutions with the global optimum for each instance; the figure illustrates that the algorithm can reach near-optimal solutions.

CONCLUSION

Given the significant growth in the use of metaheuristics as optimization tools, there should be a standard methodology for implementing them. Such a methodology is used to record experience and allows projects to be replicated. Moreover, this standard process can be a comfort factor for new adopters with little metaheuristic background.
We have proposed DIMMA as a methodology for designing and implementing metaheuristic algorithms. The proposed methodology includes a series of phases, activities, disciplines, and principles for designing and implementing a specific metaheuristic for a given optimization problem. In other words, DIMMA is a methodology used to standardize the process of designing and implementing a metaheuristic.
We hope the proposed methodological approach to the design and implementation of metaheuristics will draw more researchers to the standardization of metaheuristic development.

REFERENCES

Achterberg, T., Koch, T., & Martin, A. (2003). The mixed integer programming library: MIPLIB. Retrieved from http://miplib.zib.de

Adenso-Díaz, B., & Laguna, M. (2006). Fine-tuning of algorithms using fractional experimental design and local search. Operations Research, 54(1), 99–114. doi:10.1287/opre.1050.0243

Alba, E., Almeida, F., Blesa, M., Cotta, C., Díaz, M., Dorta, I., León, C., Moreno, L., Petit, J., Roda, J., Rojas, A., & Xhafa, F. (2002). MALLBA: A library of skeletons for combinatorial optimization. In B. Monien & R. Feldman (Eds.), Euro-Par 2002 Parallel Processing Conference (LNCS 2400, pp. 927-932). Berlin: Springer.

Alba, E., & Luque, G. (2005). Measuring the performance of parallel metaheuristics. In Alba, E. (Ed.), Parallel metaheuristics: A new class of algorithms (pp. 43–60). New York: John Wiley & Sons.

Antony, J. (2003). Design of experiments for engineers and scientists. Burlington, MA: Butterworth-Heinemann.

Arenas, M. G., Collet, P., Eiben, A. E., Jelasity, M., Merelo, J. J., Paechter, B., et al. (2002). A framework for distributed evolutionary algorithms. In Parallel Problem Solving from Nature Conference (PPSN VII) (LNCS 2439, pp. 665-675). Berlin: Springer.

Barr, R. S., Golden, B. L., Kelly, J. P., Resende, M. G. C., & Stewart, W. R. (1995). Designing and reporting computational experiments with heuristic methods. Journal of Heuristics, 1(1), 9–32. doi:10.1007/BF02430363

Bartz-Beielstein, T. (2006). Experimental research in evolutionary computation. New York: Springer.

Beasley, J. E. (1990). OR-Library: Distributing test problems by electronic mail. The Journal of the Operational Research Society, 41(11), 1069–1072.

Birattari, M., Stützle, T., Paquete, L., & Varrentrapp, K. (2002). A racing algorithm for configuring metaheuristics. In W. B. Langdon et al. (Eds.), Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2002) (pp. 11-18). San Francisco: Morgan Kaufmann Publishers.


Birattari, M. (2009). Tuning metaheuristics: A machine learning perspective. Heidelberg, Germany: Springer.

Bleuler, S., Laumanns, M., Thiele, L., & Zitzler, E. (2003). PISA: A platform and programming language independent interface for search algorithms. In Proceedings of the Conference on Evolutionary Multi-Criterion Optimization (EMO'03), Faro, Portugal (pp. 494-508).

Box, G., Hunter, J. S., & Hunter, W. G. (2005). Statistics for experimenters: Design, innovation, and discovery. New York: Wiley.

Cloete, T., Engelbrecht, A. P., & Pampar, G. (2008). CIlib: A collaborative framework for computational intelligence algorithms – Part I. Retrieved from http://www.cilib.net/

Cohen, P. R. (1995). Empirical methods for artificial intelligence. Cambridge, MA: MIT Press.

Collette, Y., & Siarry, P. (2005). Three new metrics to measure the convergence of metaheuristics towards the Pareto frontier and the aesthetic of a set of solutions in biobjective optimization. Computers & Operations Research, 32, 773–792. doi:10.1016/j.cor.2003.08.017

Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2002). Introduction to algorithms. Cambridge, MA: MIT Press.

Coy, S., Golden, B. L., Runger, G. C., & Wasil, E. A. (2000). Using experimental design to find effective parameter settings for heuristics. Journal of Heuristics, 7, 77–97. doi:10.1023/A:1026569813391

Crainic, T. G., & Toulouse, M. (1998). Parallel metaheuristics. In Crainic, T. G., & Laporte, G. (Eds.), Fleet management and logistics (pp. 205–235). Norwell, MA: Kluwer Academic Publishers.

Crainic, T. G., & Toulouse, M. (2003). Parallel strategies for meta-heuristics. In Glover, F., & Kochenberger, G. (Eds.), Handbook of metaheuristics (pp. 475–514). Norwell, MA: Kluwer Academic Publishers.

Dorigo, M., & Stützle, T. (2004). Ant colony optimization. Cambridge, MA: MIT Press.

Dréo, J., Pétrowski, A., Siarry, P., & Taillard, E. (2006). Metaheuristics for hard optimization. Heidelberg, Germany: Springer.

El-Abd, M., & Kamel, M. (2005). A taxonomy of cooperative search algorithms. In Blesa, M. J., Blume, C., Roli, A., & Sampels, M. (Eds.), Hybrid metaheuristics (pp. 32–42). Heidelberg, Germany: Springer. doi:10.1007/11546245_4

Fink, A., Voss, S., & Woodruff, D. L. (1999). Building reusable software components for heuristic search. In Kall, P., & Lüthi, H. J. (Eds.), Operations Research Proceedings (pp. 210–219). Heidelberg, Germany: Springer.

Fischetti, M., Salazar González, J. J., & Toth, P. (1997). A branch-and-cut algorithm for the symmetric generalized traveling salesman problem. Operations Research, 45(3), 378–394. doi:10.1287/opre.45.3.378

Fisher, R. A. (1935). The design of experiments. Edinburgh, UK: Oliver and Boyd.

Frigon, N. L., & Mathews, D. (1997). Practical guide to experimental design. New York: Wiley.

Fukunaga, A. (2008). Automated discovery of local search heuristics for satisfiability testing. Evolutionary Computation, 16(1), 31–61. doi:10.1162/evco.2008.16.1.31

Garey, M. R., & Johnson, D. S. (1979). Computers and intractability. San Francisco: Freeman and Co.

Di Gaspero, L., & Schaerf, A. (2001). EasyLocal++: An object-oriented framework for the design of local search algorithms and metaheuristics. In Proceedings of the MIC'2001 4th Metaheuristics International Conference, Porto, Portugal (pp. 287-292).

Gendreau, M. (2003). An introduction to tabu search. In Glover, F., & Kochenberger, G. A. (Eds.), Handbook of metaheuristics (pp. 37–54). Norwell, MA: Kluwer Academic Publishers.

Gendreau, M., Laporte, G., & Semet, F. (1998). A tabu search heuristic for the undirected selective travelling salesman problem. European Journal of Operational Research, 106(2-3), 539–545. doi:10.1016/S0377-2217(97)00289-0

Glover, F., & Laguna, M. (1997). Tabu search. Dordrecht, The Netherlands: Kluwer Academic Publishers.

Goldberg, D. E. (1989). Genetic algorithms in search, optimization and machine learning. Reading, MA: Addison-Wesley.


Hartmann, S., & Kolisch, R. (2000). Experimental evaluation of state-of-the-art heuristics for the resource-constrained project scheduling problem. European Journal of Operational Research, 127(2), 394–407. doi:10.1016/S0377-2217(99)00485-3

Hutter, F., Hoos, H. H., Leyton-Brown, K., & Stützle, T. (2009). ParamILS: An automatic algorithm configuration framework. Journal of Artificial Intelligence Research, 36, 267–306.

Hutter, F., Hoos, H. H., & Stützle, T. (2007). Automatic algorithm configuration based on local search. In Proceedings of the AAAI Conference on Artificial Intelligence (pp. 1152-1157).

Kirkpatrick, S., Gelatt, C., & Vecchi, M. (1983). Optimization by simulated annealing. Science, 220, 671–680. doi:10.1126/science.220.4598.671

Krasnogor, N., & Smith, J. (2000). MAFRA: A Java memetic algorithms framework. In Freitas, A. A., Hart, W., Krasnogor, N., & Smith, J. (Eds.), Data Mining with Evolutionary Algorithms (pp. 125–131). Las Vegas, NV.

Kroll, P., & Kruchten, P. (2003). The Rational Unified Process made easy. Reading, MA: Addison-Wesley.

MacAllister, W. (2009). Data structures and algorithms using Java. New York: Jones & Bartlett Publishers.

Merriam-Webster. (1997). Merriam-Webster's Collegiate Dictionary. Merriam-Webster.

Michel, L., & Van Hentenryck, P. (2001). Localizer++: An open library for local search (Tech. Rep. No. CS-01-02). Providence, RI: Department of Computer Science, Brown University.

Montgomery, D. (2005). Design and analysis of experiments. New York: Wiley.

Morago, R. J., DePuy, G. W., & Whitehouse, G. E. (2006). A solution methodology for optimization problems. In A. B. Badiru (Ed.), Metaheuristics (pp. 1-10, 13). New York: Taylor & Francis Group.

Puntambekar, A. A. (2009). Analysis of algorithm and design. Pune, India: Technical Publications.

Rardin, R. L., & Uzsoy, R. (2001). Experimental evaluation of heuristic optimization. Journal of Heuristics, 7(3), 261–304. doi:10.1023/A:1011319115230

Reinelt, G. (1991). TSPLIB: A traveling salesman problem library. ORSA Journal on Computing, 3, 376-384. Retrieved from http://softlib.rice.edu/softlib/tsplib/

Ridge, E. (2007). Design of experiments for the tuning of optimization algorithms. Unpublished doctoral dissertation, Department of Computer Science, University of York, UK.

Siau, K., & Halpin, T. (2001). Unified Modeling Language: Systems analysis, design and development issues. Hershey, PA: IGI Global.

Silberholz, J., & Golden, B. (2010). Comparison of metaheuristics. In Gendreau, M., & Potvin, J.-Y. (Eds.), Handbook of metaheuristics. Heidelberg, Germany: Springer. doi:10.1007/978-1-4419-1665-5_21

Skiena, S. S., & Revilla, M. A. (2003). Programming challenges: The programming contest training manual. New York: Springer.

Stadler, P., & Schnabl, W. (1992). The landscape of the traveling salesman problem. Physics Letters A, 161, 337–344. doi:10.1016/0375-9601(92)90557-3

Stadler, P. F. (1995). Towards a theory of landscapes. In R. López-Peña, R. Capovilla, R. García-Pelayo, H. Waelbroeck, & F. Zertuche (Eds.), Complex Systems and Binary Networks (Vol. 461, pp. 77-163). Berlin: Springer.

Talbi, E. (2006). Parallel combinatorial optimization. Hoboken, NJ: John Wiley & Sons. doi:10.1002/0470053925

Talbi, E. (2009). Metaheuristics: From design to implementation. Hoboken, NJ: John Wiley & Sons.

Thierens, D. (2008). From multi-start local search to genetic local search: A practitioner's guide. In Proceedings of the 2nd International Conference on Metaheuristics and Nature Inspired Computing (META'08), Hammamet, Tunisia.

Tufte, E. R. (2001). The visual display of quantitative information (2nd ed.). Cheshire, CT: Graphics Press.

Vaughn, N., Polnaszek, C., Smith, B., & Helseth, T. (2000). Design-Expert 6 user's guide. Stat-Ease Inc.

Voss, S., & Woodruff, D. L. (2002). Optimization software class libraries. Norwell, MA: Kluwer.

Voudouris, C., Dorne, R., Lesaint, D., & Liret, A. (2001). iOpt: A software toolkit for heuristic search methods. In Proceedings of the International Conference on Principles and Practice of Constraint Programming (LNCS 2239, pp. 716-729). Berlin: Springer.


Wall, M. (1996). GAlib: A C++ library of genetic algorithm components (Tech. Rep.). Mechanical Engineering Department, Massachusetts Institute of Technology.

Wang, Y. (2010). A sociopsychological perspective on collective intelligence in metaheuristic computing. International Journal of Applied Metaheuristic Computing, 1(1), 110–128.

Wilson, G. C., McIntyre, A., & Heywood, M. I. (2004). Resource review: Three open source systems for evolving programs—Lilgp, ECJ and grammatical evolution. Genetic Programming and Evolvable Machines, 5(19), 103–105. doi:10.1023/B:GENP.0000017053.10351.dc

Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1), 67–82. doi:10.1109/4235.585893

Yin, P. Y. (2010). MetaYourHeuristic V. 1.3. Intelligence Computing Laboratory, National Chi Nan University, Taiwan. Retrieved from http://intelligence.im.ncnu.edu.tw

ENDNOTES

1  http://www.particleswarm.info/Programs.html
2  http://paradiseo.gforge.inria.fr
3  http://www.iitk.ac.in/kangal/codes.shtml

Masoud Yaghini is Assistant Professor in the Department of Rail Transportation Engineering, School of Railway Engineering, Iran University of Science and Technology. His research interests include data mining, optimization, metaheuristic algorithms, and the application of data mining and optimization techniques in rail transportation planning. He has published several books and papers in the fields of data mining, metaheuristics, and rail transportation planning. He teaches postgraduate courses in data mining, advanced operations research, and metaheuristic algorithms.

Mohammad Rahim Akhavan Kazemzadeh is an MSc student in Rail Transportation Engineering, School of Railway Engineering, Iran University of Science and Technology. His research interests are metaheuristic optimization methods, parameter tuning of metaheuristics, multicommodity network design problems, and optimization in rail transportation problems. He has a forthcoming book in the field of metaheuristics and published papers in the fields of network design and metaheuristics.
