
Genetic algorithms and grouping problems [Book Review]

2001, IEEE Transactions on Evolutionary Computation


IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 3, JUNE 2001

Book Reviews

Genetic Algorithms and Grouping Problems—Emanuel Falkenauer (Chichester, U.K.: Wiley, 1998, 238 pp., $110)

Reviewed by Colin R. Reeves

Emanuel Falkenauer is well known for his work on using genetic algorithms (GAs) to solve hard practical combinatorial problems connected with what he calls "grouping." Perhaps the simplest example is the bin-packing problem (BPP): given a set of objects and a set of bins of equal capacity, how do we assign objects to bins so as to minimize the number of bins used? This book is an extended account, in five parts, of how this and related problems can be tackled with a GA.

Chapter 1 introduces the concepts of a (combinatorial) problem, an algorithm, and the time complexity of problems and algorithms. A simple account of the classes P and NP follows, motivating the idea that we need heuristics, which are illustrated by the first-fit and first-fit decreasing heuristics for the BPP. This leads into a discussion of metaheuristics, including GAs, and the no-free-lunch theorem. The presentation is for the most part clear, marred only by a few simple omissions, such as the use of technical terms (e.g., NP-complete) without definition. (There is an appendix later that deals with such matters, but it is not referenced.) More seriously, the impression is given that tabu search is stochastic (it can be, but it is usually regarded as deterministic), and the discussion of simulated annealing implies probabilities greater than one.

The next chapter, the longest, deals with the basics of GAs. Some time is spent on the nature of the phenotype-to-genotype mapping (considering the mapping from genotype to phenotype is perhaps more usual, but that is a minor detail), and especially on the undesirable property that many chromosomes represent the same solution.
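As an aside, the first-fit decreasing heuristic for the BPP mentioned above is easy to state concretely. The sketch below is my own minimal Python rendering for illustration; the function and variable names are mine, not the book's:

```python
def first_fit_decreasing(items, capacity):
    """Greedy bin packing: sort item sizes largest-first, then place
    each item in the first open bin with enough remaining capacity,
    opening a new bin when none fits."""
    free = []     # remaining free space in each open bin
    packing = []  # item sizes assigned to each bin
    for item in sorted(items, reverse=True):
        for i, space in enumerate(free):
            if item <= space:
                free[i] -= item
                packing[i].append(item)
                break
        else:  # no open bin can take the item: open a new one
            free.append(capacity - item)
            packing.append([item])
    return packing

print(first_fit_decreasing([7, 5, 4, 3, 2, 2], capacity=10))
# → [[7, 3], [5, 4], [2, 2]]
```

Sorting largest-first before the greedy pass is what distinguishes first-fit decreasing from plain first-fit, and it usually yields noticeably fewer bins.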
Although Falkenauer is not alone in calling this many-to-one property "redundancy," Radcliffe and Surry [9] have pointed out that a better term would be "degeneracy." The basic operators—selection, recombination, inversion, and mutation—are then discussed, along with some variations. He appears to recommend roulette-wheel selection because it is noisy, whereas many researchers would argue that this is precisely a reason for using (say) stochastic universal selection [1] instead. The discussion of tournament selection is rather unclear, but the other operators are described adequately. The second part of the chapter develops the traditional schema-processing arguments: k-armed bandits, "exponential" growth of schemata, implicit parallelism, and so on, as if nothing had changed since 1975. The statement of the schema theorem (p. 69) perpetuates the original error of failing to stress its stochastic and dynamic nature, which is disappointing in a book published in 1998. True, he does later offer a critique of schema-based analyses, but its impact is somewhat reduced by his earlier poor use of notation.

The second part of the book is its heart: chapters 3, 4, and 5 describe the class of grouping problems, the difficulty of solving them with traditional GAs, and the author's own solution, the "grouping GA" (GGA). This is a simple but highly effective idea: encode the groups explicitly, as well as the string of object-to-group assignments, and apply the "genetic" operators to the groups rather than to the strings. Falkenauer worries a little as to whether what results is still a GA, but perhaps this is because of the primacy of schema thinking in his concept of a GA.

[Footnote: Manuscript received March 5, 2001. The reviewer is with the School of Mathematical and Information Sciences, Coventry University, Coventry CV1 5FB, U.K. Publisher Item Identifier S 1089-778X(01)05208-0.]

While he mentions Radcliffe's work [5]–[8]
in passing, I feel he could have made much greater use of Radcliffe's concepts of transmission, respect, and assortment as an underpinning for his own developments. In my view, he rather underplays the GGA's need for an effective "repair" algorithm following "crossover" and for a suitable fitness function. As in any GA, the credit for good performance has to be shared among several components, and both of these are clearly very important ingredients in the GGA.

Regardless of the "theoretical" status of the GGA, the approach is shown to be highly effective in part 3, where the results of experimental work on several grouping problems are presented. The GGA is shown to outperform conventional GAs on the BPP, assembly line balancing, the economies-of-scale problem (ESP), conceptual clustering, and the "equal piles" problem. In the case of the BPP, the GGA is also shown to be more effective than branch-and-bound algorithms, which is impressive. However, in other cases, no comparisons with other approaches are reported. This seems especially unfortunate in the case of the ESP, which clearly has strong similarities to well-known problems such as generalized assignment and set covering, both of which have proved amenable to Lagrangean relaxation methods. To some extent, the question of comparison is covered by the use of artificially generated problem instances with known optima or tight bounds (which the GGA usually achieves), but as Gent points out [2], instances generated in this way may have built-in properties that make them easy to solve by rather simple means, quite apart from a GA-based method.

Part 4 is a very brief summary and review, while part 5 is an appendix. The first chapter of the appendix is a more detailed description of the issues of complexity, P and NP, etc. Such questions are not the easiest to understand, but Falkenauer's discussion is clear and concise.
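To fix ideas, the group-level crossover and repair step that the GGA relies on might be sketched as follows. This is my own minimal illustration of the general idea, not Falkenauer's published operator; the names (`group_crossover`, `sizes`, and so on) are mine. A child inherits whole bins from one parent, drops the other parent's bins that now duplicate objects, and repairs by reinserting the displaced objects first-fit:

```python
import random

def group_crossover(parent_a, parent_b, sizes, capacity, rng=random):
    """GGA-style crossover sketch for bin packing: inject some of
    parent_b's bins into parent_a, drop parent_a bins that overlap
    the injected objects, and repair by reinserting the displaced
    objects first-fit. Parents are lists of bins (lists of object ids);
    sizes maps object id -> size."""
    injected = [list(g) for g in
                rng.sample(parent_b, k=max(1, len(parent_b) // 2))]
    covered = {obj for g in injected for obj in g}
    kept = [list(g) for g in parent_a if not covered & set(g)]
    displaced = [obj for g in parent_a if covered & set(g)
                 for obj in g if obj not in covered]
    child = injected + kept
    for obj in displaced:  # repair: first-fit reinsertion
        for g in child:
            if sum(sizes[o] for o in g) + sizes[obj] <= capacity:
                g.append(obj)
                break
        else:
            child.append([obj])
    return child

sizes = {0: 4, 1: 4, 2: 3, 3: 3, 4: 2, 5: 2}
parent_a = [[0, 2], [1, 3], [4, 5]]
parent_b = [[0, 1], [2, 3], [4, 5]]
print(group_crossover(parent_a, parent_b, sizes, 10,
                      rng=random.Random(42)))
```

A real GGA also needs group-oriented mutation and a fitness function defined over groups; this sketch only shows why "crossover" on groups must be followed by the repair step the review highlights.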
My only complaint is that this appendix material would have been more useful at the start, rather than being relegated to an appendix. The final chapter is a personal plea against genetic research, which is clearly something about which the author has strong feelings; however, it seems a rather bizarre inclusion in this book.

There are more typos, misspellings, and grammatical solecisms than I would expect to see in a quality publication: "mimick" (p. 16), "recombinating" (p. 32), "layouting" (p. 83), "isolatedly" (p. 83), "bouncing of[f]" (p. 166), and "neronal" (p. 200) are some of the more obvious ones.

All in all, this is a rather quirky introduction to GAs. It has some good things in it, but as a guide to GAs overall, Goldberg [3] is clearer and, although Falkenauer's critique of the schema theorem is a useful if incomplete corrective to Goldberg's account, Mitchell [4] does it better. It is also disappointing that he has not really engaged with Radcliffe's major contribution to exactly the types of problems he is solving. However, I wholeheartedly endorse his major theme: that GAs should not be used as a black box, but should make use of problem-specific knowledge in the choice of representation and operators. His own implementations are an object lesson in how to do this successfully, not just for toy problems but in the real world. For that reason, the book deserves a place on the practitioner's bookshelf.

REFERENCES

[1] J. E. Baker, "Reducing bias and inefficiency in the selection algorithm," in Proc. 2nd Int. Conf. Genetic Algorithms, J. J. Grefenstette, Ed. Hillsdale, NJ: Lawrence Erlbaum, 1987, pp. 14–21.
[2] I. Gent, "Heuristic solution of open bin packing problems," J. Heuristics, vol. 4, pp. 299–304, 1998.
[3] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning. Reading, MA: Addison-Wesley, 1989.
[4] M. Mitchell, An Introduction to Genetic Algorithms. Cambridge, MA: MIT Press, 1996.
[5] N. J. Radcliffe, "Forma analysis and random respectful recombination," in Proc. 4th Int. Conf. Genetic Algorithms, R. K. Belew and L. B. Booker, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 222–229.
[6] N. J. Radcliffe, "Equivalence class analysis of genetic algorithms," Complex Syst., vol. 5, pp. 183–205, 1991.
[7] N. J. Radcliffe, "Non-linear genetic representations," in Parallel Problem-Solving from Nature, 2, R. Männer and B. Manderick, Eds. Amsterdam, The Netherlands: Elsevier, 1992, pp. 259–268.
[8] N. J. Radcliffe and F. A. W. George, "A study in set recombination," in Proc. 5th Int. Conf. Genetic Algorithms, S. Forrest, Ed. San Mateo, CA: Morgan Kaufmann, 1993, pp. 23–30.
[9] N. J. Radcliffe and P. Surry, "Formae and the variance of fitness," in Foundations of Genetic Algorithms 3, D. Whitley and M. Vose, Eds. San Mateo, CA: Morgan Kaufmann, 1995, pp. 51–72.

The Second NASA/DoD Workshop on Evolvable Hardware—Jason Lohn, Adrian Stoica, Didier Keymeulen, Silvano Colombano (Los Alamitos, CA: IEEE Comput. Soc. Press, 2001)

Reviewed by Hugo de Garis

The second annual NASA/DoD Workshop on Evolvable Hardware (EH-2000) took place on July 13–15, 2000, in Palo Alto, CA, sponsored by the National Aeronautics and Space Administration (NASA) and the Defense Advanced Research Projects Agency (DARPA) and cohosted by the NASA Ames Information Sciences and Technology Directorate, the Jet Propulsion Laboratory (JPL) Center for Integrated Space Microsystems (CISM), and the JPL Center for Space Microelectronics Technology (CSMT), with the cooperation of various other divisions of NASA Ames. The very fact that these evolvable hardware (EH) workshops are now an annual event shows clearly that the field has become solidly established in the U.S., after America's initial four-year delay behind Switzerland.

The Workshop

About 90 people from 11 different countries attended the workshop.
(This was down from the 130 who attended the prior workshop in 1999.) The three-day workshop was held fully in plenary session. The invited talks were spread over the three days, but most occurred during the first morning. Once the invited talks concluded, most of which will be highlighted below, the submitted papers were divided into the following eight topics, which give an indication of how the EH field is evolving in human terms.

1) Algorithms.
2) From Biology to Robotics.
3) Evolvability.
4) Evolution of Analog and Mixed Signal Circuits.
5) Evolution of Digital Functions.
6) Reconfiguration Architecture and Devices.
7) Evolution of CA and Brain-Inspired Architecture.
8) Real World Applications.

Highlights of the Workshop

First, I will cover the invited speakers.

1) Steve Zornetzer (NASA)—"NASA's future will depend on EH"

What struck me initially about the EH-2000 workshop was the very impressive public relations job its organizers had done on their management in the year since the first workshop. In particular, Stoica from JPL and Lohn from NASA Ames (two of the driving forces of EH in the U.S.) and others had persuaded their superiors of the importance of this new research field for the future of NASA and for the future of space travel in general. For example, Dr. Zornetzer, Director of the NASA Ames Information Systems and Technology Directorate, made a very strong statement in his kickoff speech: "Maybe NASA's future will depend on evolvable hardware." Zornetzer has a reputation for supporting very innovative research, such as nanotechnology, which he backed several years before it became mainstream.

His brief introductory talk asked the question, "Why is NASA interested in EH?" I found his answers very persuasive and promising for the future development of EH in the U.S. in general and for space in particular. He said that future planetary and deep-space exploration over the next 20–30 years will demand that space vehicles have robust system architectures and be dynamically reconfigurable. Current technologies cannot cope, he said, because they are just not robust enough. Future space missions will send probes into the saline oceans under Europa's ice cap. How can such an unpredictable environment be explored? Obviously, with autonomous real-time robots that are robust, dynamically reconfigurable, and not brittle. It is impractical for NASA to send up triply redundant systems to ensure reliability, because the weight makes that too expensive. Future architectures need to be adaptive to uncertain environments, and EH may have a major role to play in this regard.

Zornetzer's vision is to develop three broad areas of competence: 1) space technology; 2) nanotech; and 3) information technology. He has set up a newly funded intelligent systems (IS) program ($70 million/year) and a space biology program, which researches biologically based systems for space. He has also set up a nanotech initiative. IS includes automated reasoning, human-centered computing, and revolutionary computing (e.g., quantum computing, neural nets, EH, etc.). Space biology includes understanding biological principles, the functioning of nanotubes, EH, etc. The nanotech initiative is seen as a requirement for future NASA missions, to revolutionize electronics, for autonomous microrovers, mini-helicopters, etc. The IS initiative received 500–600 proposals. The space biology solicitation went out in January 2001, and the nanotech solicitation will be announced in 2001. There are also other funding programs that are unsolicited.

2) Nikzad Toomarian (JPL)—"EH for 100+ year space probe survival"

The second institutional speaker was Toomarian of JPL, who made rather similar remarks to Zornetzer's, such as "EH is needed for deep space exploration in extreme environments (−150–300 °C)." He stressed that the various planned space missions, such as the Pluto Express (launch date 2004, with a flight time of 8–9 years) and later interstellar explorations, will need to emphasize long-term survivability and evolvability. He wants to see future space hardware systems based on nature's adaptability that can reevolve themselves in seconds. He wants EH for 100+ years of survival of space systems with low power and high intelligence.

3) Carver Mead—"I've been doing EH all my life"

Mead, who is in his mid-60s, is famous in electronics circles. He virtually invented most of its modern approaches. He has had several research careers in four fields: 1) device physics (smaller transistors, metal–oxide–semiconductor field-effect transistors); 2) very large scale integration design methodology (silicon boundaries); 3) biological electronics and neuromorphic scaling behavior (silicon cochlea and retina); and 4) collective electrodynamics (with a new book to come).

[Footnote: Manuscript received April 5, 2001. The reviewer is with the Brain Builder Group, STARLAB, Uccle, Brussels B-1180, Belgium (e-mail: [email protected]). Publisher Item Identifier S 1089-778X(01)05207-9.]

1089–778X/01$10.00 © 2001 IEEE