
Intro to Genetic Algorithms

Lecture 3
I400/I590 Artificial Life as an Approach to Artificial Intelligence
Larry Yaeger
Professor of Informatics, Indiana University

What Are Genetic Algorithms?


- A way to employ evolution in the computer
- A search and optimization technique based on variation and selection

Optimization Techniques
Analytical
- Given y = f(x), take the derivative of f with respect to x, set the result to zero, and solve for x: dy/dx = 0

[Figure: plots of y versus x, with extrema where the derivative is zero]

- Works perfectly, but only for simple, analytical functions
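As a quick worked sketch (not from the slides; the function and the use of SymPy are illustrative choices), this is the derivative-and-solve procedure in code:

```python
import sympy as sp

x = sp.symbols('x')
y = -(x - 3)**2 + 5                       # a simple analytic function
critical_points = sp.solve(sp.diff(y, x), x)  # solve dy/dx = -2(x - 3) = 0
print(critical_points)                    # [3], the maximum of y
```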

Optimization Techniques
Gradient-based, or hill-climbing
- Given y = f(x):
  - pick a point x0
  - compute the gradient ∇f(x0)
  - step along the gradient to obtain x1 = x0 + η∇f(x0), where η is the step size
  - repeat until an extremum is reached
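A minimal sketch of this loop in Python (illustrative; the step size eta, tolerance, and example function are arbitrary choices, not from the slides):

```python
# Gradient ascent on y = f(x), given the derivative f'(x).
def gradient_ascent(f_prime, x0, eta=0.1, tol=1e-6, max_steps=10_000):
    x = x0
    for _ in range(max_steps):
        step = eta * f_prime(x)   # x_{n+1} = x_n + eta * f'(x_n)
        x += step
        if abs(step) < tol:       # derivative ~ 0: at (or near) an extremum
            break
    return x

# Example: maximize y = -(x - 3)^2, whose derivative is -2(x - 3).
print(gradient_ascent(lambda x: -2 * (x - 3), x0=0.0))  # ~3.0
```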

Optimization Techniques
Gradient-based, or hill-climbing

Requires existence of derivatives, and easily gets stuck on local extrema

Optimization Techniques
Enumerative
- Test every point in the space, in order

Random
- Test every point in the space, in random order

Optimization Techniques
Genetic Algorithm (Evolutionary Computation)
- Does not require derivatives, just an evaluation function (a fitness function)
- Samples the space widely, like an enumerative or random algorithm, but more efficiently
- Can search multiple peaks in parallel, so is less hampered by local extrema than gradient-based methods
- Crossover allows the combination of useful building blocks, or schemata (mutation avoids evolutionary dead-ends)
- Robust!

Evolutionary Computation History


- 1953: N. Barricelli's symbiogenesis experiments at the IAS
- 1957: Alex S. Fraser and Barricelli published independently on evolutionary computing; Fraser was already using diploidy, alleles, and crossover
- Late 1950s/early 1960s: influenced by Fraser, evolutionary biologists modeled evolution, though not as a general-purpose function optimization algorithm
- By 1962: G.E.P. Box, G.J. Friedman, W.W. Bledsoe, and H.J. Bremermann had independently developed evolution-inspired algorithms for optimization and machine learning, but the work received little follow-up
- Friedman (1959) attempted to evolve computer code, with no success, demonstrating that variation and selection can only work if variation is able to produce viable offspring

Evolutionary Computation History


- 1962: John Holland, at the Univ. of Michigan, began publishing and teaching on adaptive systems, and in 1965 first described the importance of crossover
- 1965: Ingo Rechenberg, in Berlin, devised the evolution strategy, but with no population or crossover (one parent was mutated to produce one offspring, and the better individual was kept)
- 1966: L.J. Fogel, A.J. Owens, and M.J. Walsh published the first book on evolutionary computation, Artificial Intelligence Through Simulated Evolution, introducing evolutionary programming, based on finite state machines; like evolution strategies, it kept the better of one parent and one offspring

Evolutionary Computation History


- 1970: A.S. Fraser and D. Burnell published the second book of the field, Computer Models in Genetics, capturing over a decade of work
- 1975: John Holland published Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence
  - Considered by most to be the seminal work in the field
  - Established a formal, theoretical basis for evolutionary optimization with the introduction of schemata (building blocks, partial solutions)

Evolutionary Computation History


- 1975: Kenneth De Jong's thesis, under Holland, introduced a set of test functions with distinct properties, demonstrating the broad applicability of GAs
- Much research ensues

Genetic Programming
- 1985: N.L. Cramer, "A Representation for the Adaptive Generation of Simple Sequential Programs"
- 1987: D. Dickmanns, J. Schmidhuber, and A. Winklhofer, Der genetische Algorithmus: Eine Implementierung in Prolog (The Genetic Algorithm: An Implementation in Prolog)
- 1987: C. Fujiki and J. Dickinson published on using GAs to evolve Lisp source code for solving the prisoner's dilemma problem
- 1988 (patent filed) / 1989 (publication): John Koza, at Stanford, introduced genetic programming, an application of evolutionary computing to tree representations of Lisp programs for solving engineering problems

Genetic Algorithm Pseudo-Code


- Choose initial population (usually random)
- Repeat (until terminated):
  - Evaluate each individual's fitness
  - Prune population (typically all; if not, then the worst)
  - Select pairs to mate from best-ranked individuals
  - Replenish population (using selected pairs)
    - Apply crossover operator
    - Apply mutation operator
  - Check for termination criteria (number of generations, amount of time, minimum fitness threshold satisfied, fitness has reached a plateau, other)
  - Loop, if not terminating
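A minimal, runnable sketch of this loop (illustrative, not the lecture's code; the OneMax fitness function, size-2 tournament selection, and the rate constants are stand-in choices):

```python
import random

# Illustrative constants; these values are stand-ins, not from the lecture.
GENOME_LEN, POP_SIZE, GENERATIONS = 16, 50, 100
CROSSOVER_RATE, MUTATION_RATE = 0.7, 1.0 / GENOME_LEN

def fitness(genome):
    return sum(genome)  # stand-in evaluation function: count of 1-bits ("OneMax")

def select(population):
    a, b = random.sample(population, 2)   # size-2 tournament
    return max(a, b, key=fitness)

def crossover(p1, p2):
    if random.random() < CROSSOVER_RATE:
        point = random.randrange(1, GENOME_LEN)   # single-point crossover
        return p1[:point] + p2[point:]
    return p1[:]

def mutate(genome):
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

# Choose initial population (random), then repeat until terminated.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    best = max(population, key=fitness)
    if fitness(best) == GENOME_LEN:       # termination: fitness threshold met
        break
    # Generational replacement: rebuild the whole population from selected parents.
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]
print(f"generation {gen}: best fitness {fitness(best)}")
```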

String-based GAs
Holland's original approach, and probably still the most common
- Genome (or chromosome) is a string of bits:

  1 1 0 1 0 0 1 0 1 1 0 1 1 1 0 0

- The generational cycle: evaluation → selection → crossover → mutation

Other Representations
The basic vary-and-select GA approach can be applied to almost any data structure:
- Trees (a tree genome is sketched below)
- Arrays
- Lists
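For instance, a tree genome might be encoded as nested tuples, as in genetic programming. This sketch (illustrative, not from the slides) builds and evaluates random arithmetic-expression trees over one variable x:

```python
import random

# Hypothetical tree genome: an arithmetic expression over one variable x,
# encoded as nested tuples (operator, left subtree, right subtree).
def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.uniform(-1.0, 1.0)
    op = random.choice(['+', '-', '*'])
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    return a + b if op == '+' else (a - b if op == '-' else a * b)

tree = random_tree()
print(tree, '->', evaluate(tree, 2.0))
```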

Fitness
Fitness is computed for each individual
To help maintain diversity or differentiate between similar individuals, raw objective scores are sometimes scaled to produce the final fitness scores:
- Rank - no scaling
- Linear scaling (fitness proportionate) - normalizes based on min and max fitness in the population
- Sigma truncation scaling - normalizes using the population mean and std. dev., truncating low-fitness individuals (sketched below)
- Sharing (similarity scaling) - reduces fitness for individuals that are similar to other individuals in the population
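A sketch of sigma truncation scaling (illustrative; the cutoff constant c = 2 is a conventional choice, not from the slides):

```python
import statistics

def sigma_truncation(raw_scores, c=2.0):
    # Shift each raw score by (mean - c * std. dev.), clipping negatives to
    # zero, so individuals far below the mean drop out of the mating pool.
    mean = statistics.mean(raw_scores)
    sigma = statistics.pstdev(raw_scores)
    return [max(0.0, f - (mean - c * sigma)) for f in raw_scores]

print(sigma_truncation([1.0, 2.0, 8.0, 9.0]))
```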

Selection
The selection scheme determines how individuals are chosen for mating, based on their fitness scores
Too great a bias towards the best individuals can result in premature convergence, so the best selection schemes are designed to maintain a diverse population
- Rank - pick the best individuals every time
- Roulette wheel (proportionate) - probability of selection is proportional to fitness (sketched below)
- Tournament - an initial large number are selected via roulette wheel, then the best-ranked are chosen
- Stochastic - various methods of replenishment of less fit stock (useful) or initial selection (not useful)
- Elite - in combination with other selection schemes, always keep the fittest individual around
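A sketch of roulette-wheel selection (illustrative; assumes non-negative fitness scores):

```python
import random

def roulette_select(population, fitnesses):
    # Fitness-proportionate selection: individual i is chosen with
    # probability fitnesses[i] / sum(fitnesses).
    pick = random.uniform(0.0, sum(fitnesses))
    running = 0.0
    for individual, f in zip(population, fitnesses):
        running += f
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point round-off
```

In practice, Python's standard library does the same in one call: random.choices(population, weights=fitnesses).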

Crossover
There are a number of techniques, but all involve swapping genes (sequences of bits in the strings) between two individuals (or between two strands of a diploid individual):

  Parent 1: 1 1 0 1 0 0 1 0 1 1 0 1 1 1 0 0
  Parent 2: 0 0 1 1 1 0 1 1 0 1 0 0 1 0 1 1
  Child:    1 1 0 1 1 0 1 1 0 1 0 1 1 1 0 0
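A sketch of single-point crossover on bit strings (illustrative; real GAs use many variants, e.g. two-point and uniform crossover):

```python
import random

def one_point_crossover(p1, p2):
    # Cut both parents at the same random point and swap tails,
    # yielding two complementary children.
    point = random.randrange(1, len(p1))
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

parent1 = [1, 1, 0, 1, 0, 0, 1, 0]
parent2 = [0, 0, 1, 1, 1, 0, 1, 1]
child1, child2 = one_point_crossover(parent1, parent2)
```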

Mutation
Generally, bits are flipped with a small probability This explores parts of the space that crossover might miss, and helps prevent premature convergence Before

1 1 0 1 1 0 1 1 0 1 0 1 1 1 0 0

1 1 0 1 1 0 1 0 0 1 0 1 1 1 0 0

After
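A sketch of bit-flip mutation (illustrative; the per-bit rate is a typical choice, often set near 1/genome-length):

```python
import random

def mutate(genome, rate=0.01):
    # Flip each bit independently with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

print(mutate([1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0]))
```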

Replacement
Simple or generational GAs replace the entire population, per the dictates of the selection scheme
Steady-state or online GAs use different replacement schemes:
- Replace worst (sketched below)
- Replace best
- Replace parent
- Replace random
- Replace most similar (crowding)
Only replace worst and replace most similar are generally very effective (when replace parent works, it is because the parents are similar)
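A sketch of a steady-state "replace worst" step (illustrative; requiring the child to be an improvement is a design choice, not mandated by the scheme):

```python
def replace_worst(population, fitnesses, child, child_fitness):
    # Steady-state step: overwrite the least-fit individual in place,
    # here only when the new child is an improvement.
    worst = min(range(len(population)), key=lambda i: fitnesses[i])
    if child_fitness > fitnesses[worst]:
        population[worst] = child
        fitnesses[worst] = child_fitness
```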

Genetic Algorithm Demo


http://neo.lcc.uma.es/TutorialEA/semEC/appsim/appsim.html

Defunct: http://ai.bpa.arizona.edu/~mramsey/ga.html

Other Interesting GA Links


http://www.econ.iastate.edu/tesfatsi/holland.GAIntro.htm
http://myducksoup.com/family/alex/davidfogel.shtml
http://www.talkorigins.org/faqs/genalg/genalg.html
http://www.hao.ucar.edu/public/research/si/pikaia/tutorial.html
http://lancet.mit.edu/~mbwall/presentations/IntroToGAs/
http://en.wikipedia.org/wiki/Genetic_algorithm
http://samizdat.mines.edu/ga_tutorial/
http://www.generation5.org/content/2000/ga.asp
http://gaul.sourceforge.net/intro.html
http://www.burns-stat.com/pages/Tutor/genetic.html
http://www.cs.bgu.ac.il/~sipper/ga.html
http://www.natural-selection.com/Library/2000/FogelFraser-CEC2000.pdf
