Lecture 10 (GA)


EPCE6202 Intelligent Control

and Its Applications


(Genetic Algorithm)

ASTU
School of EEC

10.1 What are EAs?


• Evolutionary algorithms (EAs) are a family of
population-based problem solvers with a stochastic
character.
• In evolutionary computation, an initial set of
candidate solutions is generated and then iteratively
updated: each new generation is produced by
stochastically removing the less fit solutions and
introducing small random changes.
• The iteration continues until a termination
criterion is met.

Contd
• Ant colony optimization (ACO)
A probabilistic algorithm for solving computational
problems inspired by the pheromone-based
communication behavior of real ants (M. Dorigo,
1992).
• Evolution strategies (ESs)
use natural problem-dependent representations and rely
primarily on mutation and selection as search
operators (I. Rechenberg, H. Schwefel et al., 1970s).

Contd
• Particle swarm optimization (PSO)
A computational method that iteratively improves
candidate solutions with regard to a given measure of
quality, based on the flocking behavior of animals
(Kennedy and Eberhart, 1995).
• Simulated annealing (SA)
A probabilistic technique inspired by annealing in
metallurgy, a process of heating and controlled cooling
of a material to increase the size of its crystals and
reduce their defects (Kirkpatrick, Gelatt Jr., Vecchi, 1983).
• Genetic algorithm (GA)
The most popular type of EA (J. Holland, 1975).

10.2 What are GAs?
• A global search algorithm based on
  • Natural evolution - the fittest individuals survive
  • Genetics - organisms produce offspring via sexual mating
  • Natural mutation - sometimes benefits organisms
• Developed by J. Holland in 1975

Contd
Basic structure of GA
k= 0;
Compute an initial population P(0);
Evaluate the fitness;
WHILE <stopping condition not fulfilled> DO BEGIN
k= k+1;
Select individuals for reproduction;
Create offspring by crossover;
Mutate some individuals;
Evaluate the fitness;
END
Output the best solution found so far
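
A minimal MATLAB sketch of the loop above, assuming a binary-encoded maximization problem whose fitness is simply the number of 1-bits; the population size, rates, and stopping rule are illustrative choices, not values from the lecture:

% Sketch of the basic GA structure for a binary-encoded
% maximization problem (illustrative fitness: number of 1-bits)
N= 20; nbits= 10; Pc= 0.9; Pm= 0.01; maxGen= 50;
P= rand(N,nbits) > 0.5;             % initial population P(0)
fit= sum(P,2);                      % evaluate the fitness
for k= 1:maxGen                     % stopping condition: generation limit
    % Selection: roulette wheel on the current fitness values
    cumProb= cumsum(fit/sum(fit));
    idx= zeros(N,1);
    for i= 1:N
        idx(i)= find(rand <= cumProb, 1);
    end
    P= P(idx,:);
    % Crossover: one-point, applied pairwise with probability Pc
    for i= 1:2:N-1
        if rand < Pc
            c= randi(nbits-1);
            tmp= P(i,c+1:end);
            P(i,c+1:end)= P(i+1,c+1:end);
            P(i+1,c+1:end)= tmp;
        end
    end
    % Mutation: flip each bit independently with probability Pm
    mask= rand(N,nbits) < Pm;
    P(mask)= ~P(mask);
    fit= sum(P,2);                  % re-evaluate the fitness
end
[bestFit, best]= max(fit);          % best solution found so far
P(best,:)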

Contd
[GA flowchart: Initialize population → Fitness evaluation → Converged? — yes: Output results and Stop; no: Selection → Crossover → Mutation → back to Fitness evaluation. The population is shown as binary strings such as (1100010110) and (0110100101).]

Contd
Differences between traditional and evolutionary methods

Traditional methods                      Evolutionary methods
Single solution                          A set of potential solutions (population)
Derivative of the objective function     Some mechanism for searching and escaping local optima
Local optima                             Global optima
Deterministic results                    Stochastic results

10.3 Chromosome
Cell and chromosome
• A human cell contains 23 pairs of chromosomes made
of protein and DNA.
• DNA, passed from parents to offspring, contains the
specific instructions that make each human unique.

Contd
1. Binary encoding - the chromosome is represented by a
string of bits, 0 or 1 (a decoding sketch follows this list)
   s = (1010 0100 1111)   with segments x1, x2, x3

2. Real number encoding - the chromosome is represented
as a real vector
   s = (0.2, -0.4, 0.7)

3. Symbolic encoding - the chromosome is represented by a
string of symbolic integers (e.g. job scheduling)
   s = (4 1 2 7 5 3 6)
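
As an aside on binary encoding, the sketch below decodes the example chromosome into three real variables, assuming 4 bits per variable mapped linearly onto an interval [lo, hi]; both the bit split and the interval are illustrative assumptions, not values from the lecture:

% Sketch: decode s = (1010 0100 1111) into three real variables,
% assuming 4 bits per variable mapped linearly onto [lo, hi]
s= [1 0 1 0  0 1 0 0  1 1 1 1];
nbits= 4; lo= -5; hi= 5;
genes= reshape(s, nbits, []).';          % one row per variable
ints= genes*2.^(nbits-1:-1:0).';         % binary to integer (0..15)
x= lo + (hi - lo)*ints/(2^nbits - 1)     % x1, x2, x3 as real values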

10.4 Population Initialization
[The GA flowchart repeated, with the initial population shown as binary chromosomes such as (1100010110) and (0110100101).]

Contd
[The GA flowchart repeated, with the initial population shown as real-valued chromosomes such as (4.3, 2.1, 0.5) and (0.1, 3.3, 1.1).]

Contd
[The GA flowchart repeated, with the initial population shown as symbolic (permutation) chromosomes such as (42637185) and (46231578).]

Contd
The initial population can be generated randomly or
heuristically.
1. Random method - gene values are chosen randomly
from the allowed set of values.

2. Heuristic method - used to bias the initial population
toward good solutions when prior knowledge about the
search space is available.

[Figure: populations produced by the random method (left) and the heuristic method (right)]
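
A one-line MATLAB sketch of the random method for a real-valued population; N, nvar, and the bounds are illustrative assumptions (implicit expansion requires MATLAB R2016b or later):

% Sketch: random initialization of a real-valued population,
% each gene drawn uniformly between its lower and upper bound
N= 6; nvar= 3;
LB= [0 0 0]; UB= [10 10 10];
P0= LB + (UB - LB).*rand(N,nvar)    % N chromosomes, one per row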

10.5 Fitness Evaluation
• Decides how 'good' a solution is with respect to the
problem. The fitness function is used to guide the search
toward the optimal solution.
• The fitness F is calculated from the objective function f(x).
• Maximization problem:
  max f(x)  ⇔  max F(x),  where F(x) = f(x) + c1 ≥ 0
• Minimization problem:
  min f(x)  ⇔  max F(x),  where F(x) = −f(x) + c2 ≥ 0  or
  F(x) = 1/[f(x) + c3] ≥ 0

Contd
For example, min f(x) = 0.15(x − 1)² − 1

[Figure, plotted for −2 ≤ x ≤ 6: a) f(x); b) F(x) = −f(x) + c2 with c2 = 2.75; c) F(x) = 1/[f(x) + c3] with c3 = 1.1]
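
A small MATLAB sketch of the two minimization-to-fitness transformations above, using f(x) = 0.15(x − 1)² − 1 and the constants c2 = 2.75 and c3 = 1.1 from the figure:

% Sketch: convert the minimization objective into nonnegative fitness
f= @(x) 0.15*(x - 1).^2 - 1;
c2= 2.75; c3= 1.1;
F1= @(x) -f(x) + c2;              % F(x) = -f(x) + c2 >= 0
F2= @(x) 1./(f(x) + c3);          % F(x) = 1/[f(x) + c3] >= 0
x= -2:1:6;
[f(x); F1(x); F2(x)]              % compare the three on a grid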

10.6 Genetic Operator: Selection
• Mimics natural selection: better individuals get a
higher chance of survival
• Chances are proportional to fitness
• The most frequently used selection operators:
- Roulette wheel (assign each individual a slice of the
roulette wheel proportional to its fitness)
- Tournament
- Remainder stochastic sampling with replacement
- Ranking-based
- Affine (proposed by Prof. Jin)

Contd
Roulette wheel method

P(0)                 Fitness   Psel
(4.3, 2.1, 0.5)      14.0      31%
(0.1, 3.3, 1.1)       2.3       5%
(6.2, 9.1, 0.6)      16.2      36%
(3.5, 6.6, 8.1)       5.4      12%
(6.2, 7.2, 3.2)       6.3      14%
(4.2, 5.5, 8.0)       0.9       2%
Total                45.0     100%

Selected population P(1):
(4.3, 2.1, 0.5), (6.2, 7.2, 3.2), (6.2, 9.1, 0.6), (3.5, 6.6, 8.1), (6.2, 9.1, 0.6), (4.3, 2.1, 0.5)
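
A short MATLAB sketch of roulette-wheel selection using the fitness values from the table above; the random draws mean the selected set varies from run to run:

% Sketch: roulette-wheel selection, one wheel spin per selected parent
fit= [14.0 2.3 16.2 5.4 6.3 0.9];
Psel= fit/sum(fit);                   % selection probabilities (31%, 5%, ...)
cumProb= cumsum(Psel);                % wheel sectors on [0, 1]
N= numel(fit);
idx= zeros(1,N);
for i= 1:N
    idx(i)= find(rand <= cumProb, 1); % spin the wheel once
end
idx                                   % indices of the selected parents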

10.6 Genetic Operator: Crossover
• Imitates sexual reproduction in nature
• Produces offspring from two parents, exchanging
information between chromosomes
• Crossover takes place with a probability Pc ∈ [0, 1]
• There are various operators depending on the
chromosome encoding

Contd
Binary encoding
a. One-point xover (crossover point after bit 2)
   s1 = (00 | 11011)   →   s1′ = (00 01100)
   s2 = (11 | 01100)   →   s2′ = (11 11011)

b. Two-point xover (crossover points after bits 4 and 6)
   s1 = (0011 | 01 | 1)   →   s1′ = (0011 10 1)
   s2 = (1101 | 10 | 0)   →   s2′ = (1101 01 0)
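
A short MATLAB sketch of one-point crossover reproducing case (a) above; the crossover point is fixed at bit 2 here only to match the example, whereas a GA would draw it at random:

% Sketch: one-point crossover of two binary parents
s1= [0 0 1 1 0 1 1];
s2= [1 1 0 1 1 0 0];
c= 2;                                   % crossover point (randi(6) in a GA)
o1= [s1(1:c) s2(c+1:end)]               % offspring (0 0 0 1 1 0 0)
o2= [s2(1:c) s1(c+1:end)]               % offspring (1 1 1 1 0 1 1)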

Contd
Real number encoding
a. Arithmetic xover
- Two parents produce two offspring. The offspring
are arithmetically produced by
   Parents:   x = (x1, x2, x3),    y = (y1, y2, y3)
   Offspring: x′ = (x1′, x2′, x3′),  y′ = (y1′, y2′, y3′)
   xi′ = α·yi + (1 − α)·xi
   yi′ = α·xi + (1 − α)·yi
where α ∈ [0, 1) is a uniform random number and
may vary from chromosome to chromosome.

Contd
For example, the offspring produced arithmetically from the
two parents
   x = (2.3, -4.1, 7.4)
   y = (-1.8, 5.9, 3.2)
with α = 0.23 are
   x′ = (1.357, -1.800, 6.434)
   y′ = (-0.857, 3.600, 4.166)

10.6 Genetic Operator: Mutation
• Mutation alters one or more gene values in a chromosome.
• Mutation alters each gene independently with a
mutation probability Pm.
• There are various operators depending on the
chromosome encoding.

Contd
Binary encoding
a. Simple mutation
• One or more bits are selected at random and flipped
(0 changed to 1, 1 changed to 0) with a probability
given by the mutation rate.

   s1 = (1 1 0 1 0 0 1 0)   →   s1′ = (1 1 0 1 0 1 1 0)

• This mutation is used for binary-encoded GAs.
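
A short MATLAB sketch of simple (bit-flip) mutation; Pm = 0.1 is an illustrative value chosen only so that a flip is likely to be visible in a single run:

% Sketch: flip each bit independently with probability Pm
s= [1 1 0 1 0 0 1 0];
Pm= 0.1;
mask= rand(size(s)) < Pm;          % which bits mutate
sm= xor(s, mask)                   % mutated chromosome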

Contd
Real number encoding
a. Uniform mutation
• A gene is selected at random and its value is replaced
by a random value generated between its lower and
upper bounds.
         x1     x2    x3    x4
   s1 = (1.37, -2.04, 0.06, 0.17),   -5.0 ≤ x3 ≤ 5.0
   s1′ = (1.37, -2.04, 2.92, 0.17)

Contd
b. Non-uniform mutation
   x = (x1, …, xi, …, xn)   →   x′ = (x1, …, xi′, …, xn)

   xi′ = xi + Δ(k, xi(U) − xi)   if q = 0
   xi′ = xi − Δ(k, xi − xi(L))   if q = 1

   where Δ(k, y) = y·[1 − r^((1 − k/T)^b)]

   xi(U), xi(L): upper and lower bounds of xi
   q: random digit (0 or 1)
   r: uniform random number, r ∈ [0, 1)
   b: parameter (b = 5)
   T: maximum generation number
   k: current generation number
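
A short MATLAB sketch of non-uniform mutation for a single gene, following the Δ(k, y) formula above; the chromosome, bounds, b, T, and k are illustrative values:

% Sketch: non-uniform mutation of one real-valued gene
x= [1.37 -2.04 0.06 0.17];          % current chromosome
i= 3; xL= -5; xU= 5;                % gene to mutate and its bounds
b= 5; T= 100; k= 20;                % parameter, max and current generation
r= rand;                            % uniform random number in [0, 1)
q= randi([0 1]);                    % random digit
Delta= @(y) y*(1 - r^((1 - k/T)^b));
if q == 0
    x(i)= x(i) + Delta(xU - x(i));  % move toward the upper bound
else
    x(i)= x(i) - Delta(x(i) - xL);  % move toward the lower bound
end
x                                   % mutated chromosome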

10.7 Parameter Settings
Population size (N)
• A large N increases genetic diversity at the expense of
more computation (N = 20-200)
Crossover rate (Pc)
• Controls the crossover frequency (Pc = 80-100%)
• A large Pc can destroy good schemata
Mutation rate (Pm)
• Controls the mutation frequency (Pm = 0.5-1%)
• A large Pm increases population diversity but can
destroy good schemata

10.8 MATLAB GA Function

ga function
   x = ga(fun,nvars)
   x = ga(fun,nvars,A,b,Aeq,beq,lb,ub,nonlcon,options)
   [x,fval] = ga(___)
where fun: function handle, nvars: number of variables,
      x: solution, fval: function value at x
min f(x)
s.t.  A·x ≤ b         (linear inequality constraints)
      Aeq·x = beq     (linear equality constraints)
      C(x) ≤ 0        (nonlinear inequality constraints)
      Ceq(x) = 0      (nonlinear equality constraints)
      lb ≤ x ≤ ub     (bounds)

Contd
Example 1: Consider Rastrigin's function

   Minimize f(x) = 10n + Σ_{i=1}^{n} [xi² − 10·cos(2π·xi)]

The search area is restricted to the hypercube −5.12 ≤ xi ≤ 5.12
(i = 1, 2, …, n). Its global minimum value is f(x) = 0 at xi = 0
(i = 1, 2, …, n).

Write a MATLAB program that finds the solution when n = 5.

Contd
% Find the minimum of Rastrigin's function using a GA
rng default % Initialize the random number generator
options= optimoptions('ga');
options.PlotFcn= 'gaplotbestf';
options.PopulationSize= 50;
options.MaxGenerations= 5000;
nvar= 5;
LB= -5.12*ones(1,nvar);
UB= 5.12*ones(1,nvar);
[x,fval]= ga(@fun,nvar,[],[],[],[],LB,UB,[],options)

Contd
function f= fun(x)
n= length(x);
f= 10*n+sum(x.^2-10*cos(2*pi*x));
end

f(x) = 10n + Σ_{i=1}^{n} [xi² − 10·cos(2π·xi)]

Contd
Optimization terminated: average change in the fitness
value less than options.FunctionTolerance.
x=
1.0e-04 *
0.1379 0.0268 0.0530 0.1317 -0.0863
fval =
9.3899e-08

Contd
Example 2: Consider the PID control system shown below, where
r = 1 for t > 0 and the plant parameters are k = 1, T = 5 and L = 2.

[Figure: unity-feedback loop with a PID controller Gc(s) and the plant Gp(s) = k·e^(−Ls)/(Ts + 1)]

Write a program that tunes the gains of the PID controller by
minimizing the IAE performance criterion

   IAE = ∫₀⁴⁰ |r − y| dt

Contd
% Tune the gains of the PID controller using a GA
rng default % Initialize the random number generator
options= optimoptions('ga');
options.PlotFcn= 'gaplotbestf';
options.PopulationSize= 30;
options.MaxGenerations= 100;
nvar= 3;
LB= [0 0 0];
UB= [15 15 15];
[x,fval]= ga(@EvalObj,nvar,[],[],[],[],LB,UB,[],options)

Contd
% Measure the performance of the PID control system
% for setpoint tracking
function J= EvalObj(x)
% Retrieve PID controller parameters
Kp= x(1); Ki= x(2); Kd= x(3);

s= tf('s');
Gc= Kp+Ki/s+Kd*s; % PID controller TF
K= 1; T= 5; L= 2;
Gp= K*exp(-L*s)/(T*s+1); % Plant TF
% Overall TF from yr to y
G= feedback(Gp*Gc,1);

Contd
% Unit step response for 40 seconds
h= 0.01; t= 0:h:40; r= 1;
y= step(G,t);

% Calculate IAE using the trapezoidal integration rule


e= abs(r-y); % IAE
% e= (r-y).^2; % ISE
% e= t'.*abs(r-y); % ITAE (t must be transposed)
J= 0.5*h*(e(1)+2*sum(e(2:end-1))+e(end));
end

Contd
Optimization terminated: maximum number of generations
exceeded.
x=
2.2557 0.3857 1.8355
fval =
    2.7229        (fitness value)

Q&A

