How is it possible that gamultiobj gives a worse solution when MaxGenerations is raised?
4 views (last 30 days)
Steffen Kuehl
on 14 Nov 2016
Commented: Steffen Kuehl
on 16 Nov 2016
Hi,
My multiobjective optimization model (gamultiobj) does not get continuously better when I raise MaxGenerations. Sometimes a solution is worse than another solution that was generated with the same parameters but a smaller value of MaxGenerations. Since gamultiobj is a variant of NSGA-II (I can't find anything more specific in the MATLAB documentation), shouldn't the algorithm's elitism ensure that good solutions found in early generations are kept until the end, so that it is not possible to get a worse solution when more generations are calculated?
I use the same initial population for all my runs. I created my own creation, crossover and mutation functions. Since the elitism is supposed to be located in the selection function, I don't think that creating my own functions should interfere with it.
I appreciate any ideas and help. Thanks, Steffen
And to clarify, here is some of my code.
These are the options for the algorithm.
options = optimoptions('gamultiobj',...
'UseParallel', true,...
'UseVectorized', false,...
'CreationFcn',@popFun,...
'CrossoverFcn',@crossoverBinary,...
'MutationFcn',@mutFun,...
'PopulationSize',InitPop,...
'MaxStallGenerations',MaxStallG,...
'MaxGenerations',MaxGamultiobj)
This is the function call. A and B are the linear inequality constraints.
[X,fval,exitflag] = gamultiobj(objFun,n,A,B,[],[],[],[],[],options);
This is my creation function.
function Population = popFun(GenomeLength,~,options)
% Custom creation function: build an initial binary population in which
% every individual satisfies the linear inequality constraints A*x <= B.
global Zusatz
Population = zeros(options.PopulationSize,GenomeLength);
rng('default');
rng(Zusatz);                      % seed with a user-chosen value
intcon = 1:GenomeLength;          % all variables are integer (binary)
A = xlsread('Linearconstraints.xlsx','A2:MD86');
B = ones(85,1);
B(36,1) = -1;
lb = zeros(1,GenomeLength);
ub = ones(1,GenomeLength);
opts = optimoptions('intlinprog','IntegerTolerance',1e-06,'Display','off');
for a = 1:options.PopulationSize
    % A random objective yields a different feasible binary point each pass
    f = randn(GenomeLength,1);
    x = intlinprog(f,intcon,A,B,[],[],lb,ub,opts);
    Population(a,:) = x';
end
end
This is the crossover function.
function xoverKids = crossoverBinary(parents,options,GenomeLength,~,~,thisPopulation)
% Custom single-point crossover that only returns children satisfying the
% linear inequality constraints.
global Zusatz
rng('default');
rng(Zusatz);
% Extract information about the linear constraints, if any
linCon = options.LinearConstr;
nKids = length(parents)/2;
index = 1;
xoverKids = nan(nKids,GenomeLength);
for k = 1:nKids
    % Get the parents from the population
    parent1 = thisPopulation(parents(index),:);
    index = index + 1;
    parent2 = thisPopulation(parents(index),:);
    index = index + 1;
    % Pick a random single crossover point
    idx = randi(GenomeLength,1);
    % Take the first part from parent1 and the rest from parent2
    xoverKids(k,1:idx) = parent1(1:idx);
    xoverKids(k,idx+1:end) = parent2(idx+1:end);
    % Ensure that the kid satisfies the constraints; otherwise redraw the
    % crossover point until it does
    flag = any(linCon.Aineq*(xoverKids(k,:)') > linCon.bineq,1);
    while flag % Does not satisfy constraints
        idx = randi(GenomeLength,1);
        xoverKids(k,1:idx) = parent1(1:idx);
        xoverKids(k,idx+1:end) = parent2(idx+1:end);
        flag = any(linCon.Aineq*(xoverKids(k,:)') > linCon.bineq,1);
    end
end
end
And the mutation function.
function mutationChildren = mutFun(parents, options, GenomeLength, ...
    ~, ~, ~, thisPopulation)
% Custom mutation function: flip one randomly chosen bit of the parent and
% redraw the bit until the mutated child satisfies the linear constraints.
global Zusatz
linCon = options.LinearConstr;
rng('default');
rng(Zusatz);
% Initialize the output
mutationChildren = nan(length(parents),GenomeLength);
for k = 1:length(parents)
mut = thisPopulation(parents(k),:)';
idx = randi(GenomeLength,1);
mutated = mut;
if mutated(idx)==1
mutated(idx)=0;
else
mutated(idx)=1;
end
% Check that constraints are satisfied.
flag = any(linCon.Aineq*mutated > linCon.bineq,1);
while flag
idx = randi(GenomeLength,1);
mutated = mut;
if mutated(idx)==1
mutated(idx)=0;
else
mutated(idx)=1;
end
% Check that constraints are satisfied.
flag = any(linCon.Aineq*mutated > linCon.bineq,1);
end
mutationChildren(k,:) = mutated';
end
end
5 Comments
Accepted Answer
Brendan Hamm
on 15 Nov 2016
OK. So there is an additional change you make here that was not mentioned and that is affecting the output. You change MaxStallGenerations, which caused one of the runs to execute more generations and thereby produce "better" results. While they may have been better with respect to some of the objective functions, they were not necessarily better with respect to the others.
The basic idea is that if the average change in the best objective evaluations between the current generation and the generation that occurred MaxStallGenerations ago is less than FunctionTolerance (1e-4 by default), the algorithm terminates.
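For example, one way to keep that termination rule fixed is to set both options explicitly in the options object from the question (a sketch; the value 100 is arbitrary):
options = optimoptions(options, ...
    'MaxStallGenerations',100, ...    % keep this fixed between runs
    'FunctionTolerance',1e-4, ...     % the default tolerance, stated explicitly
    'MaxGenerations',MaxGamultiobj);  % the only setting that should vary between runs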
So, I have a few suggestions:
- Keep the MaxStallGenerations the same between different runs.
- There is no need to change the rng in each function; in fact, I would discourage it. Instead, get the 4th output of gamultiobj, which contains the state of the RNG at the start of the algorithm's run (see the sketch after this list). See the link for information on this.
- Increase the population size. For a problem with this many variables I would consider a population on the order of 500 individuals; this may provide you with a large enough "elite" population to carry over to the next generation.
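A minimal sketch of the rng suggestion (the call mirrors the code in the question; output is the 4th output of gamultiobj, whose rngstate field records the generator state just before the run started):
[X,fval,exitflag,output] = gamultiobj(objFun,n,A,B,[],[],[],[],[],options);
% Later, to reproduce exactly the same run, restore the recorded state:
stream = RandStream.getGlobalStream;
stream.State = output.rngstate.State;
[X2,fval2] = gamultiobj(objFun,n,A,B,[],[],[],[],[],options);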
4 Comments
Brendan Hamm
on 16 Nov 2016
I am referring to setting PopulationSize to a scalar value of 500. The sub-population size refers to the migration options, which are not actually implemented. Presumably this is something that has been considered for a future release (I am speculating and have no actual knowledge of such a plan), but currently the option does nothing except, apparently, give you an error :).
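In other words, something like this (a one-line sketch that modifies the options object from the question):
options = optimoptions(options,'PopulationSize',500);  % a single scalar population size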
More Answers (2)
John D'Errico
on 14 Nov 2016
This is a stochastic solver. It generates points using random methods. Moreover, ANY numerical optimization tool can only find an approximate solution; it does not find an exact solution.
So there is no presumption that one optimization with a stochastic optimizer will always give as good a solution as another call to the same optimizer. Yes, by allowing it more iterations you increase the chance that it will be able to improve, but it is only a better chance, not an assurance of success.
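As an illustrative sketch (objFun, n, A, B and the options object are assumed from the question, and the seed value 0 is arbitrary): even when two runs start from the same random state, the run with more generations is more likely, but not guaranteed, to return a better front.
rng(0);  % same starting state for both runs
optsShort = optimoptions(options,'MaxGenerations',100);
[~,fvalShort] = gamultiobj(objFun,n,A,B,[],[],[],[],[],optsShort);
rng(0);
optsLong = optimoptions(options,'MaxGenerations',500);
[~,fvalLong] = gamultiobj(objFun,n,A,B,[],[],[],[],[],optsLong);
% Compare, for example, the best value reached for each objective:
disp(min(fvalShort,[],1));
disp(min(fvalLong,[],1));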
Walter Roberson
on 14 Nov 2016
One aspect that is not always obvious is that a later configuration, with a larger (less optimal) objective function value, might involve a smaller constraint violation. The magnitude of the constraint violation is compared as well as the objective function value.
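A quick sketch of that check (A, B and the returned Pareto set X are assumed from the question): compute the worst violation of the linear inequalities for each returned point and compare it alongside fval.
% Worst violation of A*x <= B for each row of X (values <= 0 mean feasible)
violation = max(bsxfun(@minus, A*X.', B), [], 1);
disp(max(violation));  % overall worst violation across the Pareto set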