5. Metaheuristics#
In this chapter, we’ll cover how to solve optimization problems using metaheuristics: methods that search for good solutions without relying on gradient information. Depending on the actual metaheuristic, different models can be solved, but in general metaheuristics are more versatile than gradient-based methods.
Model#
The two methods covered in this chapter will be applied to our most general nonlinear constrained optimization problem as defined before in (3.1).
Method#
Two different methods will be treated: `scipy.optimize.differential_evolution` and the genetic algorithm from `pymoo.optimize`. There are many different methods available in many programming languages; these two methods are only an example of what you could use.
Scipy#
`scipy` has implemented, among others, the differential evolution method [Storn and Price, 1997]. The documentation of this function is available here: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.differential_evolution.html [The SciPy community, 2024]. In this course we’ll cover only the relevant parts.
We need to run at least `scipy.optimize.differential_evolution(func, x0, bounds, constraints, integrality, strategy, popsize, mutation, recombination, ...)` with:
- `func`, the function representing the objective function \(f\left(x\right)\) to be minimized. `func` is a callable. The `scipy.optimize.differential_evolution` function takes care of defining and inputting our design variable \(x\).
- `x0`, the initial guess for our design variable \(x\). It needs to be a `ndarray` with length \(n\).
- `bounds`, a sequence of \(n\) `(min, max)` pairs, one for each element in \(x\), defining the minimum \(x_i^l\) and maximum \(x_i^u\) values of that design variable. For this function `None` cannot be used to specify no bound: the values `min` and `max` need to be finite.
- `constraints`, a single constraint object or a list of constraint objects defined in the same way as in nonlinear constrained optimization with:
  - `scipy.optimize.LinearConstraint`
  - `scipy.optimize.NonlinearConstraint`
- `integrality`, a boolean `ndarray` specifying for each of the \(n\) design variables whether it is constrained to integer values.
- `strategy`, an optional `string` argument which allows you to select another differential evolution strategy. The default is `'best1bin'`.
- `popsize`, an optional `integer` argument, a multiplier used to set the total population size.
- `mutation`, an optional `float` or `tuple(float, float)` argument which allows you to set the mutation constant.
- `recombination`, an optional `float` argument which allows you to set the recombination constant.
Please note that there are even more options to adapt the optimization algorithm.
The function `scipy.optimize.differential_evolution` outputs a `scipy.optimize.OptimizeResult` object, similar to the one from `scipy.optimize.minimize` explained for unconstrained optimization.
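As an illustration, here is a minimal sketch of such a call. The objective, constraint, bounds, and seed are made up for this example and are not part of the documentation above.

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, differential_evolution

# Hypothetical objective: minimize f(x) = (x0 - 1)^2 + (x1 - 2)^2
def func(x):
    return (x[0] - 1) ** 2 + (x[1] - 2) ** 2

# Hypothetical inequality constraint x0 + x1 <= 2, written as -inf <= x0 + x1 <= 2
con = NonlinearConstraint(lambda x: x[0] + x[1], -np.inf, 2)

result = differential_evolution(func,
                                bounds=[(-5, 5), (-5, 5)],  # finite (min, max) pair per variable
                                constraints=con,
                                seed=42)  # fix the random seed for reproducibility
print(result.x, result.fun)  # solution and objective value from the OptimizeResult object
```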
pymoo#
`pymoo` has implemented, among others, the genetic algorithm. The documentation of this algorithm, although very limited, is available here: https://pymoo.org/ [Blank and Deb, 2020]. Again, we’ll cover only the relevant parts. Make sure you install pymoo as explained in the section Add other packages and in the documentation of pymoo.
The main function we need is `pymoo.optimize.minimize(problem, algorithm, ...)` with:
- `problem`, a pymoo object containing the problem
- `algorithm`, a pymoo object containing the method
This results in an object `pymoo.core.result.Result` with:

- `Result.X`, the solution found
- `Result.F`, the value of the objective function for the solution
- `Result.G`, the value of the constraint functions for the solution
- `Result.exec_time`, the time required to run the algorithm
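As a quick end-to-end sketch, the call below runs the genetic algorithm on one of pymoo’s built-in test problems (the sphere function, assumed here to be available via `get_problem` just so the snippet runs standalone; defining your own problem is covered next):

```python
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.optimize import minimize
from pymoo.problems import get_problem

# Built-in unconstrained test problem, used here only for illustration
problem = get_problem("sphere")

res = minimize(problem, GA(pop_size=50), seed=1, verbose=False)
print(res.X)          # the solution found
print(res.F)          # objective value at the solution
print(res.exec_time)  # runtime in seconds
```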
The problem needs to be defined in an object, therefore we’ll use `pymoo.problems.functional.FunctionalProblem(n_var, objs, constr_ieq=[], constr_eq=[], xl, xu, ...)` with:
- `n_var`, the number of design variables \(n\); this needs to be an `int`.
- `objs`, the objective function \(f\) to be minimized; this needs to be a `callable`.
- `constr_ieq`, the list of \(m\) inequality constraint functions \(g\); this needs to be a list of `callable`s.
- `constr_eq`, the list of equality constraint functions \(h\); this needs to be a list of `callable`s.
- `xl`, a `float` or `ndarray` of length \(n\) representing the lower bounds of the design variables.
- `xu`, a `float` or `ndarray` of length \(n\) representing the upper bounds of the design variables.
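For example, the hypothetical objective and constraint from the scipy sketch above could be wrapped as follows (all values are again made up for illustration):

```python
import numpy as np
from pymoo.problems.functional import FunctionalProblem

# Hypothetical problem: minimize f(x) = (x0 - 1)^2 + (x1 - 2)^2
# subject to x0 + x1 <= 2; pymoo expects inequality constraints as g(x) <= 0
problem = FunctionalProblem(n_var=2,
                            objs=[lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2],
                            constr_ieq=[lambda x: x[0] + x[1] - 2],
                            xl=-5 * np.ones(2),  # lower bound for each design variable
                            xu=5 * np.ones(2))   # upper bound for each design variable
```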
As a method, we’ll use the genetic algorithm. This is stored in the object `pymoo.algorithms.soo.nonconvex.ga.GA(pop_size=100, sampling=<pymoo.operators.sampling.rnd.FloatRandomSampling object>, selection=<pymoo.operators.selection.tournament.TournamentSelection object>, crossover=<pymoo.operators.crossover.sbx.SBX object>, mutation=<pymoo.operators.mutation.pm.PM object>, survival=<pymoo.algorithms.soo.nonconvex.ga.FitnessSurvival object>, ...)` with:
- `pop_size`, an `int` defining the size of the population.
- `sampling`, a pymoo object defining how sampling should happen. If you want to solve integer problems, the input must be `pymoo.operators.sampling.rnd.IntegerRandomSampling()`.
- `selection`, a pymoo object defining how selection should happen.
- `crossover`, a pymoo object defining how crossover should happen. If you want to solve integer problems, the input must be `pymoo.operators.crossover.sbx.SBX(repair=pymoo.operators.repair.rounding.RoundingRepair())`.
- `mutation`, a pymoo object defining how mutation should happen. If you want to solve integer problems, the input must be `pymoo.operators.mutation.pm.PM(repair=pymoo.operators.repair.rounding.RoundingRepair())`.
- `survival`, a pymoo object defining how survival should happen.
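Putting this together, a minimal sketch for an integer problem could look as follows. The objective, bounds, population size, and seed are assumptions for illustration only:

```python
import numpy as np
from pymoo.algorithms.soo.nonconvex.ga import GA
from pymoo.operators.crossover.sbx import SBX
from pymoo.operators.mutation.pm import PM
from pymoo.operators.repair.rounding import RoundingRepair
from pymoo.operators.sampling.rnd import IntegerRandomSampling
from pymoo.optimize import minimize
from pymoo.problems.functional import FunctionalProblem

# Hypothetical integer problem: minimize (x0 - 3)^2 + (x1 - 5)^2 with 0 <= x <= 10
problem = FunctionalProblem(n_var=2,
                            objs=[lambda x: (x[0] - 3) ** 2 + (x[1] - 5) ** 2],
                            xl=np.zeros(2),
                            xu=10 * np.ones(2))

# Genetic algorithm whose operators keep all candidate solutions integer-valued
algorithm = GA(pop_size=50,
               sampling=IntegerRandomSampling(),
               crossover=SBX(repair=RoundingRepair()),
               mutation=PM(repair=RoundingRepair()))

res = minimize(problem, algorithm, seed=1, verbose=False)
print(res.X, res.F)  # solution and objective value from the Result object
```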