LION 5

January 17-21

Rome


Motivation: exploiting information

• First: we have to gain information! – Real and virtual exploration of the search space (DoE, GA, metamodels)
• Local search – Finding accurate candidate solutions (gradient-based algorithms, SQP, scalarization)
• Maintaining diversity – Exploring interesting regions of the objective space (elitism, ε-constrained method)
• How can we reduce computational costs? – Managing available resources (steady-state evolution, parallelization, archiving, metamodel training)


Building Blocks (I): GA

• Standard operators (mutation, crossover, selection), variable-wise encoding, non-dominated sorting, crowding distance.
• We focus on managing elitism:
  – We allow population-size increments if the set of non-dominated points grows.
  – We use controlled elitism otherwise.
• Steady-state evolution:
  – We want to avoid idle computational resources.

(Comic from xkcd.com)
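The crowding distance mentioned above can be sketched as follows: a minimal NSGA-II-style version, not the authors' code. Points with a larger distance lie in less crowded regions of the objective space and are preferred when truncating the population.

```python
def crowding_distance(front):
    """front: list of objective vectors (tuples of equal length).
    Returns one distance per point; boundary points get infinity."""
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        # Sort the front along objective k.
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        # Extreme points are always kept.
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        # Interior points accumulate the normalized gap between neighbors.
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / (hi - lo)
    return dist
```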


Building Blocks (II): SQP

The idea is to use first-order derivatives (plus the BFGS update formula) to iteratively solve a quadratic local model of the original problem.

We rely on the filter method to ensure global convergence. Using an adaptive filter lets us handle constraints with different lengthscales.

From R. Fletcher, S. Leyffer, P. Toint, "A brief history of filter methods", Argonne NL Preprint ('06)
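A much-simplified sketch of the filter acceptance idea (the full method in Fletcher, Leyffer and Toint adds margins and sufficient-reduction tests; the function names here are hypothetical): a trial point with constraint violation h and objective f is acceptable if no stored (h, f) pair dominates it.

```python
def acceptable(filter_entries, h, f):
    """Accept (h, f) if no filter entry has both smaller violation and
    smaller objective (i.e. no entry dominates the trial point)."""
    return all(not (h_i <= h and f_i <= f) for h_i, f_i in filter_entries)

def add_to_filter(filter_entries, h, f):
    """Insert (h, f), discarding entries it dominates."""
    kept = [(h_i, f_i) for h_i, f_i in filter_entries
            if not (h <= h_i and f <= f_i)]
    kept.append((h, f))
    return kept
```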


Building Blocks (III): Hybridization

• The SQP solver acts as a new operator, starting only from non-dominated points with a prescribed probability.
• A modified ε-constrained method transforms the original, possibly multiobjective, problem into a single-objective one to be solved with AFilterSQP.

From X. Hu, Z. Huang and Z. Wang, "Hybridization of the Multi-Objective Evolutionary Algorithms and the Gradient-based Algorithms", in Proc. of the 2003 Congress on Evolutionary Computation (CEC'2003)
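The ε-constrained transformation can be illustrated as follows; the helper name and the way the bounds are supplied are assumptions, since the slides do not spell out the modified variant. One objective is kept, and each of the others becomes an inequality constraint f_i(x) ≤ ε_i.

```python
def epsilon_constrained(objectives, keep, eps):
    """objectives: list of callables f_i(x); keep: index of the retained
    objective; eps: dict {i: bound} for the other objectives.
    Returns (scalar objective, list of constraints in g(x) <= 0 form)."""
    f = objectives[keep]
    constraints = [
        # Default args freeze each f_i and bound at definition time.
        (lambda x, fi=objectives[i], e=eps[i]: fi(x) - e)
        for i in eps
    ]
    return f, constraints
```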

• The scalarization weights are chosen at random.

• If the starting point violates any constraint, we first try to recover feasibility.


Building Blocks (IV): Metamodels

Radial basis function and polynomial SVD techniques build surrogate models as a sum of many contributions. By the chain rule, the same models can be used to approximate the derivatives of the functions involved. At each gradient request we pick a local training set and compute the derivatives without additional evaluations.

The initial DoE provides the first training set.
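How a surrogate can serve gradients for free can be sketched with a Gaussian RBF model (an illustrative kernel choice; the slides do not fix one). The interpolant is a sum of basis contributions, so its gradient follows analytically from the chain rule, and a "gradient request" costs no extra function evaluations.

```python
import numpy as np

def fit_rbf(X, y, gamma=1.0):
    """Interpolation weights w solve Phi w = y,
    with Phi_ij = exp(-gamma * ||x_i - x_j||^2)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.linalg.solve(np.exp(-gamma * d2), y)

def rbf_eval_and_grad(x, X, w, gamma=1.0):
    """Surrogate value and analytic gradient at x (chain rule on the kernel)."""
    diff = x - X                                  # (n, dim)
    phi = np.exp(-gamma * (diff ** 2).sum(-1))    # (n,) basis values
    value = phi @ w
    grad = (-2.0 * gamma * diff * (phi * w)[:, None]).sum(0)
    return value, grad
```

At the training points the model interpolates the data exactly, and the analytic gradient agrees with finite differences to high accuracy.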


Putting it all together

[Flowchart: Initial population, Parent population, Genetic operators, SQP run, Generation archive, Sorting/Elitism]


• The parent population produces a child each time a free computing resource is found.


• Standard genetic operators transform parents into children, which are stored in an archive.


• The SQP solver starts from a non-dominated point and uses the archive for metamodel training. If the run is successful, the child point immediately replaces its parent.


• When a predetermined number of "genetic" children has been produced, the whole archive (genetic + SQP) is sorted and a new parent population is extracted.
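The generation loop described above can be sketched roughly as follows. All helper callables are hypothetical stand-ins, and the real scheme runs steady-state and in parallel rather than in this sequential form.

```python
import random

def steady_state_generation(parents, make_child, nondominated, sqp_improve,
                            extract, children_per_generation, sqp_prob=0.1):
    archive = []
    # Genetic operators produce children, stored in the archive.
    for _ in range(children_per_generation):
        archive.append(make_child(parents))
    # SQP acts as an extra operator on non-dominated parents only.
    for i, p in enumerate(parents):
        if nondominated(p, parents) and random.random() < sqp_prob:
            improved = sqp_improve(p)
            if improved is not None:
                parents[i] = improved        # replaces its parent immediately
                archive.append(improved)
    # Sort the whole archive (genetic + SQP) and extract the new parents.
    return extract(parents + archive, len(parents))
```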

Parallelization

• The parallelization follows the master-slave approach.
• The master runs the algorithm using different threads (we use Java) and distributes design evaluations among the slaves.
• Computing-intensive routines are avoided when possible:
  – The search for the nearest points for metamodel training is approximated.
  – The non-dominated sorting and crowding-distance operators are crucial and therefore performed exactly.

[Diagram: MASTER (runs MetaHybrid) distributes design evaluations to SLAVE 01, SLAVE 02, SLAVE 03, …]
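An illustrative Python analogue of the master-slave scheme (the authors use Java threads; the names here are hypothetical). The master submits design evaluations to a pool of workers and collects each result as soon as its worker finishes, which is what keeps resources from sitting idle in the steady-state evolution.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def evaluate_design(design):
    # Stand-in for a (possibly expensive) simulation run on a slave node.
    return sum(v * v for v in design)

def master(designs, n_slaves=3):
    results = {}
    with ThreadPoolExecutor(max_workers=n_slaves) as pool:
        futures = {pool.submit(evaluate_design, d): d for d in designs}
        # Handle whichever evaluation finishes first, not submission order.
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```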

Benchmark Tests

Although the algorithm is designed to deal with engineering problems, a first round of mathematical benchmark tests is mandatory.

The results are interesting, especially for constrained problems. However, the appropriate field of applicability for this algorithm still has to be determined more precisely.

Future work will focus on this task rather than on fine-tuning the algorithm to obtain great results on idealized problems.


[Benchmark result plots]

Conclusions

• The proposed algorithm is a good trade-off between standard hybrid SQP+GA algorithms (too many evaluations spent on gradients) and pure GAs (possibly weak local search).
• Metamodels allow the SQP solver both to capture the global structure of the problem and to converge to the nearest Pareto point.
• The modified ε-constrained method helps in maintaining diversity and in handling constraints efficiently.
• Managing elitism is worth the effort.
• Parallelization, the use of threads, and steady-state evolution have been combined in an efficient scheme.
• We are ready for testing on real-world problems.


Questions?


Questions? I have some…

• Is it possible to improve metamodel accuracy without requiring more information?
• Is it possible to control (and hence exploit) the relation between training sets and global/local accuracy?
• Could self-adaptive parameters help? In particular, is it possible to tune the SQP operator probability depending on the smoothness of the problem? Maybe using metamodel validation data…
• …
