GSoC/GCI Archive
Google Summer of Code 2013

PaGMO / PyGMO

License: Academic Free License 3.0 (AFL 3.0)

Web Page: http://esa.github.io/pygmo/

Mailing List: mailto:pagmo-applicants@lists.sourceforge.net

PaGMO, together with its Pythonic alter ego PyGMO (the Python Parallel Global Multiobjective Optimizer), is a scientific library providing a large number of optimisation algorithms and problems under the same powerful parallelization abstraction, built around the generalized island-model paradigm. For the user this means that all available algorithms are automatically parallelized (asynchronously, following a coarse-grained approach), thus making efficient use of the underlying multi-core architecture. Users can also program their own solvers, and these too will be parallelized by PyGMO. PyGMO’s implementation of the generalized migration operator allows the user to easily define “migration paths” (topologies) between a large number of “islands” (CPU cores).
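The island-model idea above can be illustrated with a minimal, self-contained sketch in pure Python: several islands each evolve their own population on a toy problem, and the best individuals periodically migrate along a ring. All names and the simple (1+1)-style mutation scheme are illustrative assumptions, not PyGMO's actual API.

```python
# Minimal sketch of the coarse-grained island model: each island evolves
# its own population on a toy sphere problem, and the best individual
# periodically migrates to the next island along a ring topology.
# Illustration only -- not PyGMO's actual API.
import random

def sphere(x):
    """Toy objective: minimize the sum of squares."""
    return sum(xi * xi for xi in x)

def evolve(pop, generations=20):
    """Very simple (1+1)-style mutation step applied to each individual."""
    for _ in range(generations):
        for i, ind in enumerate(pop):
            child = [xi + random.gauss(0.0, 0.1) for xi in ind]
            if sphere(child) < sphere(ind):
                pop[i] = child
    return pop

def migrate(islands):
    """Ring migration: each island's best replaces the next island's worst."""
    bests = [min(pop, key=sphere) for pop in islands]
    for i, pop in enumerate(islands):
        worst = max(range(len(pop)), key=lambda j: sphere(pop[j]))
        pop[worst] = bests[i - 1]  # take the best from the previous island

random.seed(42)
dim, n_islands, pop_size = 5, 4, 10
islands = [[[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
           for _ in range(n_islands)]

for epoch in range(10):  # alternate local evolution and migration
    islands = [evolve(pop) for pop in islands]
    migrate(islands)

champion = min((min(pop, key=sphere) for pop in islands), key=sphere)
print(sphere(champion))
```

In PyGMO the evolution of each island runs asynchronously in its own thread or process; the sketch above serializes the epochs only to keep the example short.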

Efficient implementations of state-of-the-art bio-inspired algorithms sit alongside state-of-the-art classical optimization algorithms (simplex methods, SQP methods, etc.) and can be easily combined (also with your newly invented algorithms) to build a super-algorithm that exploits cooperation via the asynchronous, generalized island model.

Many complex-network topologies (hypercube, ring, Barabási-Albert, Watts-Strogatz, Erdős-Rényi, etc.) are built in and may be used to define the migration pathways of good solutions among islands. Custom topologies are also possible.
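A topology is, at its core, just a directed graph over the islands. As a hedged sketch (not PaGMO's internal representation), two of the built-in topologies named above can be expressed as plain adjacency maps:

```python
# Sketch: a migration topology as a directed graph, islands as nodes and
# migration paths as edges. Illustrative representation, not PaGMO internals.

def ring(n):
    """One-way ring: island i sends its migrants to island (i + 1) % n."""
    return {i: [(i + 1) % n] for i in range(n)}

def hypercube(dim):
    """Hypercube on 2**dim islands: neighbours differ in exactly one bit."""
    n = 1 << dim
    return {i: [i ^ (1 << b) for b in range(dim)] for i in range(n)}

print(ring(4))       # {0: [1], 1: [2], 2: [3], 3: [0]}
print(hypercube(2))  # {0: [1, 2], 1: [0, 3], 2: [3, 0], 3: [2, 1]}
```

Random topologies such as Barabási-Albert or Erdős-Rényi fit the same scheme, with the edge set drawn from the corresponding random-graph model (e.g. via NetworkX).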

PaGMO/PyGMO can be used to solve constrained, unconstrained, single-objective, multi-objective, continuous, and mixed-integer optimization problems, or to perform research on novel algorithms and paradigms, easily comparing them to state-of-the-art implementations of established ones.

PaGMO/PyGMO is interfaced with SciPy's optimization algorithms, NLopt algorithms, GSL algorithms, SNOPT, IPOPT and, hopefully, more to come. In Python, packages such as NetworkX and VPython extend its functionality with advanced visualization options.

The code is in use at the European Space Agency (ESA) and is mainly maintained and developed by the Advanced Concepts Team (ACT) at ESTEC.

 

Projects

  • Constraint Handling Techniques in Evolutionary Algorithms: Because of the stochastic behavior of evolutionary algorithms, some constraint-handling methods use empirical, user-defined parameters to take constraint violations into account during the optimization process. Unfortunately, these parameters can drive the optimizer to local optima and are usually effective only for a single problem. This proposal consists of implementing in PyGMO/PaGMO more or less sophisticated constraint-handling techniques that are problem independent, and of testing them against a set of typical problems. From an implementation point of view, most constraint-handling techniques can be seen as a modification of the initial constrained problem, algorithm, or population. Thus, the implementation will consist of redefining the initial problem/algorithm/population through the use of operators/modifiers.
  • Expanding the multi-objective capabilities of PaGMO: I will expand the multi-objective capabilities of PaGMO. At the moment, only two algorithms for multi-objective optimization are available (NSGA-II, IHS). The aim of this project is to extend the library of multi-objective optimization algorithms with algorithms such as MOEA/D, Multi-Objective Particle Swarm Optimization, the Niched Pareto Genetic Algorithm, the Strength Pareto Evolutionary Algorithm, and Pareto Adaptive ε-Dominance.
  • Hyperfast hypervolumes for Multi-objective optimization: This project involves the computation of hypervolume in higher-dimensional spaces for a set of hypercubes sharing one common point (a reference point that is strictly dominated by every point in the set). It is a geometrical interpretation of a quality measure for a Pareto set. It can be used to compare two populations of individuals, and as a fitness function during the evolution phase of a genetic algorithm. The first scenario allows for exact computation, since time is not as crucial a factor as in the second case (executing the hypervolume computation during the algorithm); this requires the implementation of both exact and approximate methods. Additionally, a few methods have been proposed that use the hypervolume as an input for the fitness measure. I intend to implement at least one of them during GSoC as well.
  • Interactive Cloud Optimisation via Google Talk (XMPP): The project goal is to create a cloud of PyGMO servers with XMPP as the transport layer. Users of the cloud would be able to control, send, execute, and monitor the execution of optimization tasks remotely, using Google Talk. The cloud itself would be able to balance the load across nodes and would take care of job-execution issues (timeouts, interrupts, data loss, etc.). An appropriate communication protocol would be created, handling both the communication between nodes and that between users and nodes.
  • Racing Population Individuals and Algorithms: The main objective of this project is two-fold: firstly, to enable individuals (and meta-heuristic algorithms) in PaGMO with stochastic quality indicators to be distinguished efficiently via racing algorithms; secondly, to implement interesting functionalities in PaGMO leveraging this racing capability. The most straightforward enhancement is to enable stochastic optimization problems to be solved more efficiently without much modification to the existing algorithms. A perhaps more interesting consequence of the racing module would be an in-house automatic parameter-configuration system for PaGMO optimizers / meta-heuristics. This could potentially mean that, in the future, algorithms implemented in PaGMO will no longer require (heavy) manual parameter tuning.
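To make the constraint-handling bullet concrete: the simplest problem-independent technique of the kind it describes is a static penalty, i.e. a meta-problem that wraps the constrained problem and exposes a penalized, unconstrained fitness. The sketch below is an assumption-laden illustration (the function names and the penalty weight are mine, not PaGMO defaults):

```python
# Sketch of a static-penalty meta-problem: wrap a constrained problem and
# expose f(x) plus a weighted sum of constraint violations. The weight of
# 1000.0 is an illustrative choice, not a PaGMO default.

def penalized(objective, inequalities, weight=1000.0):
    """Return a fitness f(x) + weight * sum of violations of g_i(x) <= 0."""
    def fitness(x):
        violation = sum(max(0.0, g(x)) for g in inequalities)
        return objective(x) + weight * violation
    return fitness

# Toy problem: minimize x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
f = penalized(lambda x: x * x, [lambda x: 1.0 - x])
print(f(2.0))  # feasible point: plain objective, 4.0
print(f(0.0))  # infeasible point: 0.0 + 1000 * 1 = 1000.0
```

Rewriting the technique as an operator on the problem, as here, is exactly why such methods can be applied to any solver in the library unchanged.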
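The hypervolume bullet's "exact computation" case has a particularly simple form in two objectives, which may help fix intuition: sort the non-dominated points and sum the rectangles they dominate up to the reference point. This is a pure-Python sketch assuming minimization of both objectives, not the project's higher-dimensional algorithms:

```python
# Sketch of exact 2-objective hypervolume: the area dominated by the
# Pareto front, bounded by a reference point worse than every front point.
# Both objectives are minimized.

def pareto_front(points):
    """Keep only points not dominated by any other point (minimization)."""
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1]
                       for q in points)]

def hypervolume_2d(points, ref):
    """Sum the rectangle each front point contributes, left to right."""
    front = sorted(pareto_front(points))
    volume, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        volume += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return volume

pts = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (3.0, 3.0)]  # last point dominated
print(hypervolume_2d(pts, ref=(4.0, 4.0)))  # 3*1 + 2*1 + 1*1 = 6.0
```

A larger hypervolume means a better front, which is what makes the measure usable both for comparing populations and as a fitness signal inside an evolutionary loop.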