ljvmiranda921 / pyswarms

A research toolkit for particle swarm optimization in Python
https://pyswarms.readthedocs.io/en/latest/
MIT License

Implemented options to hotstart training #499

Open LucasWaelti opened 2 years ago

LucasWaelti commented 2 years ago

Description

This PR implements an option to provide the swarm position, velocity, and global best (or any particle's position) to the optimizer. This was done by adding the optional arguments init_pos, init_vel, and init_best to the optimizer classes:

The arguments were added to GlobalBestPSO, GeneralOptimizerPSO and LocalBestPSO as well as to the abstract base classes SwarmOptimizer and DiscreteSwarmOptimizer.

The optimizer can now be called as follows:

# Call an instance of PSO with the new hot-start arguments
import pyswarms

optimizer = pyswarms.single.GlobalBestPSO(
                    n_particles=100, dimensions=dim, options=options, bounds=bounds,
                    init_pos=init_pos, init_vel=init_vel, init_best=init_best)

Related Issue

Motivation and Context

The current implementation does not allow a previous optimization run to be properly resumed. This PR makes that possible while remaining fully backward compatible with previous versions.
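As a minimal sketch of the resume workflow this enables, assuming the init_pos/init_vel/init_best arguments introduced by this PR and reading the final state from the optimizer's swarm attributes (position, velocity, best_pos); the objective function here is just a placeholder:

import numpy as np
import pyswarms as ps

def sphere(x):
    # Placeholder objective: x has shape (n_particles, dimensions),
    # returns one cost per particle
    return np.sum(x ** 2, axis=1)

options = {'c1': 0.5, 'c2': 0.3, 'w': 0.9}

# First run, stopped after a limited number of iterations
optimizer = ps.single.GlobalBestPSO(n_particles=100, dimensions=10, options=options)
optimizer.optimize(sphere, iters=50)

# Capture the final swarm state
init_pos = optimizer.swarm.position
init_vel = optimizer.swarm.velocity
init_best = optimizer.swarm.best_pos

# Second run, hot-started from the saved state (arguments added by this PR)
resumed = ps.single.GlobalBestPSO(
    n_particles=100, dimensions=10, options=options,
    init_pos=init_pos, init_vel=init_vel, init_best=init_best)
resumed.optimize(sphere, iters=50)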

How Has This Been Tested?

This has been tested in a limited manner so far, using only GlobalBestPSO on Ubuntu 18.04 with Python 3.6.

Screenshots (if appropriate):

Types of changes

Checklist: