ljvmiranda921 / pyswarms

A research toolkit for particle swarm optimization in Python
https://pyswarms.readthedocs.io/en/latest/
MIT License

Implement Constrained PSO Variants #8

Closed ljvmiranda921 closed 6 years ago

ljvmiranda921 commented 7 years ago

Hi! Thank you for checking out this issue!

This whole area is still pretty much open. I'm planning to give PySwarms four major optimization capabilities.

We've established some ground on single-objective continuous optimization (with the standard implementations of global-best and local-best PSO), but we haven't done anything yet for constrained optimization. Would you like to give us a head start?

These are the steps that will be undertaken to close this issue:

  1. Creating an abstract class in the pyswarms.base module. This will provide a skeleton for how other implementations of the same optimization nature should be written. Take, for example, how global-best (pyswarms.single.gbest) and local-best (pyswarms.single.lbest) inherit from the class SwarmBase (the abstract class for single-objective continuous optimization).

  2. Implementing a standard PSO algorithm that inherits from the abstract class written in Step 1. This means implementing a particular constrained PSO algorithm on top of the base class.

  3. Writing unit tests and use-case examples. The examples show how the proposed skeleton and algorithm will be used by the end-user, and the unit tests check its robustness (please check the tests directory).
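Step 1 could be sketched roughly as follows. This is only an illustration of the idea: the class name `ConstrainedSwarmBase` and its method signatures are assumptions for discussion, not the actual PySwarms API.

```python
# Hypothetical sketch of Step 1: an abstract base class for constrained PSO,
# mirroring the role SwarmBase plays for the single-objective optimizers.
# All names here are illustrative assumptions, not the real PySwarms API.
import abc

import numpy as np


class ConstrainedSwarmBase(abc.ABC):
    """Skeleton base class that concrete constrained PSO variants inherit."""

    def __init__(self, n_particles, dims, options):
        self.n_particles = n_particles
        self.dims = dims
        self.options = options
        # Random initial positions; concrete subclasses may override this.
        self.pos = np.random.uniform(-1.0, 1.0, (n_particles, dims))

    @abc.abstractmethod
    def run(self, n_iters, objective_func, constraints):
        """Concrete variants implement the main optimization loop here."""
        raise NotImplementedError
```

A concrete algorithm from Step 2 (say, a Deb-style PSO) would then subclass this and implement `run`, just as GlobalBestPSO implements the loop defined by SwarmBase.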

As you can see, these steps ask for a lot. Right now, we're keeping this at low priority because I am currently writing the abstract classes for the other PSO variants. If you want to be a super-contributor, then go ahead and do all the steps above. 👍 😄 But I believe it would be much better if I set up a basis first and then we iterate from there.

But perhaps the best way to contribute to this issue would be the following (note that these contributions don't require pull requests or git commits):

That's it for this issue!! For any questions, just drop a comment here!

luisdelatorre012 commented 7 years ago

Saw this on the Python subreddit's "what are you working on this week" thread. I'm interested in contributing to this. Is there a specific constrained PSO algorithm that you want to implement? A quick search of the literature shows a few different implementations, and I'm happy to suggest some that look promising. I don't have a ton of experience with nature-inspired algorithms, but I have a pretty good grasp (no pun intended) of metaheuristics.

ljvmiranda921 commented 7 years ago

Hey @luisdelatorre012 , I have updated the description for this issue. Check the comment above! Currently, contributions for this issue won't require a pull request. But I like your idea! If there are promising implementations of constrained PSO in literature, please comment them here with the corresponding research title and link (it's okay if it's behind a paywall!). Then from that we can review which ones to prioritize.

I do appreciate your initiative! Thank you so much!

luisdelatorre012 commented 7 years ago

Thanks for the update. Constrained problems are huge for me (my work is in operations research, and I solve problems with a lot of business-related side constraints). I'll see what I find on constrained PSO.

ljvmiranda921 commented 7 years ago

Hi @luisdelatorre012 , how's this going? 😄

luisdelatorre012 commented 7 years ago

Sorry it's been a while, but I'm still interested in this!

I've spent some time getting acquainted with pyswarms and used it for solving some continuous and binary bound-constrained problems. I'm just now looking into constraint handling. I'll see if this paper gives some promising leads: https://link.springer.com/article/10.1007/s00521-014-1808-5

I'll keep you posted.

ljvmiranda921 commented 7 years ago

Awesome! I'll read through this!

How did you find PySwarms? As of now, we don't release very often, so I think it's better to test the bleeding-edge version by cloning this repository and running python setup.py install

Okay sure! Thanks a lot for your help @luisdelatorre012 !

luisdelatorre012 commented 7 years ago

I'm finding the package easy to use. Appreciate the good use of docstrings.

The review paper doesn't do any computational tests comparing the methods it reviews. The scientific thing to do would be to run such a comparison, but that is a lot of work, and it's not obvious that any one of the reviewed methods is much better than the others. I'm inclined to start by implementing the method the paper identifies as most popular (what they refer to as Deb's approach).

Here's the earliest paper they cite that uses this approach: https://link.springer.com/chapter/10.1007/978-3-540-88636-5_43 The constrained PSO described in this paper looks straightforward to implement.
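For context, Deb's approach boils down to three feasibility rules for comparing two candidate solutions: a feasible solution beats an infeasible one; between two feasible solutions, the better objective value wins; between two infeasible solutions, the smaller total constraint violation wins. A minimal sketch (function names are illustrative, not part of PySwarms; only inequality constraints of the form g(x) <= 0 are handled for brevity — equality constraints are typically folded in via |h(x)| minus a tolerance):

```python
# Sketch of Deb's feasibility rules for constraint handling in PSO.
# Names are hypothetical; constraints are callables g with g(x) <= 0 feasible.

def total_violation(x, ineq_constraints):
    """Sum of violations over all g(x) <= 0 constraints (0 means feasible)."""
    return sum(max(0.0, g(x)) for g in ineq_constraints)


def deb_better(x_a, x_b, objective_func, ineq_constraints):
    """Return True if x_a is preferred over x_b under Deb's rules."""
    va = total_violation(x_a, ineq_constraints)
    vb = total_violation(x_b, ineq_constraints)
    if va == 0 and vb == 0:
        # Rule 2: both feasible -> compare objective values.
        return objective_func(x_a) < objective_func(x_b)
    if va == 0 or vb == 0:
        # Rule 1: exactly one is feasible -> the feasible one wins.
        return va == 0
    # Rule 3: both infeasible -> smaller total violation wins.
    return va < vb
```

In a PSO loop, this comparison would replace the plain objective comparison when updating personal and global bests, which is part of what makes the approach attractive: no penalty coefficients to tune.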

Is your plan to first create a constrained base class analogous to SwarmBase and DiscreteSwarmBase, or to create a class that inherits from one of these two (similar to GlobalBestPSO and LocalBestPSO but for constrained problems)?

ljvmiranda921 commented 7 years ago

Hi @luisdelatorre012 ,

I trust your judgment, and we can start with the constrained PSO by Deb et al. In the long run, it's also possible to implement the other variants described in the review paper, but let's start with the one you suggested.

My plan is to create a base class ConstrainedSwarmBase similar to SwarmBase and DiscreteSwarmBase, from which all other constrained PSO implementations will inherit. For this part, I'd appreciate your opinion on how we can structure it.

One of my concerns right now is what kind of data structure best represents an objective function with constraints. For example, say we have the objective function "Rosenbrock function constrained with a cubic and a line" with the following equation:

f(x, y) = (1 - x)^2 + 100(y - x^2)^2 with constraints (x - 1)^3 - y + 1 < 0 and x + y - 2 < 0

Option 1 We can define the constraints c_1 and c_2 separately, together with the objective function f, as methods, then pass them to a constrained PSO optimizer, say, DebPSO:

# Define optimizer
optimizer = DebPSO(n_particles=100, dims=10, options=options)
# Run optimization
optimizer.run(n_iters=1000, objective_func=f, constraints=(c_1, c_2))
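For illustration, the pieces of Option 1 could be written as plain functions. DebPSO is hypothetical at this point, but f, c_1, and c_2 are evaluable on their own; treating the constraints in the usual g(p) <= 0 form, the known constrained optimum of this test problem sits on the boundary at (1, 1) with f = 0:

```python
# The constrained Rosenbrock example written as ordinary Python functions.
# A sketch of what a user would pass to a (hypothetical) DebPSO optimizer.

def f(p):
    """Rosenbrock objective, p = (x, y)."""
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2


def c_1(p):
    """Cubic constraint (x - 1)^3 - y + 1 <= 0; returns the left-hand side."""
    x, y = p
    return (x - 1) ** 3 - y + 1


def c_2(p):
    """Linear constraint x + y - 2 <= 0; returns the left-hand side."""
    x, y = p
    return x + y - 2
```

Note that each constraint returns its left-hand-side value rather than a boolean, so an optimizer can measure how badly a point violates it, not just whether it does.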

What do you think?

luisdelatorre012 commented 7 years ago

Assuming that you want to be able to express any problem of the form

min f(x) s.t. g(x) <= 0, h(x) = 0

your suggestion seems like the right way to do it. I'd separate equality and inequality constraints:

# Define optimizer
optimizer = DebPSO(n_particles=100, dims=10, options=options)
# Run optimization
optimizer.run(n_iters=1000, objective_func=f, ineq_constraints=(g_1, g_2), eq_constraints=(h_1, h_2))

You could allow only inequality constraints and require h(x) = 0 to be written as h(x) <= 0 and -h(x) <= 0, but this seems unnecessarily complicated for the user.
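That rewrite is mechanical enough that it could live inside the optimizer instead of burdening the user: accept both kinds in the API, and normalize internally. A sketch, assuming hypothetical helper names:

```python
# Hypothetical internal helpers: accept eq and ineq constraints separately
# at the API level, but normalize everything to g(x) <= 0 form internally.

def eq_to_ineq(h):
    """Rewrite h(x) = 0 as the pair h(x) <= 0 and -h(x) <= 0."""
    return (lambda x: h(x), lambda x: -h(x))


def as_ineq_constraints(ineq_constraints=(), eq_constraints=()):
    """Collect everything into a single tuple of g(x) <= 0 constraints."""
    gs = list(ineq_constraints)
    for h in eq_constraints:
        gs.extend(eq_to_ineq(h))
    return tuple(gs)
```

This keeps the user-facing signature friendly while letting the constraint-handling core deal with only one constraint form.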

What other options are there? You could define a class that holds the objective function and constraints, but that wouldn't be consistent with how you defined the other problem base classes. You could also have a single vector-valued function that evaluates all of the constraints, but I don't see a lot of upside to this.

Thoughts? Other options?

ljvmiranda921 commented 7 years ago

Hi @luisdelatorre012 , sorry for my very late reply. I was actually also considering a class that holds both the objective function and constraints, but it might be a bit overkill (in addition to what you said). Yes, let's start with your suggestion.

I'm just wondering why we need to separate the equality and inequality constraints. I'm not sure whether this is generally difficult for other Python users (so I've heard), but in my functional programming experience, I've found lambda expressions very helpful for defining constraints.

So if we have h(x) = 0 and g(x) <= 0 as constraints, I can define:

h = lambda x: x == 0
g = lambda x: x <= 0

and so I can evaluate a number or vector,

>>> import numpy as np
>>> input = np.array([0,1,2,3])
>>> h(input)
array([True, False, False, False], dtype=bool)
>>> g(input)
array([True, False, False, False], dtype=bool)

And thus for my input, we can just lump all of them together in a tuple:

# Define optimizer
optimizer = DebPSO(n_particles=100, dims=10, options=options)
# Run optimization
optimizer.run(n_iters=1000, objective_func=f, constraints=(h, g))
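One caveat with boolean-valued lambdas like these: x == 0 will almost never hold exactly for floating-point particle positions, and a True/False answer also discards how large the violation is, which schemes like Deb's rules need. Returning the left-hand-side value and checking it against a tolerance keeps both pieces of information. A sketch, with an illustrative tolerance:

```python
# Value-returning constraints checked against a tolerance, instead of
# boolean lambdas. TOL is an illustrative choice, not a PySwarms constant.
TOL = 1e-6

h = lambda x: x - 2.0  # equality constraint h(x) = 0, i.e. x = 2
g = lambda x: x - 2.0  # inequality constraint g(x) <= 0, i.e. x <= 2


def eq_satisfied(h, x, tol=TOL):
    """h(x) = 0 up to floating-point tolerance."""
    return abs(h(x)) <= tol


def ineq_satisfied(g, x, tol=TOL):
    """g(x) <= 0 up to floating-point tolerance."""
    return g(x) <= tol
```

The magnitude of h(x) or max(0, g(x)) then doubles as a violation measure for ranking infeasible particles.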

ljvmiranda921 commented 7 years ago

I forgot that the general form for constrained optimization has the equality and inequality constraints separated. Pardon me for asking in the previous comment. For now, I'd prefer normal methods in place of lambda expressions, just to stay coherent with Python's "only one obvious way to do it" principle.

ljvmiranda921 commented 6 years ago

I'll just reopen this once there's a slew of activity happening in this issue again. Thank you for all your input, @luisdelatorre012 !