SimonBlanke / Gradient-Free-Optimizers

Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
https://simonblanke.github.io/gradient-free-optimizers-documentation
MIT License

add support for continuous parameter ranges #40

Open SimonBlanke opened 1 year ago

SimonBlanke commented 1 year ago

In this issue I will track the progress of adding support for continuous parameter ranges in the search-space.

For most optimization algorithms it should be easy to add support for continuous parameter ranges.

So in conclusion: Adding support for continuous search-spaces should be possible with reasonable effort.

The next problem to discuss is how this will be integrated into the current API. It is important to me that the API design stays simple and intuitive. Also: It would be very interesting if the search-space could have discrete parameter ranges in some dimensions and continuous ones in others.

The current search-space looks something like this:

import numpy as np

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}

What would a continuous dimension look like? It cannot be a numpy array, and it should be clearly distinguishable from a discrete dimension. Maybe a tuple:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}
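
One way the two kinds of dimensions could be told apart is by type: numpy arrays stay discrete, tuples are treated as continuous bounds. A minimal sketch of a sampler along those lines (a hypothetical helper, not library code):

import numpy as np

def sample_position(search_space, rng=None):
    # pick one value per dimension, dispatching on the dimension type
    rng = np.random.default_rng() if rng is None else rng
    position = {}
    for name, dim in search_space.items():
        if isinstance(dim, tuple):  # continuous: (low, high) bounds
            low, high = dim
            position[name] = rng.uniform(low, high)
        else:  # discrete: numpy array of allowed values
            position[name] = rng.choice(dim)
    return position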

I will brainstorm some ideas and write some prototype code to get a clear vision for this feature.

logan-dunbar commented 10 months ago

Hi @SimonBlanke, I was also looking for continuous parameter ranges. Perhaps you could look at the Gymnasium Spaces code for ideas on handling discrete/continuous dimensions (it's one of the standard Reinforcement Learning toolkits). The Box space has low and high attributes which specify the bounds. Also, the use of a generic vector for the bounds alleviates the need to name each dimension as in your current implementation.
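
For reference, a Gymnasium Box space with vector bounds looks roughly like this (illustrative values):

import numpy as np
from gymnasium.spaces import Box

# two continuous dimensions; the bounds are given as one vector per side
space = Box(low=np.array([-1.0, -100.0]), high=np.array([1.0, 100.0]))
point = space.sample()  # random point inside the bounds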

SimonBlanke commented 8 months ago

Hello @logan-dunbar,

sorry for this very late answer. I read your comment and looked into the link you provided, but answering you somehow fell off my radar.

Using low and high is more verbose, but it might not be necessary. In this example:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}

The low and high values (in parameter x3) are like positional arguments of a function, so it is very similar to the numpy call in x1 and x2. It is intuitive that the first one is low and the second one is high. I would consider using the solution you provided if there are additional parameters for a continuous dimension in the future. That would look like this:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": {
        "low": -1,
        "high": 1,
        "additional parameter 1": ...,
        "additional parameter 2": ...,
    },
}

In this case the naming improves the readability, because putting multiple values into a tuple gets confusing at some point.
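
For instance, one plausible additional parameter would be a sampling scale (purely hypothetical, not an existing option):

search_space = {
    "x3": {
        "low": 1e-5,
        "high": 1e-1,
        "scale": "log",  # hypothetical: sample on a logarithmic scale
    },
}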


You also wrote: "the use of a generic vector for the bounds alleviates the need to name each dimension as in your current implementation."

What would this look like? I guess you mean the way the dimensions are accessed in the objective-function.
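
A minimal sketch of that access pattern (the objective-function receives a dictionary keyed by dimension name):

def objective_function(para):
    # each dimension is looked up by its name
    return -(para["x1"] ** 2 + para["x2"] ** 2)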

The names for the dimensions are generic in this example and look like they could just be indices of a vector. But if you want to do something like hyperparameter-optimization, the search-space looks very different, and the names of the dimensions help readability.
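
For example, a hyperparameter search-space could look like this (illustrative parameter names):

import numpy as np

search_space = {
    "n_estimators": np.arange(10, 200, 10),
    "max_depth": np.arange(2, 12),
    "min_samples_split": np.arange(2, 22),
}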

mxv001 commented 7 months ago

Just wanted to chime in here. Perhaps what @logan-dunbar is asking for is a multi-dimensional parameter declaration. Something like:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    # hypothetical: declare a whole array of parameters in one entry,
    # e.g. with shape (4, 2); not valid in the current API
    "x3": np.empty((4, 2)),
}

For example, in the nevergrad package there is an interface for parameter arrays: see https://facebookresearch.github.io/nevergrad/parametrization.html

The explicit API they expose is:

import nevergrad

search_space = nevergrad.p.Dict(
    param_array=nevergrad.p.Array(shape=(2, 4)).set_bounds(0, 2),
)

Perhaps this is a different topic from the original post. If it is, I can open it as a new feature request. It would be a great feature! Many models have arrays of parameters, and it can be a pain to fold/unfold everything.
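
For context, the fold/unfold workaround with the current flat API would be something like this (hypothetical dimension names):

import numpy as np

# flatten a (4, 2) parameter array into individually named dimensions
search_space = {
    f"w_{i}_{j}": np.arange(0.0, 2.01, 0.1)
    for i in range(4)
    for j in range(2)
}

def objective_function(para):
    # refold the named dimensions back into a (4, 2) array
    w = np.array([[para[f"w_{i}_{j}"] for j in range(2)] for i in range(4)])
    return -np.sum(w ** 2)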

SimonBlanke commented 7 months ago

Hello @mxv001,

thanks for your suggestion. I looked into the nevergrad package. The interface you have shown is somewhat related to this issue, because it also enables continuous parameter ranges. But it is also a much broader topic because of the multi-dimensional parameter declaration.

I would suggest that you open another issue (feature request).

I am not sure if a "nevergrad-style" search-space creation will find its way into gradient-free-optimizers (I like to keep the API very simple), but I think it would be valuable to discuss it.