facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

Set up step size in nevergrad #1604

Open Siddharth8 opened 3 months ago

Siddharth8 commented 3 months ago

I am using nevergrad for optimization and have several hyperparameters to tune. I define one of them as x = ng.p.Scalar(lower=0, upper=1), and nevergrad samples any value between 0 and 1. However, I want the parameter to move in a fixed step size of 0.1, so the value is always one of 0, 0.1, 0.2, 0.3 ... 1. I tried looking into how to achieve this in nevergrad but had no success. Can anyone please help me set up this step size?
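As far as I can tell, `ng.p.Scalar` does not take a step-size argument directly, but there are a few common workarounds. One is to keep the parameter continuous and snap it to the 0.1 grid inside the objective before using it; the `snap` helper below is my own illustrative code, not part of nevergrad. Other options (also hedged sketches, based on nevergrad's parametrization API) are noted in the comments: optimizing an integer scalar 0..10 via `set_integer_casting()` and dividing by 10, or passing the 11 allowed values to `ng.p.TransitionChoice`, which treats them as an ordered discrete set.

```python
# Workaround sketch: quantize a continuous parameter to a 0.1 grid
# inside the objective, so any nevergrad optimizer can be used unchanged.

def snap(x, step=0.1):
    """Round a continuous value to the nearest multiple of `step`."""
    n_steps = round(x / step)
    # The outer round() cleans up float noise such as 0.30000000000000004.
    return round(n_steps * step, 10)

print([snap(v) for v in (0.04, 0.26, 0.99)])  # [0.0, 0.3, 1.0]

# Alternative sketches using nevergrad's parametrization (assuming
# `import nevergrad as ng`):
#
#   # Integer scalar 0..10, rescaled to 0..1 in steps of 0.1:
#   x = ng.p.Scalar(lower=0, upper=10).set_integer_casting()
#   # ...then use x / 10 inside the objective.
#
#   # Ordered discrete set of the 11 allowed values:
#   x = ng.p.TransitionChoice([i / 10 for i in range(11)])
```

With the `snap` approach the optimizer still explores a continuous space, so nearby candidates can collapse onto the same grid point; the `TransitionChoice` route makes the discreteness explicit to the optimizer instead.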