CMA-ES / pycma

Python implementation of CMA-ES

Optimizing a 7-parameter black-box function #282

Open revanth-s opened 16 hours ago

revanth-s commented 16 hours ago

Hi,

I am working on optimizing a 7-parameter black-box function and have a few questions I would really appreciate your help with. I have listed them below:

  1. Parameter Scaling

    • My parameters have different upper and lower bounds, and some of them originally spanned several orders of magnitude. I converted those parameters to a linear scale by taking their logarithm and passed the log-scaled values to CMA-ES.
      log_parameter_bounds = [
          (-15, -10),
          (-17, -10),
          (2, 3.5),
          (0, 5),
          (1.5, 3),
          (-12, -6),
          (-13, -7),
      ]
    • Is it generally recommended to scale these parameters before optimization, for example, using StandardScaler()?
  2. Population Size

    • After browsing through the documentation and reading some papers, I decided to use a population size (popsize) greater than the number of parameters, so I chose popsize = 8.
    • What should the ideal value of popsize be?
  3. Parallel Population Search

    • From my understanding, each member of the population searches a different area (independently of the others). Can each of these searches be performed in parallel, in a separate process, using Python's multiprocessing module?
  4. Relationship between popsize and sigma0

    • I have noticed that increasing popsize also requires me to lower sigma0 for my 7-parameter function to avoid crashing.
    • Is there a relationship between popsize and sigma0?
  5. Finding Multiple Local Minima

    • My 7-parameter function has multiple local minima, and I want to find as many of these minima as possible.
    • Therefore, I want to restart CMA-ES every time the algorithm gets stuck in a minimum, but I have found it hard to choose the right termination conditions for stopping and restarting the algorithm.
    • I tried using tolupsigma, but it never terminated the optimization except when set to 1, in which case the algorithm terminated within 2-3 iterations.
    • Currently, I am using tolstagnation = 2 and still facing the same problem.

Thank you in advance for your assistance.

LazyLysistrata commented 13 hours ago

I'll answer a couple of your questions from my own experience; if I am mistaken, the BBOBies can hopefully correct my mistakes and false assumptions! :)

Scaling the parameter limits means that all parameters will, in a sense, be treated more equally by CMA-ES. For example, you could rewrite your objective function so that all parameters lie between 0 and 1 inclusive. (I prefer -5 to +5 for all parameters in my programs because I am more comfortable when the initial sigma is 1.0 or 2.0, instead of smaller values.)
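In pycma terms, a minimal sketch of that idea might look like the following. This is purely illustrative: black_box stands in for the actual 7-parameter function, and the bounds are the log-scaled ones from the question. CMA-ES only ever sees points in [0, 1]^7, and the wrapper maps them back to the real scale.

    import numpy as np
    import cma

    # Log-scaled bounds taken from the question.
    log_parameter_bounds = np.array([
        [-15, -10], [-17, -10], [2, 3.5], [0, 5], [1.5, 3], [-12, -6], [-13, -7],
    ], dtype=float)
    lower = log_parameter_bounds[:, 0]
    upper = log_parameter_bounds[:, 1]

    def black_box(x):
        """Stand-in for the real 7-parameter objective (takes log-scaled values)."""
        return float(np.sum(x ** 2))

    def objective_in_unit_cube(z):
        """Objective as CMA-ES sees it: z lives in [0, 1]^7."""
        x = lower + np.asarray(z) * (upper - lower)  # map back to the log scale
        return black_box(x)

    # All coordinates now share the same range, so a single sigma0 around 0.3 is reasonable.
    es = cma.CMAEvolutionStrategy(7 * [0.5], 0.3, {'bounds': [0, 1]})
    es.optimize(objective_in_unit_cube)
    x_best = lower + np.asarray(es.result.xbest) * (upper - lower)  # back to log scale

A fixed linear map like this is usually all that is needed; something like StandardScaler, which estimates a mean and variance from data, is not required when the bounds are already known.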

One termination condition you should consider is a small minimum value for sigma, e.g. 10^-12. If sigma drops below that, it is an indication that you have found a local minimum and should restart at some other (probably random) location.

If sigma exceeds some large maximum value that you have chosen, that could be an indication that CMA-ES is producing a sequence of covariance matrices that are not converging, or are unstable in some sense. You should probably also restart in that case.
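In pycma terms, a rough sketch of such a sigma-based restart loop might look like this; the thresholds, the restart budget, and the dummy objective are illustrative assumptions, not pycma defaults.

    import numpy as np
    import cma

    def objective(z):
        """Stand-in for the scaled 7-parameter black-box function, z in [0, 1]^7."""
        return float(np.sum((np.asarray(z) - 0.5) ** 2))

    sigma_min, sigma_max = 1e-12, 1e3    # illustrative thresholds
    local_minima = []

    for restart in range(10):            # fixed restart budget
        x0 = np.random.uniform(0, 1, 7)  # restart from a random point in the box
        es = cma.CMAEvolutionStrategy(x0, 0.3, {'bounds': [0, 1], 'verbose': -9})
        while not es.stop():
            X = es.ask()
            es.tell(X, [objective(x) for x in X])
            if es.sigma < sigma_min or es.sigma > sigma_max:
                break                    # treat as converged/diverged and restart
        local_minima.append((es.result.xbest, es.result.fbest))

Note that pycma's own default stopping criteria (the 'tol*' options) will often trigger before sigma gets anywhere near 10^-12, which is why the loop also respects es.stop().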

(I don't actually use pycma, and I use a type of matrix adaptation that doesn't need the covariance matrix, so I apologize if my advice and terminology are incorrect.)

nikohansen commented 10 hours ago

re. 1: maybe have a look at these practical hints
re. 2: if you don't know, use the default and increase if there is time left (IPOP-CMA-ES, so to speak)
re. 3: not sure about the correctness of the statement, but this may be related: https://github.com/CMA-ES/pycma/issues/276
re. 4: maybe have a look again at the practical hints; Fig. 1 in this reference suggests that this can also very much depend on the function to be optimized
re. 5: what about the defaults? Otherwise, the 'tol*' options are the go-to options to consider, see e.g. here
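For what it's worth, one common way to do the parallel evaluation asked about in question 3 is the ask-and-tell interface combined with a process pool. This is only a sketch with a dummy objective and an arbitrarily chosen popsize:

    import multiprocessing as mp
    import numpy as np
    import cma

    def objective(z):
        """Stand-in for the scaled 7-parameter function; must be picklable (module-level)."""
        return float(np.sum((np.asarray(z) - 0.5) ** 2))

    if __name__ == '__main__':
        es = cma.CMAEvolutionStrategy(7 * [0.5], 0.3, {'bounds': [0, 1], 'popsize': 16})
        with mp.Pool(processes=8) as pool:
            while not es.stop():
                X = es.ask()                        # one population of candidate solutions
                es.tell(X, pool.map(objective, X))  # evaluate the candidates in parallel
        print(es.result.xbest, es.result.fbest)

And here is a sketch of the IPOP-style restarts hinted at under re. 2 and re. 5, using cma.fmin's restarts and incpopsize arguments together with a couple of 'tol*' options; the values here are illustrative, not recommendations:

    import numpy as np
    import cma

    def objective(z):
        """Stand-in for the scaled 7-parameter black-box function."""
        return float(np.sum((np.asarray(z) - 0.5) ** 2))

    res = cma.fmin(
        objective,
        7 * [0.5],    # x0 in the scaled [0, 1] box
        0.3,          # sigma0, roughly a quarter of the box width
        options={
            'bounds': [0, 1],
            'tolfun': 1e-9,        # stop when f-value changes become this small
            'tolstagnation': 100,  # iterations without improvement before stopping
        },
        restarts=8,     # up to 8 restarts ...
        incpopsize=2,   # ... each with a doubled population size (IPOP)
    )
    print(res[0], res[1])  # best solution found and its function value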