SimonBlanke / Hyperactive

An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
https://simonblanke.github.io/hyperactive-documentation
MIT License
512 stars 42 forks

Optimization in serial? #51

Closed esmuigors closed 2 years ago

esmuigors commented 2 years ago

Dear SimonBlanke,

First of all, thank You for this wonderful project!

My question is: can I somehow specify that the optimizers should NOT run in parallel? My objective function is actually a call to an external (parallelized) program, which furthermore cannot be run in multiple instances on the same machine (it is a commercial program). So I would like to prevent the optimizer from starting the next calculation before the previous one has finished.

Best regards, Igors

SimonBlanke commented 2 years ago

Hello @esmuigors,

thank you for your kind words! :-)

I suspect this issue does not belong in this repository, because it is a question about the package itself and not about the tutorial (please correct me if I am wrong about this). Gradient-Free-Optimizers does not have any parallel processing capabilities. Hyperactive can do parallel processing via multiprocessing, joblib and pathos.

So I will transfer this issue to Hyperactive.

SimonBlanke commented 2 years ago

About your question:

Hyperactive will automatically use parallel processing if:

- n_jobs is set to a value greater than 1 in .add_search(), or
- .add_search() is called multiple times.

In the first case Hyperactive adds the same search n_jobs times. In the second case Hyperactive adds different searches.

n_jobs is 1 by default, so you do not need to change it. And you should call .add_search() just once before calling .run(). To avoid parallel processing you would write something like this:

# In this case Hyperactive will not use any parallel-processing package to run the optimization,
# even if you use population-based optimization algorithms.

...

hyper = Hyperactive()
hyper.add_search(model, search_space)
hyper.run()

...
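As a belt-and-braces measure, independent of Hyperactive's scheduling, the objective function itself can serialize calls to the external program. This is a minimal sketch using only the standard library; the function names (run_external_program, model) and the simulated external call are assumptions for illustration, not part of Hyperactive's API:

```python
import threading
import time

# Process-wide lock guarding the external program: even if the optimizer
# were ever run with thread-based parallel workers, at most one
# "external" call can be in flight at a time.
_external_lock = threading.Lock()

def run_external_program(params):
    # Placeholder for the real call, e.g. subprocess.run([...]) invoking
    # the commercial program. Here we just simulate some work.
    time.sleep(0.01)
    return -sum(v ** 2 for v in params.values())

def model(opt):
    # In Hyperactive, the objective receives an object that exposes the
    # current parameter values by key; a plain dict mimics that here.
    params = {"x": opt["x"]}
    with _external_lock:  # serialize access to the external program
        return run_external_program(params)
```

Note that threading.Lock only serializes thread-based workers within one process; if searches run in separate processes (Hyperactive's multiprocessing backends), a cross-process mechanism such as a multiprocessing.Lock or a file lock would be needed instead. With a single .add_search() call and the default n_jobs=1, the lock is simply never contended.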

Does this answer your question? Let me know if this works for your use case.

esmuigors commented 2 years ago

Dear @SimonBlanke

I am very sorry for hastily opening this issue (though it was the opportunity to say those kind words :) ). It turned out to be a problem completely unrelated to either gradient_free_optimizers or Hyperactive... I hadn't looked at the modification times of the files; once I did, it became obvious :'-) What a shame on me...

I hope that Your answer will be useful for someone else at least :)

Best wishes, Igors

SimonBlanke commented 2 years ago

Hello @esmuigors,

no problem! I think the question and answer are still useful for others. If you have more questions, don't hesitate to open another issue.