@SimonBlanke let me start with this.
@djokester I would like to give you some guidance for this task:
You can start by creating the class "GridSearch" and inheriting from both BaseOptimizer and Search. You can use the random search optimizer as a template for this.
You should also keep in mind that each optimizer implements a common set of methods.
For GridSearch, init_pos, finish_initialization and evaluate can be inherited from BaseOptimizer. "evaluate" does not need a GridSearch-specific implementation, because grid search does not decide what to do next based on the new score.
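To illustrate the idea, here is a rough standalone sketch of what the iteration logic boils down to (this is not the actual class, and the internal BaseOptimizer/Search interfaces may look different; the names below are just placeholders):

```python
import numpy as np

# Standalone sketch only -- "search_space" and "grid_position" are
# placeholder names, not the real internals of this package.
search_space = {
    "x": np.arange(-2, 2, 1.0),
    "y": np.arange(-2, 2, 1.0),
}

# The grid is the cartesian product of all dimension values.
dim_sizes = [len(values) for values in search_space.values()]
n_positions = int(np.prod(dim_sizes))

def grid_position(nth_iter):
    # Map the iteration counter to the next point on the grid.
    # The score of previous evaluations is never used here, which is
    # why GridSearch does not need its own "evaluate".
    index = np.unravel_index(nth_iter % n_positions, dim_sizes)
    return np.array(index)

for nth_iter in range(5):
    print(grid_position(nth_iter))  # [0 0], [0 1], [0 2], [0 3], [1 0]
```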
This is all I figured out about the implementation. I hope this helps!
Yeah @SimonBlanke. Thanks, it helps a lot.
Hello @djokester,
have you made some progress on the grid search algorithm? If you need any additional assistance, I would be happy to help.
@SimonBlanke I have gone through the code base. I will send a PR by this weekend.
@djokester a PR would be nice! You can ignore the extended tests (performance, parameters, ...); I will take care of those. If you have a basic concept, we can continue from there together.
@djokester Any news? Thank you for your time!
@cryptocoinserver we have an open PR #18, let us know what you think!
@tgdn and his team @EdouardVilain-Git, @tabeare, @annabieber, @MachineLearner75 provided an implementation of grid-search, which can be used in Gradient-Free-Optimizers >= v0.5.0.
You can read more about it in the optimization-tutorial.
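For reference, basic usage of the new optimizer should look something like this (the objective function and search space are just a toy example):

```python
import numpy as np
from gradient_free_optimizers import GridSearchOptimizer

# Toy objective (maximized by default): optimum at x = y = 0.
def parabola(para):
    return -(para["x"] ** 2 + para["y"] ** 2)

search_space = {
    "x": np.arange(-5, 5, 0.1),
    "y": np.arange(-5, 5, 0.1),
}

opt = GridSearchOptimizer(search_space)
opt.search(parabola, n_iter=3000)
```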
I think it would be useful to have a grid search optimizer in this package, but its implementation would probably be quite different from existing ones (sklearn, ...).
The requirements are: