Open nietootein opened 5 years ago
The script run_multiple_configurations.py is available for grid search or random search, but it doesn't have any optimization features.
Thanks, Ari. An idea would be to have a new script to handle the optimization procedure, having grid/random search as an option, but also including the training of a bunch of pre-defined configurations, in a sense including the functionality of run_multiple_configurations.py in it. How does that sound?
Yes, that makes sense. To clarify, run_multiple_configurations.py can handle both grid/random search and training pre-defined configurations. However, it has no automatic optimization capabilities: to optimize over a set of runs, you have to manually inspect the output metrics from the previous runs and iteratively re-tune the search parameters by hand.
Hi @nietootein @aribrill, I came across this project in your GSoC page and would like to work on it. Can you guys please guide me on where to start?
Hello, I am interested in taking up this project for GSOC-2019 and I have already started working on it.
Hello, I am a first year student in Columbia Engineering and would love to work on this project through GSOC 2019 and beyond. I would love to get in touch with the mentors and discuss my ideas. I am currently doing research at the intersection of deep learning and physics and have taken some physics classes, so the applicability of this project really motivates me! I look forward to learning more about how I can get started on this and if you have a guide for writing the proposal!
Hello everyone, I'm a third year undergrad from India and I would like to contribute to this project and at the same time apply for GSoC 2020. It would be really helpful if any mentors out there could help me get started.
Hi @bolt25! We are only participating with the root project for GSoC 2020. In case you want to have a look at the outcome of this GSoC 2019 project, you can find it here. @nietootein, can we close this issue?
Model optimization needs to be implemented.
A skeleton for a possible iterative workflow pursuing model optimization would look like this:

1. Train the models for the current set of configurations.
2. Evaluate each run with the selected metric.
3. Write updated configuration files with new hyperparameters.
4. Repeat from step 1 until the metric stops improving or the run budget is exhausted.
Point 3 could be executed by a script capable of writing, for a given model, a new configuration file from scratch (with default hyperparameters), or updating a previously existing one, taking as input the hyperparameters to be updated.
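A minimal sketch of such a config-writing script is below. The file format (JSON here, for a self-contained example), the default hyperparameters, and the function name `write_config` are all illustrative assumptions, not the project's actual config handling:

```python
import json
from pathlib import Path

# Assumed defaults, purely for illustration; each model would define its own.
DEFAULT_HYPERPARAMETERS = {"learning_rate": 1e-3, "batch_size": 32, "dropout": 0.5}

def write_config(path, updates=None):
    """Write a new configuration file with default hyperparameters,
    or update a previously existing one with the given hyperparameters."""
    path = Path(path)
    if path.exists():
        config = json.loads(path.read_text())  # start from the existing file
    else:
        config = dict(DEFAULT_HYPERPARAMETERS)  # start from scratch
    config.update(updates or {})  # apply the hyperparameters to be updated
    path.write_text(json.dumps(config, indent=2))
    return config

# Usage: create a fresh config, then update a single hyperparameter in place.
cfg = write_config("run_001.json")
cfg = write_config("run_001.json", {"learning_rate": 5e-4})
```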
For point 2, the user should be able to select the metric to be optimized among a collection of implemented metrics (e.g. accuracy, auc, quality factor, etc.).
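Selecting the metric by name could be done with a small registry, sketched below. Only accuracy is implemented here; the other entries are placeholders for whatever definitions the project settles on:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions matching the true labels."""
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

# Registry mapping metric names to callables; the user picks one by name.
METRICS = {
    "accuracy": accuracy,
    # "auc": ...,            # e.g. sklearn.metrics.roc_auc_score
    # "quality_factor": ..., # project-specific definition
}

def evaluate(metric_name, y_true, y_pred):
    """Score a run's predictions with the user-selected metric."""
    return METRICS[metric_name](y_true, y_pred)

print(evaluate("accuracy", [1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```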
There are many strategies that could be implemented in point 3 for exploring the hyperparameter space, e.g. grid search, random search, or Bayesian optimization.
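One of the simplest such strategies, random search, could be sketched as follows. The `train_and_evaluate` function is a hypothetical stand-in: in practice it would write a config (point 3), launch a training run, and return the selected metric (point 2); here it returns a toy score so the loop is runnable:

```python
import random

def train_and_evaluate(hyperparameters):
    """Hypothetical stand-in for a full training run returning the metric.
    The toy score below peaks near learning_rate = 1e-3, purely for illustration."""
    lr = hyperparameters["learning_rate"]
    return 1.0 / (1.0 + abs(lr - 1e-3) * 1000)

def random_search(n_trials=20, seed=0):
    """Sample hyperparameters at random and keep the best-scoring set."""
    rng = random.Random(seed)
    best_score, best_params = float("-inf"), None
    for _ in range(n_trials):
        # Sample the learning rate log-uniformly between 1e-5 and 1e-1.
        params = {"learning_rate": 10 ** rng.uniform(-5, -1)}
        score = train_and_evaluate(params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

best_params, best_score = random_search()
```

Grid search would replace the sampling line with iteration over a fixed set of values; Bayesian optimization would choose the next sample based on all previous scores.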
Very likely, there are already software packages out there that provide most of the functionality we need (e.g. hyperopt or scikit-optimize), so it would be worth surveying them before implementing our own.