-
In the WebUI, under Trial details -> Hyper-parameter, one can choose to show the top xx% of trials.
By default this shows the trials with the highest scores.
Is there a way to change this to showing the…
-
Correct me if I'm wrong, but (and I see this is a recurring issue) since the tuner seems to tune hyperparameters regardless of whether they're active or not, wouldn't this have an effect on the way Ba…
-
Is there any possibility to skip to the next epoch in case a trial has stalled, or has started moving away from the goal? (defined under certain custom options)
Also, I think there should be a way of training the m…
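The stall/degradation check asked about above can be sketched in plain Python. The `StallDetector` class below is a hypothetical helper (not a keras-tuner API): it tracks the objective and signals a stop once it fails to improve for `patience` checks. In real Keras code, `tf.keras.callbacks.EarlyStopping` implements the same idea and can be passed via `tuner.search(..., callbacks=[...])`.

```python
class StallDetector:
    """Hypothetical helper: flags a trial as stalled once its objective
    stops improving (higher-is-better) for `patience` consecutive checks."""

    def __init__(self, patience=3, min_delta=1e-4):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("-inf")
        self.wait = 0

    def update(self, score):
        """Feed the latest objective value; returns True when the trial
        should be cut short."""
        if score > self.best + self.min_delta:
            self.best = score
            self.wait = 0
        else:
            self.wait += 1
        return self.wait >= self.patience
```

A custom Keras callback could call `update()` from `on_epoch_end` and set `self.model.stop_training = True` when it returns True.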
-
Whilst training a model using a BayesianOptimization tuner on an Ubuntu 22.04 system, the program continuously increases its RAM usage without releasing it in between trials, un…
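A common mitigation for memory growth across trials is to drop all references to the previous model and force garbage collection before building the next one; with real Keras, one would additionally call `tf.keras.backend.clear_session()` at the top of `build_model` to release graph state Python's collector cannot see. The sketch below uses a stand-in `DummyModel` so it runs without TensorFlow:

```python
import gc

class DummyModel:
    """Stand-in for a Keras model so this sketch runs without TensorFlow."""
    def fit(self):
        return 0.0  # hypothetical final loss

def run_trials(build_fn, n_trials):
    """Run trials, explicitly releasing each model between runs."""
    results = []
    for _ in range(n_trials):
        model = build_fn()
        results.append(model.fit())
        del model       # drop the last Python reference to the model
        gc.collect()    # reclaim objects held alive by reference cycles
    return results
```

This does not fix leaks inside native TensorFlow allocations, but it rules out the common case of Python-side objects accumulating across trials.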
-
Hi,
I was wondering if I could include the ray tune (hyper-parameter search) library as either a callback or in the base trainer class to look for the right hyper-parameters for a model and even st…
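One way to bridge a Keras-style training loop to an external tuner is a callback that forwards per-epoch metrics to the tuner's reporting call. In the sketch below, `report_fn` is a stand-in for Ray Tune's reporting API (e.g. `tune.report` in older Ray releases; check your Ray version, since the API has moved over time). The class only mimics the `keras.callbacks.Callback` interface so it runs standalone:

```python
class TuneReportCallback:
    """Mimics the keras.callbacks.Callback interface: forwards a chosen
    per-epoch metric to an external reporter (hypothetical sketch)."""

    def __init__(self, report_fn, metric="val_loss"):
        self.report_fn = report_fn  # stand-in for e.g. ray.tune's report call
        self.metric = metric

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        if self.metric in logs:
            self.report_fn({self.metric: logs[self.metric], "epoch": epoch})
```

Note that Ray has shipped a ready-made Keras integration callback in some versions, which may be preferable to rolling your own.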
-
Is the `epochs` argument in the `search()` method redundant for `Hyperband`?
From what I understood, the algorithm should "automatically" allocate the number of epochs during the tuning process …
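That understanding matches the Hyperband algorithm itself: the epoch budget per trial is derived entirely from the tuner's `max_epochs` and `factor`, so as far as I can tell an `epochs` argument passed to `search()` is overridden by the tuner's own schedule (verify against your keras-tuner version). The schedule can be computed with nothing but arithmetic; this is a sketch of the algorithm from the Hyperband paper, not keras-tuner's exact code:

```python
import math

def hyperband_brackets(max_epochs, factor=3):
    """Return Hyperband's brackets as lists of (n_trials, epochs) rounds,
    derived from max_epochs and factor alone (sketch of the algorithm,
    not keras-tuner's implementation)."""
    # Largest s with factor**s <= max_epochs, using exact integer arithmetic.
    s_max = 0
    while factor ** (s_max + 1) <= max_epochs:
        s_max += 1
    schedule = []
    for s in range(s_max, -1, -1):
        n = math.ceil((s_max + 1) / (s + 1) * factor ** s)  # initial trials
        r = max_epochs / factor ** s                        # initial epochs
        rounds = []
        for i in range(s + 1):
            rounds.append((math.floor(n / factor ** i), int(r * factor ** i)))
        schedule.append(rounds)
    return schedule
```

For `max_epochs=27, factor=3`, the most aggressive bracket runs 27 trials for 1 epoch, promotes 9 to 3 epochs, 3 to 9 epochs, and 1 to the full 27, which is why a fixed `epochs` value would conflict with the algorithm.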
-
Using the most recent version of keras-tuner.
Here is what my code looks like.
```
def build_model(hp):
    model = keras.Sequential()
    for i in range(hp.Int('num_layers', min_lay…
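The snippet above is cut off, but the usual keras-tuner `build_model` pattern it follows can be sketched with a stub in place of `keras_tuner.HyperParameters`, so it runs without TensorFlow (in real code, `hp` is supplied by the tuner and `hp.Int` takes `name, min_value, max_value` per the keras-tuner docs):

```python
class FixedHP:
    """Stub standing in for keras_tuner.HyperParameters: returns a preset
    value for each named hyperparameter, or min_value if unset."""

    def __init__(self, values):
        self.values = values

    def Int(self, name, min_value, max_value, step=1):
        return self.values.get(name, min_value)

def build_layer_sizes(hp):
    """Choose a depth and per-layer width, as build_model would before
    adding a Dense layer of that width for each entry."""
    sizes = []
    for i in range(hp.Int('num_layers', min_value=2, max_value=8)):
        sizes.append(hp.Int(f'units_{i}', min_value=32, max_value=512, step=32))
    return sizes
```

The key point of the pattern is that `num_layers` is itself a hyperparameter, and the per-layer `units_{i}` parameters are registered lazily inside the loop.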
-
I am currently trying to set up hyperparameter optimization using multiple GPUs on a single host.
I followed and implemented this tutorial:
https://keras-team.github.io/keras-tuner/tutorials/distr…
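For reference, the setup that tutorial describes is driven by environment variables: one chief process runs the oracle and each worker process claims trials, and on a single host each worker can be pinned to its own GPU with `CUDA_VISIBLE_DEVICES`. This is a config sketch; the variable names come from the keras-tuner distributed-tuning docs, and `tune.py` is a hypothetical search script — verify both against your keras-tuner version:

```shell
# Chief process: runs the oracle and assigns trials to workers.
export KERASTUNER_TUNER_ID="chief"
export KERASTUNER_ORACLE_IP="127.0.0.1"
export KERASTUNER_ORACLE_PORT="8000"
# python tune.py &   # hypothetical script that calls tuner.search(...)

# Each worker reuses the oracle address but gets its own ID and GPU, e.g.:
# KERASTUNER_TUNER_ID="tuner0" CUDA_VISIBLE_DEVICES="0" python tune.py &
# KERASTUNER_TUNER_ID="tuner1" CUDA_VISIBLE_DEVICES="1" python tune.py &
```

All processes run the same script; keras-tuner decides chief vs. worker behavior from `KERASTUNER_TUNER_ID`.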
-
The problem:
It is hard to know an adequate total_time_limit for a specific training scenario in AutoML. The time limit depends on the data size, the training machine's capability, and the algorithms chosen. Selec…
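One pragmatic workaround is to time a few short probe fits on the actual data and machine, then extrapolate a budget from them. The helper below is a heuristic sketch, not an API of any AutoML framework; the `headroom` factor and the probe strategy are assumptions:

```python
def estimate_time_limit(probe_seconds, n_algorithms, trials_per_algorithm,
                        headroom=1.5):
    """Rough total_time_limit (seconds) from timed probe runs: average the
    probe costs, scale by the planned number of fits, and multiply by a
    headroom factor to absorb slower algorithms (heuristic sketch)."""
    mean_trial = sum(probe_seconds) / len(probe_seconds)
    return mean_trial * n_algorithms * trials_per_algorithm * headroom
```

For example, if two probe fits took 10 s and 20 s and the plan is 4 algorithms at 5 trials each, the estimate is 15 * 20 * 1.5 = 450 seconds, which at least ties the limit to the data and hardware at hand.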
-
Hi there,
I have been having an issue with searches easily growing to a few tens of GB or more, which on my current setup becomes prohibitive when I have multiple projects, etc.
I was wondering if …
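Most of that disk usage typically comes from per-trial checkpoints kept under the project directory. A blunt but effective mitigation is to prune all but the best few trial directories after (or periodically during) a search. The helper below is a generic sketch using only the standard library, not a keras-tuner API; the trial-directory naming and the score mapping are assumptions you would adapt to your layout:

```python
import os
import shutil

def prune_trials(project_dir, scores, keep=3):
    """Delete all but the `keep` best trial subdirectories. `scores` maps
    a trial subdirectory name to its objective value (higher is better).
    Returns the names that were kept, best first."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    for name in ranked[keep:]:
        shutil.rmtree(os.path.join(project_dir, name), ignore_errors=True)
    return ranked[:keep]
```

Also worth noting: passing `overwrite=True` when constructing a keras-tuner tuner discards the previous run's artifacts instead of accumulating them across restarts.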