Open sgerke-ST opened 3 years ago
Update: it seems metrics are only reported at the end of training. Would a callback like this help?

```python
import hypertune
import tensorflow as tf


class ReportMetric(tf.keras.callbacks.Callback):
    """Report a metric to the hypertuning service after every epoch."""

    def __init__(self, monitoring=None, metric_tag=None):
        super().__init__()
        self.hypertune = hypertune.HyperTune()
        self.monitoring = monitoring
        self.metric_tag = metric_tag

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # Prefer the explicitly requested metric, else fall back to
        # validation loss, else plain training loss.
        name = self.monitoring or ("val_loss" if "val_loss" in logs else "loss")
        self.hypertune.report_hyperparameter_tuning_metric(
            hyperparameter_metric_tag=self.metric_tag or name,
            metric_value=logs[name],
            global_step=self.model.optimizer.iterations.numpy(),
        )
```
[...]

```python
history = model.fit(
    ...,
    callbacks=[..., ReportMetric(monitoring="val_accuracy", metric_tag="accuracy")],
)
```
Hello, I was trying to set up hyperparameter tuning with custom containers on Google Cloud ML, but the reported metric values do not show up in the training dashboard column where they are supposed to appear. Is anything special needed (e.g. environment variables) to make this work with Google Cloud ML?
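For debugging this kind of problem, it can help to inspect the environment inside the container. The sketch below is a hedged diagnostic, not an official API: the variable names (`CLOUD_ML_TRIAL_ID`, `CLOUD_ML_JOB_ID`, `CLOUD_ML_HP_METRIC_FILE`) and the default metric-file path are assumptions based on how the `cloudml-hypertune` package has worked in the past, so verify them against the version you have installed.

```python
import json
import os


def dump_hypertune_env(metric_file="/tmp/hypertune/output.metrics"):
    """Print the environment pieces hypertune is believed to rely on.

    The variable names and the default metric-file path below are
    assumptions -- check them against your cloudml-hypertune version.
    """
    for var in ("CLOUD_ML_TRIAL_ID", "CLOUD_ML_JOB_ID", "CLOUD_ML_HP_METRIC_FILE"):
        print(f"{var}={os.environ.get(var, '<unset>')}")
    # If the metric file exists, show the most recently reported metric.
    if os.path.exists(metric_file):
        with open(metric_file) as f:
            lines = f.read().splitlines()
        if lines:
            print("last metric:", json.loads(lines[-1]))
    else:
        print(f"{metric_file} not found (no metrics reported yet?)")


dump_hypertune_env()
```

If the trial-related variables are all `<unset>` when running locally but set inside the Cloud ML job, the metrics file is the next place to look.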