Open · GeoffNN opened this issue 3 years ago
Hey @GeoffNN
The closest thing to a hook I can think of for optimizers is the closure that we pass to the optimizer.step() method. You could log the buffers as histograms in that step.
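The closure pattern above can be sketched as follows. This is a minimal illustration, not the real torch.optim or wandb API: `ToyOptimizer` is a hypothetical stand-in that only mimics the per-parameter `state` layout (keys like `exp_avg` and `step`), and logging to a plain list stands in for a `wandb.log` call.

```python
class ToyOptimizer:
    """Minimal stand-in mimicking torch.optim.Optimizer's per-parameter state."""

    def __init__(self, params):
        self.params = list(params)
        # One state dict per parameter, as in torch.optim (keys like 'exp_avg', 'step').
        self.state = [{"exp_avg": 0.0, "step": 0} for _ in self.params]

    def step(self, closure=None):
        # The closure runs first, matching torch.optim.Optimizer.step(closure).
        loss = closure() if closure is not None else None
        for p, st in zip(self.params, self.state):
            st["step"] += 1
            st["exp_avg"] = 0.9 * st["exp_avg"] + 0.1 * p  # fake EMA buffer update
        return loss


opt = ToyOptimizer([1.0, 2.0])
logged = []


def closure():
    # Compute the loss as usual (a dummy quadratic here) ...
    loss = sum(p * p for p in opt.params)
    # ... then log whichever optimizer buffers you care about,
    # e.g. wandb.log({"param0/exp_avg": ...}) in a real training loop.
    for i, st in enumerate(opt.state):
        logged.append((i, dict(st)))
    return loss


loss = opt.step(closure)
```

Because the closure runs inside `step()`, this logs the buffer values as they stood before the update; with a real optimizer you would pass the same closure every iteration and gate the logging on a step counter.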
As for wandb.watch support for optimizers, I'm creating a ticket for it and will update this thread as soon as I have any information.
This issue is stale because it has been open 60 days with no activity.
I'm working on optimizers in PyTorch and would like to log some buffers in my optimizer. Having something like wandb.watch for optimizers would be a great way to understand what's going on during training. In a PyTorch Optimizer, each parameter p has an associated state dict; using keys to log values of this dict for each parameter would be awesome. Example use:
wandb.watch_opt(optimizer, keys=['exp_avg', 'step'], modes=['hist', 'value'], log_freq=1000)
I don't know that Optimizer objects have hooks like nn.Module objects do.
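Since Optimizer objects indeed expose no hook mechanism, the proposed API could only work by polling the optimizer's state from the training loop. A minimal sketch of that idea, assuming a `watch_opt` helper that does not exist in wandb: all names here are illustrative, `sink` stands in for `wandb.log`, and the fake optimizer only mirrors torch.optim's `{param: {key: value}}` state layout.

```python
from types import SimpleNamespace


def watch_opt(optimizer, keys, log_freq=1000, sink=print):
    """Hypothetical helper: returns a callable to invoke once per training
    step; every log_freq calls it reads the requested keys out of
    optimizer.state and hands them to sink (wandb.log in practice)."""
    counter = {"n": 0}

    def maybe_log():
        counter["n"] += 1
        if counter["n"] % log_freq:
            return
        record = {}
        # torch.optim keeps state as {param: {key: value}}; iterate the values.
        for i, st in enumerate(optimizer.state.values()):
            for k in keys:
                if k in st:
                    record[f"param{i}/{k}"] = st[k]
        sink(record)

    return maybe_log


# Fake optimizer whose .state mirrors torch.optim's layout.
fake = SimpleNamespace(state={
    "p0": {"exp_avg": 0.5, "step": 7},
    "p1": {"exp_avg": -0.1, "step": 7},
})
records = []
log = watch_opt(fake, keys=["exp_avg", "step"], log_freq=2, sink=records.append)
for _ in range(4):
    log()
```

A polling helper like this covers the `'value'` mode from the proposed API; a `'hist'` mode would additionally need to flatten each buffer tensor into a histogram before handing it to the sink.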