wandb / wandb

🔥 A tool for visualizing and tracking your machine learning experiments. This repo contains the CLI and Python API.
https://wandb.ai

[Feature] Logging something from the optimizer state dict #2171

Open GeoffNN opened 3 years ago

GeoffNN commented 3 years ago

I'm working on optimizers in PyTorch and would like to log some of the buffers in my optimizer. Having something like wandb.watch for optimizers would be a great way to understand what's going on during training. In a PyTorch Optimizer, each parameter p has an associated state dict; being able to pick keys from this dict and log their values for each parameter would be awesome.

Example use: wandb.watch_opt(optimizer, keys=['exp_avg', 'step'], modes=['hist', 'value'], log_freq=1000)

I'm not sure whether Optimizer objects have hooks the way nn.Module objects do.
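
For now, the closest I can get is pulling values out of optimizer.state by hand after each step. A rough sketch of what I mean (log_optimizer_state is a hypothetical helper, not an existing wandb API, and the 'exp_avg'/'step' keys assume an Adam-style optimizer):

```python
import torch
import wandb

def log_optimizer_state(optimizer, keys=("exp_avg", "step"), step=None):
    """Log selected per-parameter optimizer state buffers to wandb.

    `keys` must exist in the optimizer's per-parameter state
    (e.g. Adam keeps 'exp_avg', 'exp_avg_sq', and 'step').
    """
    logs = {}
    for group_idx, group in enumerate(optimizer.param_groups):
        for param_idx, p in enumerate(group["params"]):
            state = optimizer.state.get(p, {})
            for key in keys:
                if key not in state:
                    continue
                value = state[key]
                name = f"opt/group{group_idx}/param{param_idx}/{key}"
                if torch.is_tensor(value) and value.numel() > 1:
                    # Multi-element buffers (e.g. exp_avg) go in as histograms.
                    logs[name] = wandb.Histogram(value.detach().cpu().numpy())
                else:
                    # Scalars (e.g. 'step') are logged as plain values.
                    logs[name] = float(value)
    wandb.log(logs, step=step)
```

Calling this from the training loop every log_freq steps roughly approximates the proposed wandb.watch_opt, but a built-in version would be much nicer.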

ariG23498 commented 3 years ago

Hey @GeoffNN, the closest thing to a hook I can think of for optimizers is the closure that we can pass to optimizer.step(). You could log the buffers as histograms at that point.
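
A minimal sketch of that idea (assuming an Adam-style optimizer; model, loss_fn, inputs, targets, and optimizer stand in for your own training-loop objects):

```python
import torch
import wandb

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # The buffers here still reflect the *previous* update, since the
    # optimizer calls the closure before applying the new step.
    logs = {}
    for i, p in enumerate(optimizer.param_groups[0]["params"]):
        state = optimizer.state.get(p, {})
        if "exp_avg" in state:  # populated after the first Adam step
            logs[f"opt/param{i}/exp_avg"] = wandb.Histogram(
                state["exp_avg"].detach().cpu().numpy()
            )
    if logs:
        wandb.log(logs)
    return loss

loss = optimizer.step(closure)
```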

As for a watch equivalent for optimizers, I am creating a ticket for it and will update this thread as soon as I have any info.

github-actions[bot] commented 3 years ago

This issue is stale because it has been open for 60 days with no activity.