cybertronai / autograd-hacks


How to calculate the gradient per sample in a loop? #13

Closed cijerezg closed 2 years ago

cijerezg commented 2 years ago

In a nutshell, my code looks like this:

autograd_hacks.add_hooks(model)
all_params = []
for i in range(20):
    epoch_params = []
    train_loss = training_function(model, data, lr)
    autograd_hacks.compute_grad1(model)
    for name, params in model.named_parameters():
        sample_grads = params.grad1.clone().cpu().detach().numpy()
        epoch_params.append(sample_grads)
    all_params.append(epoch_params)
    autograd_hacks.disable_hooks(model)

all_params should contain a different array for each epoch, since the gradients change as training progresses, but it always ends up holding the same array. I tried using remove_hooks and clear_backprops instead, but they either gave me errors or did nothing. The training function does the usual loss computation, backward pass, optimizer step, etc. I'd imagine the solution to this is easy; if it is not, I can write a minimal reproducible example.
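
For context, this is roughly the loop structure I'm aiming for. It is only a sketch of my current guess: I'm assuming training_function runs exactly one forward pass and one loss.backward() per epoch, that clear_backprops is the right call to reset the stored state between epochs, and that the hooks should stay enabled for the whole loop rather than being disabled inside it.

import autograd_hacks

autograd_hacks.add_hooks(model)          # register forward/backward hooks once, before training
all_params = []
for i in range(20):
    epoch_params = []
    # assumed: one forward pass and one loss.backward() per call
    train_loss = training_function(model, data, lr)
    autograd_hacks.compute_grad1(model)  # fills params.grad1 with per-sample gradients
    for name, params in model.named_parameters():
        epoch_params.append(params.grad1.detach().cpu().numpy().copy())
    all_params.append(epoch_params)
    # guess: clear the stored backprops so next epoch's compute_grad1 sees fresh data
    autograd_hacks.clear_backprops(model)
autograd_hacks.remove_hooks(model)       # detach hooks only after the loop finishes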