BorisTheBrave / nice-hooks

Convenience functions for working with pytorch hooks.
MIT License

Potential issue with registering forward_pre_hooks #3

Open afspies opened 1 year ago

afspies commented 1 year ago

https://github.com/BorisTheBrave/nice-hooks/blob/a834cd1cfbe7cabe285de87ddd9226d2b4a35a7c/nice_hooks/nice_hooks.py#L227C11-L227C11

I found that `p.slice` was a tuple for a simple module path, `model.transformer_encoder.layers.0.self_attn`, and changing this assert to `p[0] is None` fixed my issue. I'm not sure whether the path expansion is glitchy or I'm doing something naughty.

BorisTheBrave commented 1 year ago

Do you have a simple example that reproduces this?

afspies commented 1 year ago

Sure:

```python
import torch
import nice_hooks

model = torch.nn.TransformerEncoderLayer(512, 8, 2048, 0.1, batch_first=True)

inp = torch.rand(7, 2, 512)

def forward_pre_hook(module, inp, args):
    print("Forward pre hook")

nice_hooks.run(model, inp, forward_pre_hooks={'self_attn': forward_pre_hook})
```
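For comparison, the same hook can be registered with plain PyTorch and no `nice_hooks` involved. This is a minimal sketch, assuming the goal is simply to fire a forward pre-hook on the `self_attn` submodule; the `calls` list is an illustrative addition for checking that the hook ran.

```python
import torch

model = torch.nn.TransformerEncoderLayer(512, 8, 2048, 0.1, batch_first=True)
inp = torch.rand(7, 2, 512)

calls = []

def forward_pre_hook(module, args):
    # Standard torch pre-hook signature: (module, tuple of positional args).
    calls.append(type(module).__name__)

# TransformerEncoderLayer exposes its attention block as the `self_attn` attribute.
handle = model.self_attn.register_forward_pre_hook(forward_pre_hook)
model(inp)
handle.remove()

print(calls)
```

If the pure-PyTorch version fires as expected, the problem is likely confined to how `nice_hooks` parses the module path string rather than to the hook mechanism itself.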