kshitij12345 opened 4 months ago
triage review:
This issue is still present in the torchbench HF models: hf_GPT2, hf_GPT2_large, hf_T5, hf_T5_base, hf_T5_large. In that case the error can be reproduced by setting a registered buffer to None:
```python
import torch

class TestModel(torch.nn.Module):
    def __init__(self, *args, **kwargs) -> None:
        super().__init__(*args, **kwargs)
        self.register_buffer("buffer", torch.ones((1, 2)))

    def forward(self, x):
        self.buffer = None
        return x
```
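For context, assigning None to a registered buffer is valid in eager PyTorch: `nn.Module.__setattr__` keeps the name in `_buffers` and stores None as its value, so the repro above is legal eager code that the thunder jit fails to process. A minimal eager-mode check (plain PyTorch, no thunder; the module name here is illustrative):

```python
import torch

class BufModule(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.register_buffer("buf", torch.ones((1, 2)))

    def forward(self, x):
        self.buf = None  # valid in eager mode: clears the buffer's value
        return x

m = BufModule()
out = m(torch.zeros(3))
print(m.buf)                # None
print("buf" in m._buffers)  # True: the name stays registered
```

The buffer entry survives with a None value rather than being deleted, which is exactly the module-state modification the jit's epilogue processing trips over.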
This fails with a stack trace similar to:
```
  File "/opt/pytorch/lightning-thunder/thunder/core/jit_ext.py", line 1698, in thunder_general_jit
    process_recorded_modifications(ctx, epilogue_trace)
  File "/opt/pytorch/lightning-thunder/thunder/core/jit_ext.py", line 1578, in process_recorded_modifications
    assert isinstance(value.value, Proxy)
AssertionError
```
To reproduce the original error from HF, check out the `hf-benchmarks` branch and run `pytest thunder/benchmarks/targets.py -k "torchbench and hf and -thunder-hf" -v --benchmark-disable`. Checking out the branch is unnecessary if #1238 has been merged by the time of reading.
Errors with: