hao-ai-lab / LookaheadDecoding

[ICML 2024] Break the Sequential Dependency of LLM Inference Using Lookahead Decoding
https://arxiv.org/abs/2402.02057
Apache License 2.0

NameError: name 'F' is not defined #17

Closed CSWellesSun closed 11 months ago

CSWellesSun commented 1 year ago

```
Traceback (most recent call last):
  File "/home/xxx/project/lade/minimal.py", line 30, in <module>
    greedy_output = model.generate(model_inputs, max_new_tokens=1)
  File "/home/xxx/miniconda3/envs/xxx/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/xxx/miniconda3/envs/xxx/lib/python3.11/site-packages/transformers/generation/utils.py", line 1606, in generate
    return self.greedy_search(
  File "/home/xxx/project/lade/lade/decoding.py", line 23, in greedy_search_proxy
    return jacobi_greedy_search_multilevel(self, chat=False, *args, **kwargs)
  File "/home/xxx/project/lade/lade/decoding.py", line 278, in jacobi_greedy_search_multilevel
    outputs = self.jforward_multilevel(
  File "/home/xxx/project/lade/lade/models/llama.py", line 405, in jforward_multilevel
    logits = [F.linear(hidden_states, lm_head_slices[i]) for i in range(self.config.pretraining_tp)]
  File "/home/xxx/project/lade/lade/models/llama.py", line 405, in <listcomp>
    logits = [F.linear(hidden_states, lm_head_slices[i]) for i in range(self.config.pretraining_tp)]
NameError: name 'F' is not defined
```

Viol2000 commented 1 year ago

I did not consider compatibility with `pretraining_tp > 1`. As a quick fix, add one line: `import torch.nn.functional as F`.
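The failure mode can be reproduced without torch installed: the list comprehension uses `F` as the conventional alias for `torch.nn.functional`, but the module never imports it, so the name lookup fails before `linear` is ever called. A minimal sketch (the function and arguments below are stand-ins for illustration, not the repo's actual code):

```python
def project_logits(hidden_states, lm_head_slices, pretraining_tp):
    # Mirrors the failing line in lade/models/llama.py. The module is
    # missing the one-line fix at the top:
    #     import torch.nn.functional as F
    return [F.linear(hidden_states, lm_head_slices[i])
            for i in range(pretraining_tp)]

try:
    project_logits(None, [None, None], 2)
except NameError as err:
    print(err)  # name 'F' is not defined
```

With the import added at the top of `lade/models/llama.py`, `F.linear` resolves normally and the `pretraining_tp > 1` branch runs.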