Got an AttributeError: 'LlamaConfig' object has no attribute 'rope_theta' error when running the Llama-2-7b-chat-hf example from the README. It is caused by commit 497dba023e0bb7741b38da674c1ca66ca3bd4b42; reverting to the earlier commit a5bc8b26b0e9d5e8aaedd31853f0527c4b47bd22 fixes the problem. Is this because the attribute rope_theta only exists for CodeLlama but not for Llama 2?
  File "/opt/conda/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2675, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/home/PainlessInferenceAcceleration/pia/lookahead/models/llama/modeling_llama.py", line 661, in __init__
    self.model = LlamaModel(config)
  File "/home/PainlessInferenceAcceleration/pia/lookahead/models/llama/modeling_llama.py", line 484, in __init__
    self.layers = nn.ModuleList([LlamaDecoderLayer(config) for _ in range(config.num_hidden_layers)])
  File "/home/PainlessInferenceAcceleration/pia/lookahead/models/llama/modeling_llama.py", line 484, in <listcomp>
    self.layers = nn.ModuleList([LlamaDecoderLayer(config) for _ in range(config.num_hidden_layers)])
  File "/home/PainlessInferenceAcceleration/pia/lookahead/models/llama/modeling_llama.py", line 293, in __init__
    self.self_attn = LlamaAttention(config=config)
  File "/home/PainlessInferenceAcceleration/pia/lookahead/models/llama/modeling_llama.py", line 184, in __init__
    self.rope_theta = config.rope_theta
  File "/opt/conda/lib/python3.10/site-packages/transformers/configuration_utils.py", line 261, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'LlamaConfig' object has no attribute 'rope_theta'
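As a possible workaround (a minimal sketch, not the project's actual fix): since rope_theta was introduced with newer configs and is absent from older LlamaConfig objects serialized for plain Llama 2 checkpoints, the attribute read at modeling_llama.py line 184 could tolerate its absence with a getattr fallback. The class names below are stand-ins mirroring the traceback, and the default of 10000.0 is an assumption based on the standard RoPE base used by Llama 2.

```python
# Sketch of a backward-compatible attribute read. LlamaConfigStub mimics
# an older config object that predates rope_theta; the patched attention
# __init__ falls back to the conventional RoPE base instead of raising.

class LlamaConfigStub:
    """Stand-in for an older LlamaConfig without rope_theta (assumption)."""
    num_hidden_layers = 32


class LlamaAttentionPatched:
    def __init__(self, config):
        # getattr with a default tolerates configs that lack the attribute,
        # so older Llama 2 configs no longer trigger an AttributeError.
        self.rope_theta = getattr(config, "rope_theta", 10000.0)


attn = LlamaAttentionPatched(LlamaConfigStub())
print(attn.rope_theta)  # old config: falls back to the default base
```

Reverting as described also works; the getattr approach just lets one commit serve both CodeLlama-style configs (which carry rope_theta) and older Llama 2 ones.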