kssteven418 / SqueezeLLM-gradients

Apache License 2.0
12 stars 7 forks

num_linear_layers not found in Llama #7

Open Quang-elec44 opened 7 months ago

Quang-elec44 commented 7 months ago

Hi, I'm trying to quantize the Bloom model and cannot find an equivalent of the `num_linear_layers` attribute. In run.py you say it is Llama-specific, but even when I used a Llama model ("NousResearch/Llama-2-7b-chat-hf"), I couldn't find `num_linear_layers` either. Please check again!
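As a possible workaround while waiting for clarification (this is my assumption about what the attribute is meant to represent, not the authors' method), one can derive the count by enumerating the `nn.Linear` modules of the loaded model directly, since that attribute does not exist on either the Bloom or Llama configs:

```python
import torch.nn as nn

def count_linear_layers(model: nn.Module) -> int:
    """Count the nn.Linear modules in a model.

    Assumption: this stands in for the missing `num_linear_layers`,
    i.e. the number of linear projections a quantizer would visit.
    """
    return sum(1 for m in model.modules() if isinstance(m, nn.Linear))

# Tiny stand-in block with hypothetical shapes (a real Llama/Bloom
# model loaded via transformers would be passed in the same way):
block = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 8))
print(count_linear_layers(block))  # → 2
```

This avoids hard-coding a per-architecture constant, though whether it matches what run.py actually expects would need confirmation from the maintainers.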