puja93 opened 2 months ago
Hi, after quantizing Llama 3, the layer names in the checkpoint seem to have expanded.
I can load the original Llama 3 just fine using the model definition here https://github.com/meta-llama/llama3/blob/main/llama/model.py, because that checkpoint has the same corresponding layers.
Have you written a model.py for the quantized Llama model?
Thanks
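In case it helps to pin down exactly what changed, here is a minimal sketch for diffing the key sets of the original and quantized checkpoints. The key names in the example are made up for illustration (quantization schemes often add tensors such as scales alongside the weights); in practice you would get the real names from `torch.load(path, map_location="cpu").keys()` for each checkpoint.

```python
def diff_state_dict_keys(original_keys, quantized_keys):
    """Return (keys only in the quantized checkpoint,
    keys only in the original checkpoint)."""
    original = set(original_keys)
    quantized = set(quantized_keys)
    return sorted(quantized - original), sorted(original - quantized)

# Hypothetical key names, for illustration only.
original_keys = ["layers.0.attention.wq.weight"]
quantized_keys = [
    "layers.0.attention.wq.weight",  # weight tensor (now quantized)
    "layers.0.attention.wq.scales",  # e.g. per-channel scales added by quantization
]

added, removed = diff_state_dict_keys(original_keys, quantized_keys)
print(added)    # keys introduced by quantization
print(removed)  # keys no longer present
```

Once you know which tensors were added or renamed, you can see what extra modules a quantized model.py would need to declare so that `load_state_dict` matches the new checkpoint layout.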