Closed by tpoisonooo 7 months ago
Fixed after debugging the transformers source code for a whole day.
Hi @tpoisonooo, I'm facing the same issue. How did you solve it? Is anything specific to the transformers version, or do I need to convert the safetensors? Thanks.
It looks like the model-saving code that uses accelerate writes a consolidated model.safetensors file alongside the true sharded safetensors files, and that stray file is what causes the issue. Removing it seems to have solved the problem. But I'm not sure why Hugging Face loads that file at all when a weight map is specified in model.safetensors.index.json.
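A minimal sketch of the workaround described above. It simulates the checkpoint layout in a temporary directory (the file names are the ones transformers itself uses; the tensor names are placeholders), then deletes the stray consolidated file. This rests on the assumption that `from_pretrained` checks for a single-file `model.safetensors` before falling back to the shards listed in the index, so the stray file shadows the sharded checkpoint:

```python
import json
import os
import tempfile

# Stand-in for a real checkpoint directory written by accelerate.
ckpt_dir = tempfile.mkdtemp()

# A sharded safetensors checkpoint: shard files plus an index mapping
# each weight to its shard (tensor names here are placeholders).
shards = ["model-00001-of-00002.safetensors", "model-00002-of-00002.safetensors"]
for name in shards:
    open(os.path.join(ckpt_dir, name), "wb").close()
with open(os.path.join(ckpt_dir, "model.safetensors.index.json"), "w") as f:
    json.dump({"weight_map": {"w1": shards[0], "w2": shards[1]}}, f)

# The stray consolidated file that the save step also wrote out.
open(os.path.join(ckpt_dir, "model.safetensors"), "wb").close()

# Assumed loading behavior: a single-file model.safetensors is picked up
# before the index, so remove it whenever both it and the index exist.
stray = os.path.join(ckpt_dir, "model.safetensors")
index = os.path.join(ckpt_dir, "model.safetensors.index.json")
if os.path.exists(stray) and os.path.exists(index):
    os.remove(stray)

print(sorted(os.listdir(ckpt_dir)))
# -> ['model-00001-of-00002.safetensors',
#     'model-00002-of-00002.safetensors',
#     'model.safetensors.index.json']
```

After this, only the shards and the index remain, so the loader has to follow the weight map.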
@praveenkanithi @tpoisonooo May I ask how you solved this problem? I only have the safetensors model, but I get the titled error when loading it.
After checking #45 and #40 and making some hard-coded modifications, these commands passed, and I got these files:
To load it with passkey.py, I merged these safetensors into the original NousResearch/Llama-2-7b-hf and got this error. I noticed that your official https://huggingface.co/NousResearch/Yarn-Llama-2-7b-64k does not need any safetensors and can be tested successfully. Did I miss a model conversion script?
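One quick sanity check worth running after merging shards into a base model directory: every shard referenced by the weight map in model.safetensors.index.json must actually exist next to the index, or loading fails. This is only a sketch; the directory is a temporary stand-in and the tensor names are placeholders, with the second shard deliberately left missing to show the check firing:

```python
import json
import os
import tempfile

# Stand-in for a merged checkpoint directory (placeholder contents).
ckpt_dir = tempfile.mkdtemp()
with open(os.path.join(ckpt_dir, "model.safetensors.index.json"), "w") as f:
    json.dump({"weight_map": {
        "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
        "lm_head.weight": "model-00002-of-00002.safetensors",
    }}, f)
open(os.path.join(ckpt_dir, "model-00001-of-00002.safetensors"), "wb").close()
# (second shard deliberately left missing)

# Cross-check the index against the files on disk.
with open(os.path.join(ckpt_dir, "model.safetensors.index.json")) as f:
    index = json.load(f)
referenced = sorted(set(index["weight_map"].values()))
missing = [s for s in referenced
           if not os.path.exists(os.path.join(ckpt_dir, s))]
print("missing shards:", missing)
# -> missing shards: ['model-00002-of-00002.safetensors']
```

If the list is empty but loading still fails, the problem is more likely a stray consolidated model.safetensors file or a version mismatch than the merge itself.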