Stability-AI / StableLM

StableLM: Stability AI Language Models

OSError: stabilityai/stablelm-base-alpha-3b-v2 does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack. #99

Closed: RylanSchaeffer closed this issue 1 year ago

RylanSchaeffer commented 1 year ago

When I try to load a StableLM model, I get an OSError. The code below is taken from HuggingFace:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model as shown on the HuggingFace model card
tokenizer = AutoTokenizer.from_pretrained("stabilityai/stablelm-base-alpha-3b-v2")
model = AutoModelForCausalLM.from_pretrained(
    "stabilityai/stablelm-base-alpha-3b-v2",
    trust_remote_code=True,  # the repo ships custom modeling code
    torch_dtype="auto",      # use the dtype stored in the checkpoint
)

Immediate error:

OSError: stabilityai/stablelm-base-alpha-3b-v2 does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

Can someone please help me solve this?

RylanSchaeffer commented 1 year ago

I thought my huggingface-hub might be out of date, so I upgraded from 0.10.X to 0.16.4, but the error persists.
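
For completeness, the installed version can be checked directly in the same environment the script runs in (a two-line sketch, nothing StableLM-specific):

import huggingface_hub
print(huggingface_hub.__version__)  # 0.16.4 after the upgrade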

jon-tow commented 1 year ago

Hi, @RylanSchaeffer. Which version of transformers do you have installed? The model weights are stored as safetensors, and the safetensors package became a core dependency of the library in v4.30.0. Try updating with:

pip install -U transformers
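
To confirm the upgrade picked up safetensors support, a quick check like this should work (a minimal sketch; the 4.30.0 threshold is the one mentioned above, and the load call is the same as in the original report):

import transformers
from transformers import AutoModelForCausalLM

print(transformers.__version__)  # expect >= 4.30.0

model = AutoModelForCausalLM.from_pretrained(
    "stabilityai/stablelm-base-alpha-3b-v2",
    trust_remote_code=True,
    torch_dtype="auto",
)
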
RylanSchaeffer commented 1 year ago

That seemed to do the trick - thank you!