When running:

```python
from ctransformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("radames/phi-2-quantized", hf=True)
```

I get:

```
RuntimeError: Failed to create LLM 'phi-msft' from '/root/.cache/huggingface/hub/models--radames--phi-2-quantized/blobs/77971e348da4b424832a089f812ba50dec2bd633ae39d26b4f2c89c0ff3dea27'.
```

However, when I use transformers, it runs fine (see Google Colab).