marella / ctransformers

Python bindings for the Transformer models implemented in C/C++ using GGML library.
MIT License

How to load a finetuned model in safetensors format (not in gguf) #201

Open pradeepdev-1995 opened 4 months ago


I have finetuned the Mistral base model on my data using LoRA (PEFT).

Base model tried: mistralai/Mistral-7B-Instruct-v0.2

[Screenshot: folder structure of the finetuned merged model]

All of the model files are in safetensors format, not GGUF. I then tried to load the finetuned model with ctransformers:

import ctransformers
from ctransformers import AutoModelForCausalLM
llm = AutoModelForCausalLM.from_pretrained("Directory Path",
                                           model_file="Model folder name",
                                           model_type="mistral",
                                           gpu_layers=1)

But it raises the following error:

ValueError: Model file '<Model folder name>' not found in '<Directory Path>'
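For context, ctransformers loads GGML/GGUF weight files, and its `model_file` argument is expected to name an actual weight file inside `model_path`, not a folder of safetensors shards. The sketch below is a hypothetical reconstruction (not the library's actual source) of the lookup that produces this `ValueError`, using a temporary directory that mimics a safetensors checkpoint:

```python
import os
import tempfile

def find_model_file(model_path, model_file):
    """Hypothetical sketch of the lookup: the file named by `model_file`
    must exist inside `model_path`, or a ValueError is raised."""
    candidate = os.path.join(model_path, model_file)
    if not os.path.isfile(candidate):
        raise ValueError(
            f"Model file '{model_file}' not found in '{model_path}'"
        )
    return candidate

# Demo: a directory containing only safetensors shards, as in the
# screenshot above, with a folder name passed as `model_file`.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "model-00001-of-00002.safetensors"), "w").close()
    try:
        find_model_file(d, "my-merged-model")  # not a weight file
    except ValueError as e:
        print("reproduced:", e)
```

This is why pointing `model_file` at the safetensors folder fails: no GGML/GGUF file with that name exists in the directory, so the loader never reaches the weights at all.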