marella / ctransformers

Python bindings for Transformer models implemented in C/C++ using the GGML library.

Add Support for Google/Gemma-2b-it #207

Closed: Arya920 closed this issue 3 months ago

Arya920 commented 3 months ago

I am facing the same error. I am using a GGUF version of a fine-tuned Gemma-2B-it model with the following library:

from langchain_community.llms import CTransformers

Model link: https://huggingface.co/Shritama/GEMMA-2b-GGUF/tree/main

Now, while running inference, it shows something like this:

RuntimeError: Failed to create LLM 'gguf' from 'D:\ISnartech Folder\Project_Folder\Streamlit APP\GgufModels\Q4_K_M.gguf'.

Please help.
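This is roughly how the model is being loaded, as a minimal sketch: only the file path comes from the error message above; the model_type and config values are illustrative assumptions, not confirmed settings from the original report.

from langchain_community.llms import CTransformers

# Minimal sketch of loading a local GGUF file through the LangChain
# CTransformers wrapper. The path matches the error message; the
# model_type and config values below are assumptions for illustration.
llm = CTransformers(
    model=r"D:\ISnartech Folder\Project_Folder\Streamlit APP\GgufModels\Q4_K_M.gguf",
    model_type="llama",  # assumption: ctransformers has no dedicated 'gemma' model type
    config={"max_new_tokens": 256, "temperature": 0.7},  # illustrative generation settings
)

print(llm.invoke("Hello, who are you?"))

The RuntimeError is raised at the point where CTransformers tries to construct the model from the GGUF file, i.e. at the llm = CTransformers(...) call in a sketch like the one above.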