marella / ctransformers

Python bindings for Transformer models implemented in C/C++ using the GGML library.
MIT License

WizardCoder-Python-34b GGUF #116

Open MichaelMartinez opened 1 year ago

MichaelMartinez commented 1 year ago

This may not be a problem with ctransformers per se. If I have made an error posting here, please feel free to close this posthaste, and I apologize for wasting your time. Thank you for this brilliant code, it runs so buttery smooth!

That said, I am attempting to use TheBloke's WizardCoder-Python-34b GGUF with lollms, but I am unable to bind it with c_transformers. I can bind any of the CodeLlama models (GGUF) perfectly well, so it seems to be a config problem; I just can't figure out how to correct it. I have tried changing a few params in `__init__.py` in the lollms integration with c_transformers, but nothing has worked.

```
Bindings zoo found in your personal space.
Pulling last personalities zoo
Already up to date.
Personalities zoo found in your personal space.
Pulling last personalities zoo
Model built
update_settings : New model selected
Configuration model_name updated
Already up to date.
Extensions zoo found in your personal space.
Pulling last personalities zoo
Already up to date.
>Loading binding c_transformers. Please wait ...
Binding c_transformers loaded successfully.
>Loading model wizardcoder-python-34b-v1.0.Q2_K.gguf. Please wait ...
Building model
The model you are using is not supported by this binding
Personality  lollms mounted successfully but no model is selected
Checking discussions database... ok
Your personal data is stored here :D:\CODE\Chat\lollms_config
debug mode:false
Please open your browser and go to http://localhost:9600 to view the ui
 * Serving Flask app 'Lollms-WebUI'
 * Debug mode: off
```
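The "not supported by this binding" message suggests the binding failed to map the GGUF filename to a ctransformers `model_type`, rather than ctransformers itself failing on the file (WizardCoder-Python is Llama-based, so its GGUF builds load as `model_type="llama"`). A minimal sketch of that kind of filename-to-type lookup, assuming a hypothetical `guess_model_type` helper and hint table (this is illustrative, not actual lollms code):

```python
from typing import Optional

# Hypothetical mapping from filename substrings to ctransformers
# model_type strings. "wizardcoder-python" is Llama-family despite
# the name, which a hint table like this could easily miss.
MODEL_TYPE_HINTS = {
    "codellama": "llama",
    "wizardcoder-python": "llama",
    "llama": "llama",
    "falcon": "falcon",
    "mpt": "mpt",
}

def guess_model_type(filename: str) -> Optional[str]:
    """Return a ctransformers model_type for a model filename, or None."""
    name = filename.lower()
    for hint, model_type in MODEL_TYPE_HINTS.items():
        if hint in name:
            return model_type
    return None  # a binding would likely report "not supported" here

print(guess_model_type("wizardcoder-python-34b-v1.0.Q2_K.gguf"))  # llama
```

To isolate the problem from lollms, the file can also be loaded with ctransformers directly, passing the type explicitly, e.g. `AutoModelForCausalLM.from_pretrained(path, model_file="wizardcoder-python-34b-v1.0.Q2_K.gguf", model_type="llama")`; if that works, the issue is in the lollms binding's type detection.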