-
## Expected Behavior
The selected model should load after downloading it.
## Current Behavior
It looks like we couldn't load the model.
Here is the error message:
Could not find module 'D:\lollms-webui\ins…
-
### Description
@vince-westmonroe has been playing around with locally hosted LLMs and LangChain. It turns out that LangChain has internal support for the CTransformers package that will download m…
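A minimal sketch of what that integration typically looks like, assuming `langchain` and `ctransformers` are installed (the repo id below is the example model from the ctransformers README, not something this thread specifies):

```python
# Hedged sketch: building the arguments usually passed to LangChain's
# CTransformers wrapper, which downloads GGML weights from the Hugging Face
# Hub on first use and caches them locally.
def ctransformers_kwargs(repo_id: str, max_new_tokens: int = 64) -> dict:
    """Keyword arguments for langchain's CTransformers LLM (illustrative helper)."""
    return {
        "model": repo_id,  # Hub repo id; fetched and cached automatically
        "config": {"max_new_tokens": max_new_tokens, "temperature": 0.7},
    }

kwargs = ctransformers_kwargs("marella/gpt-2-ggml")

# Actual usage (requires the packages; commented out here):
# from langchain.llms import CTransformers
# llm = CTransformers(**kwargs)
# print(llm("AI is going to"))
print(kwargs["model"])  # marella/gpt-2-ggml
```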
-
Hey, any chance the team can work to provide ctransformers / GGML support? Also, key description options would be clutch. Thanks!
-
@hippalectryon-0 introduced HF text embeddings with #45.
Could you, if it fits you well, elaborate on how [this](https://github.com/marella/ctransformers#langchain) performs?
Edit: missing embeddi…
-
I am trying to run this project on a MacBook Pro M1 and am getting the following stack trace.
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/chainlit/utils…
```
-
Does ctransformers support ollama models?
How do I specify the model in this code below?
```
llm = CTransformers(model="***where is the model file for a ollama model?",
    model_…
```
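To the best of my knowledge, CTransformers has no notion of ollama model names: its `model` argument is either a path to local GGML/GGUF weights or a Hugging Face Hub repo id. (Ollama's downloaded blobs are themselves GGUF files, so pointing at the blob file directly may work, but that is an assumption, not documented behavior.) A small illustrative helper, not library code:

```python
from pathlib import Path

def classify_model_arg(model: str) -> str:
    """Guess how CTransformers will treat the `model` string (sketch, not library code)."""
    p = Path(model)
    if p.suffix in {".gguf", ".ggml", ".bin"} or p.is_file():
        return "local weights file"   # handed straight to the backend
    return "hugging face repo id"     # downloaded from the Hub

# Instead of an ollama model name, point CTransformers at a weights file
# (hypothetical path):
# llm = CTransformers(model="/path/to/model.gguf", model_type="llama")
print(classify_model_arg("llama-2-7b.Q4_K_M.gguf"))  # local weights file
print(classify_model_arg("marella/gpt-2-ggml"))      # hugging face repo id
```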
-
### Issue you'd like to raise.
I have installed langchain and ctransformers using -
```
pip install langchain
pip install ctransformers[cuda]
```
I am trying the following piece of code -
```
…
```
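With `ctransformers[cuda]` installed, GPU offloading is controlled through the config, in particular `gpu_layers`. A hedged sketch of the config commonly passed (the model repo id in the comment is an example, not from this issue):

```python
# Illustrative helper: a config dict for offloading layers to CUDA with
# ctransformers[cuda]. The key names (gpu_layers, context_length,
# max_new_tokens) follow the ctransformers README; values are examples.
def cuda_config(gpu_layers: int = 50, context_length: int = 2048) -> dict:
    """Config for running a GGML model with some layers on the GPU (sketch)."""
    return {
        "gpu_layers": gpu_layers,        # transformer layers offloaded to CUDA
        "context_length": context_length,
        "max_new_tokens": 256,
    }

# Typical use with langchain (requires the packages from the pip installs above):
# from langchain.llms import CTransformers
# llm = CTransformers(model="TheBloke/Llama-2-7B-GGML", model_type="llama",
#                     config=cuda_config())
print(cuda_config()["gpu_layers"])  # 50
```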
-
Is there any chance of a future integration with CTransformers or something similar to allow for guided generation using quantized models on CPU? If I were to try and hack away at this, what would be …
-
Tracker to add support for ctransformers https://github.com/marella/ctransformers
https://github.com/jllllll/ctransformers-cuBLAS-wheels
-
After following the guide up to the last step (https://postgresml.org/docs/guides/unified-rag#unified-retrieval-+-reranking-+-text-generation),
I got this error after running the query (via Docker).
I think it crashed …