-
**env:**
transformers==4.35.2
ctransformers==0.2.27+cu121
```
from ctransformers import AutoModelForCausalLM, AutoTokenizer
model_name = "/home/me/project/search_engine/text-generation-web…
```
-
Hi there.
Very nice project!
Would it be possible to use an OpenAI-compatible API endpoint with a local LLM through [LM Studio](https://lmstudio.ai/) or [text-generation-webui](https://github.co…
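Both tools expose an OpenAI-compatible server out of the box (LM Studio's local server defaults to port 1234; text-generation-webui's API, started with `--api`, defaults to port 5000), so the standard `openai` client can simply be pointed at a local base URL. A minimal sketch, assuming LM Studio's default port and a placeholder key:

```
# Sketch only: base_url assumes LM Studio's default local server port (1234);
# for text-generation-webui the default would be http://127.0.0.1:5000/v1.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # local server instead of api.openai.com
    api_key="not-needed",                 # local servers accept any placeholder key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; the server serves whichever model is loaded
    messages=[{"role": "user", "content": "Hello from a local LLM!"}],
)
print(response.choices[0].message.content)
```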
-
### Describe the bug
I checked IlyaGusev/saiga_llama3_8b_gguf: in LM Studio I get around 45-49 tokens/s, while in webui I get only 21 tokens/s.
![image](https://github.com/oobabooga/text-generation-we…
-
Hello, I can't get llm-text to work with Ollama. Could I have some explanation of how exactly to configure the setup, for example regarding the API key, etc.? Thank you in advance. I have Ollama running on…
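For context, Ollama itself does not require an API key: it listens on http://localhost:11434, and its OpenAI-compatible `/v1` endpoint accepts any placeholder key. A minimal sketch for checking that the local server responds via its native REST API (the model name is an assumption and must already have been pulled with `ollama pull`):

```
# Sketch only: "llama3" is a placeholder model name that must already be
# available locally (ollama pull llama3); no API key is involved.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Say hello.", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```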
-
### Epic domain
### Problem:
Chat / Playgrounds for LLM communication are very difficult to code because there are **so** many formats to think about (Markdown, HTML, graphs, special codes, etc.)…
-
### App Information
- Name: Ollama
- Short Description: App for running LLMs
- Official Website: https://ollama.com
- GitHub Repository: https://github.com/ollama/ollama
- Docker Image: llama/olla…
-
So I had a conversation with my chatbot on a Solar 7B .awq model, and this extension worked fine.
Then I switched to a .gptq quant of the same model and continued the conversation, and this is what I got:
`Tr…
-
Hello,
I have an issue while running ollama on the A380 GPU.
This log snippet is from the ollama log while executing a prompt from open-webui.
The system runs:
Fedora 39
Kernel 6.10.7
…
-
### Describe the bug
Well, basically a summary of my problems: I am using the most up-to-date version of Ubuntu, where, by the way, I did a completely clean installation just to test the interface an…
-
```
2023-07-15 22:07:19 ERROR:Failed to load the extension "guidance_api".
Traceback (most recent call last):
File "O:\Workspace\AI\LLM\ooba\text_generation_webui_01\text-generation-webui\modules…