-
Changing the example [MiddleSchoolMathAgent.ts](https://github.com/lgrammel/modelfusion/blob/main/examples/middle-school-math-agent/src/MiddleSchoolMathAgent.ts) from openai to ollama results in the f…
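Swapping OpenAI for Ollama in an example like this ultimately relies on Ollama's OpenAI-compatible HTTP API rather than any library-specific code. A minimal sketch of the underlying request, assuming a local Ollama on its default port 11434 with a `llama3.1` model pulled (both assumptions):

```shell
# Build an OpenAI-style chat payload; Ollama serves a compatible endpoint
# under /v1 on its default port 11434.
cat > request.json <<'EOF'
{
  "model": "llama3.1",
  "messages": [{ "role": "user", "content": "What is 7 * 8?" }]
}
EOF
# With a local Ollama daemon running, the same payload works against:
#   curl http://localhost:11434/v1/chat/completions \
#     -H "Content-Type: application/json" -d @request.json
```

If the agent library allows overriding the OpenAI base URL, pointing it at `http://localhost:11434/v1` is usually the smallest possible change.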
-
### System Info / 系統信息
Xinference is currently updated to 0.13.3, transformers is 4.42.1, and the GPUs are a 4090 and a 3090. When using the transformers engine:
```
xinference launch --model-engine transformers -u glm4-chat -n glm4-chat -s 9 -f pytorch --max_m…
```
-
> Found out that the 'OPENAI_API_TYPE' value 'llama2' does not work. I also noticed that `llm = get_llm()` is assigned 3 times but never used in the code. An llm is used via `chat = get_cha…
-
Congratulations on your achievements, @austin-starks ! I see a huge potential for this project!
I was wondering if you could implement support for Groq and fast open-source models such as Llama 3.1…
-
This is a little more complicated as it will require creating an Ollama Modelfile / manifest in addition to linking the models.
- lm-studio (mostly) parses the filename and the GGML/GGUF metadata t…
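The extra Ollama step mentioned above can be sketched as follows. This is a hedged example, not the project's actual implementation: the GGUF filename and model name below are hypothetical.

```shell
# Ollama registers local GGUF weights via a Modelfile that points at the file.
cat > Modelfile <<'EOF'
FROM ./llama-3.1-8b-instruct.Q4_K_M.gguf
PARAMETER temperature 0.7
EOF
# With the ollama daemon running, register and run the model:
#   ollama create my-llama -f Modelfile
#   ollama run my-llama
```

lm-studio can infer most of this from the filename and GGUF metadata, which is why the Ollama side needs the additional manifest-generation step.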
-
### What is the issue?
I use Proxmox VE for virtualization. If I install Ollama in a Linux VM, it works fine. If I install Ollama in an LXC container (host kernel 6.8.4-3), it does not work with the CPU.
olla…
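A hedged diagnostic sketch for the LXC case: Ollama's CPU backend has historically probed for AVX support, and an unprivileged container can also hide CPU features or devices from the binary. Checking what the container actually sees is a reasonable first step (the `journalctl` unit name assumes the standard systemd install):

```shell
# Show which AVX variants, if any, are visible inside the container.
grep -m1 -o 'avx[a-z0-9_]*' /proc/cpuinfo || echo "no AVX visible"
# The service log usually states why CPU inference was rejected:
#   journalctl -u ollama --no-pager | tail -n 50
```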
-
I am trying to change the environment to Ollama in Danswer. However, I am encountering an issue where it still requests an OpenAI API key. This issue persists despite multiple attempts to resolve it.
…
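When debugging this kind of problem, it helps to separate the two halves: first confirm Ollama itself is reachable, then check the app's provider configuration. The sketch below is heavily hedged: the `/api/tags` route is part of Ollama's standard REST API, but the environment variable names are assumptions about Danswer's config and may not match your version — check the project's own `.env` template.

```shell
# 1. Confirm the local Ollama daemon answers (lists pulled models):
#      curl http://localhost:11434/api/tags
# 2. Point the app at Ollama instead of OpenAI. Variable names below are
#    HYPOTHETICAL placeholders, not verified Danswer settings:
cat > ollama.env <<'EOF'
GEN_AI_MODEL_PROVIDER=ollama
GEN_AI_API_ENDPOINT=http://localhost:11434
EOF
```

If the app still demands an OpenAI key after the provider is switched, a stale container or cached config file is a common culprit.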
-
The error occurs when I use the new llama3.1, and it never occurred when I used other models.
![download](https://github.com/user-attachments/assets/5bd93a1b-c70e-42d4-a5de-c4e3f752ceae)
-
Hi,
this plugin is not working for me and I don't quite understand why (maybe because I'm on Windows?).
I don't get a response from ollama when using this plugin, so the response modal is always…
-
![WeChat screenshot 20240626152830](https://github.com/ollama/ollama/assets/129468439/23e316b8-cb87-4783-81af-96b94690a61a)