Open zinwelzl opened 3 months ago
Like I said, tested 2.0.6 and 2.1.0 this weekend, and both work great. No problem!
Glad to hear that :wink:
Now, about the LLM: Ollama is very slow on a VPS without a GPU.
Could you add support for the Groq API? https://console.groq.com/docs/models https://console.groq.com/docs/api-reference#chat It is free for now; it only requires registration.
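For reference, Groq exposes an OpenAI-compatible chat completions endpoint, so integrating it should mostly be a matter of pointing an HTTP client at it. A minimal sketch (the model name and the `GROQ_API_KEY` environment variable are illustrative assumptions, not part of reNgine-ng):

```python
import json
import os
import urllib.request

# Groq's OpenAI-compatible chat completions endpoint (see the API
# reference linked above).
GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"


def build_chat_request(prompt: str, model: str = "llama-3.1-8b-instant"):
    """Build the HTTP request for a single-turn chat completion.

    The model name is just an example; pick one from
    https://console.groq.com/docs/models.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        # Hypothetical env var holding the key from console.groq.com
        "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        GROQ_CHAT_URL, data=json.dumps(payload).encode(), headers=headers
    )


def chat(prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Since the request/response shape matches OpenAI's chat API, an existing OpenAI client in the codebase could likely be reused by swapping the base URL and API key.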
At the moment our goal is to stabilize reNgine-ng, but we have added your request to the backlog. We will get back to you later.
If you think you are able to implement this and submit a PR, go for it!
Expected feature
Hi.
Like I said, tested 2.0.6 and 2.1.0 this weekend, and both work great. No problem!
Now, about the LLM: Ollama is very slow on a VPS without a GPU.
Could you add support for the Groq API? https://console.groq.com/docs/models https://console.groq.com/docs/api-reference#chat It is free for now; it only requires registration.
Thanks for the great project!
Alternative solutions
No response
Anything else?
No response
Acknowledgements