-
# Bug Report
## Description
Configured a proxied LiteLLM model in open-webui. I cannot get a completion response from the model; the UI just shows a blinking "awaiting response" indicator.
Note I do g…
-
358-540004ef67bc4adc.js:1 Uncaught (in promise) Error: Column 'groq_api_key' of relation 'profiles' does not exist
-
Hello Devika Team and Community,
I've been exploring the capabilities of Devika to automate the development of a complex application. The project is structured into multiple steps, from selecting a…
-
Selecting the openrouter endpoint (i.e. `https://openrouter.ai/api/v1/`) then clicking "models" in the quick settings to list the available models crashes SpeakGPT because `Caused by: w9.b: Fields [cr…
-
### Validations
- [X] I believe this is a way to improve. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [X] I'm not able to find an [open issue](https://githu…
-
For example, I want to use the native API with a translation app, but the app's model selection only offers gpt3.5 or gpt4. What should I do? :)
-
I tried to use a customized LLM with strict_json. However, I found that the current version does not support AsyncOpenAI, and this module is important for massive generation. I recommend the develope…
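For context on why async support matters for massive generation: with an async client, many requests can be fanned out concurrently via `asyncio.gather` instead of being awaited one at a time. A minimal sketch of that pattern (the `generate` coroutine below is a stub standing in for a real AsyncOpenAI call; strict_json's internals are not shown here):

```python
import asyncio

async def generate(prompt: str) -> str:
    # Stub standing in for an AsyncOpenAI chat-completion call.
    await asyncio.sleep(0.01)  # simulate network latency
    return f"response to: {prompt}"

async def generate_many(prompts: list[str]) -> list[str]:
    # Fan out all requests concurrently; total wall time is roughly
    # one request's latency rather than the sum of all of them.
    return await asyncio.gather(*(generate(p) for p in prompts))

results = asyncio.run(generate_many([f"prompt {i}" for i in range(5)]))
print(results[0])
```

With a synchronous client the same five calls would run back to back, which is why an AsyncOpenAI-style client is the usual prerequisite for high-volume generation.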
-
When the seed is fixed, it can still generate new content. I don't know whether I used it incorrectly or whether it was designed to behave this way.
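For reference, here is what fixed seeding is normally expected to guarantee, illustrated with Python's standard `random` module (a general sketch only; whether the library in question promises bit-for-bit determinism, or treats the seed as best-effort as some LLM APIs do, is not confirmed here):

```python
import random

def sample(seed: int) -> list[int]:
    # Use a private RNG seeded explicitly, so repeated calls are
    # independent of any global random state.
    rng = random.Random(seed)
    return [rng.randint(0, 9) for _ in range(5)]

# Identical seeds are expected to yield identical sequences.
assert sample(42) == sample(42)
print(sample(42))
```

If a library still produces different outputs under a fixed seed, the variation usually comes from nondeterminism outside the RNG (e.g. sampling on the server side), not from the seed itself.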
-
Here, the key being used is `GROQ_API_KEY`, but the error message still says `OpenAI`:
```
% unset OPENAI_API_KEY
% echo 'what is 1+1?' | mods -m mixtral --no-cache
The sum of 1 + 1 is 2.
…
```
-
https://console.groq.com/playground
This is fast and inexpensive; please add support for it.