-
Currently we support both OpenAI and Groq, and for both we use the `OpenAIChatClient` class in `common/ai_model.py`. That name is no longer accurate, since the client is not OpenAI-specific.
Rename this to something m…
-
Where and how do we set an API key and a custom base URL to make it compatible with the OpenAI API?
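One common pattern (a sketch, not this project's actual code) is to point any OpenAI-compatible client at a custom base URL and send the key in the `Authorization` header. The helper below uses only the standard library; the function name and the Groq URL constant are illustrative assumptions.

```python
import json
import urllib.request

# Groq exposes an OpenAI-compatible API under this base URL
GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(base_url, api_key, model, messages):
    """Build a POST request against any OpenAI-compatible /chat/completions endpoint.

    Hypothetical helper for illustration; real code would more likely use the
    `openai` SDK, which accepts `base_url` and `api_key` parameters directly.
    """
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # API key goes in the bearer header
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

With the `openai` Python SDK the same idea is `OpenAI(api_key=..., base_url=...)`; switching providers only changes the base URL and the key.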
-
### Brief Description
Add support for a Groq chat agent
### Rationale
1. Faster streaming responses
### Suggested Implementation
**vocode/streaming/agent/groq_agent.py**
```
import logg…
-
**NOTE: ~~It~~ Mixtral can at times be... Fragile. Let's call it that. Keep the temperature *LOW*. You can indeed drive it nuts, at least with the system prompt I was using.**
I intend to make a fo…
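One lightweight way to honor the warning above is to cap the sampling temperature whenever a Mixtral model is selected. This is a sketch; the 0.3 cap and the substring match are assumptions, not project code or a documented limit.

```python
def clamp_temperature(requested: float, model: str) -> float:
    """Cap the sampling temperature for models that degrade at high values.

    The 0.3 cap for Mixtral is an illustrative guess, not a documented limit.
    """
    cap = 0.3 if "mixtral" in model.lower() else 1.0
    return min(max(requested, 0.0), cap)
```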
-
AI Thought Bubble - Next Action:
Thought: I need to read more content from other relevant websites and articles to gather more information about the topic.
Action: Read website content
Action…
-
As the title says: the model list is currently hard-coded, which feels inflexible. For example, an OpenAI account without Plus cannot actually use the GPT-4 models, but because of the hard-coding, those models are always listed as long as an API key is configured.
-
[Groq](https://groq.com/) has tremendous inference speeds (280 tokens per second for Llama 3 70B and 877 tokens per second for Llama 3 8B). It would be amazing to get support for this in Jupyter AI.
-
### Brief Description
Add support for using Groq as an LLM
### Rationale
Groq has an [OpenAI-compatible API](https://console.groq.com/docs/text-chat) that allows you to call Mixtral LLM generation …
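Because the endpoint is OpenAI-compatible, streamed responses arrive as standard server-sent events whose chunks follow the OpenAI chat-completions format. A minimal parser might look like the sketch below (illustrative only; the field names assume the OpenAI chunk schema):

```python
import json

def parse_sse_chunks(raw: str):
    """Yield the text deltas from an OpenAI-style server-sent-event stream."""
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # the stream's end-of-output sentinel
            return
        yield json.loads(payload)["choices"][0]["delta"].get("content", "")
```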
-
Request: define OPEN_AI_API_KEY / OPEN_AI_API_HOST / GEMINI_API_KEY / GROQ_API_KEY and similar settings as environment variables, to make configuration under Docker easier.
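A minimal sketch of reading those variables with fallbacks; the variable names come from the request above, while the function name and default host are assumptions for illustration:

```python
import os

def load_llm_env(env=None):
    """Collect provider credentials from the environment (e.g. set via docker compose)."""
    env = os.environ if env is None else env
    return {
        "openai_api_key": env.get("OPEN_AI_API_KEY", ""),
        # default host is an assumption, overridable for OpenAI-compatible providers
        "openai_api_host": env.get("OPEN_AI_API_HOST", "https://api.openai.com"),
        "gemini_api_key": env.get("GEMINI_API_KEY", ""),
        "groq_api_key": env.get("GROQ_API_KEY", ""),
    }
```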
-
- [ ] [phidata/cookbook/groq/README.md at main · phidatahq/phidata](https://github.com/phidatahq/phidata/blob/main/cookbook/groq/README.md)