Open morris90901 opened 4 months ago
It's possible, and it's something I should officially support at some point. For now you can hack it together by making the following changes:
1. Set `GROQ_BASE_URL` to your custom OpenAI-compatible server, e.g. `export GROQ_BASE_URL=http://127.0.0.1:6748/v1`
2. Modify `server.py` to return a list of valid models for Groq. The model names should all start with `groq/`, e.g. `groq/llama3`
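For reference, a minimal sketch of what that `server.py` change could look like. This is an assumption, not the project's actual code: the function name `list_groq_models` and the exact response shape are hypothetical, modeled on the OpenAI `/v1/models` format.

```python
# Hypothetical sketch: return a hard-coded model list for the Groq route.
# First point the client at your custom OpenAI-compatible server:
#   export GROQ_BASE_URL=http://127.0.0.1:6748/v1

def list_groq_models() -> list[dict]:
    """Return model entries in the OpenAI /v1/models shape.

    Every id must start with "groq/" so requests get routed to the
    Groq-compatible backend.
    """
    model_ids = ["groq/llama3"]  # add whatever models your server exposes
    return [{"id": m, "object": "model", "owned_by": "custom"}
            for m in model_ids]
```

Adjust the list to match the models your OpenAI-compatible server actually serves; the `groq/` prefix is what matters for routing.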
Let me know if that works for you. Eventually we could make this more pluggable / a native feature.
Is it possible to add a different model, maybe something we can change in the code, if it's not directly supported? If we are using OpenRouter, which has an OpenAI-compatible API, is there any way we can use other models that are available on OpenRouter?