Closed HakaishinShwet closed 3 months ago
@HakaishinShwet Groq implements an OpenAI-compatible API, which means that if you have an API key you can already use it with Continue by following the instructions here: https://continue.dev/docs/reference/Model%20Providers/openai#openai-compatible-servers--apis
As for using Groq by default, we'd love to do this, but will have to wait until we are given access : )
Let me know if you have any questions!
@sestinj Thanks for the guidance. I was just curious about the "given access" part; if possible, could you explain a little bit about it? :-))
@HakaishinShwet We now have access, but Groq has very low rate limits, so we won't be able to serve it through our free trial. If you can obtain your own API key though, you will be able to use it with Continue by adding something like the following to your config.json:
{
  "models": [
    {
      "title": "Groq",
      "provider": "openai",
      "model": "mixtral-8x7b-32768",
      "apiKey": "EMPTY",
      "apiBase": "https://api.groq.com/openai/v1"
    }
  ]
}
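For context, "OpenAI-compatible" means the provider accepts the same request shape as OpenAI's chat completions endpoint, just at a different base URL. A minimal sketch of what such a request could look like, using only the standard library (the API key is a placeholder, and the request is assembled but not actually sent):

```python
import json
import urllib.request

# Groq's OpenAI-compatible base URL, matching the apiBase above.
API_BASE = "https://api.groq.com/openai/v1"

# Same payload shape an OpenAI chat completions call would use.
payload = {
    "model": "mixtral-8x7b-32768",
    "messages": [{"role": "user", "content": "Say hello"}],
}

req = urllib.request.Request(
    f"{API_BASE}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer YOUR_GROQ_API_KEY",  # placeholder key
        "Content-Type": "application/json",
    },
)

# urllib.request.urlopen(req) would send it; omitted here to avoid a live call.
print(req.full_url)
```

This is why the `openai` provider works out of the box: only `apiBase` and `apiKey` change, not the protocol.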
Problem
Right now, no one provides inference as fast as Groq. They serve the best open-source models, which are more capable than GPT-3.5 and in some cases beat GPT-4, and their pricing per token is very competitive, so many people will want to use it. They also offer an API, so it could be supported here the same way OpenAI's GPT models are.
Solution
Just implement support for the Groq API; they provide an easy-to-use API. Hope to see it soon in this awesome FOSS project :-)) Thank you very much!