fire17 opened 2 months ago
@KillianLucas please open a groq branch
so I can make a PR to it instead of main,
at least until tests are added and it is fully production-ready.
As it is now, the errors are coming from litellm's side. Groq is too good not to include; I hope you feel the same. I can see it being the default for a lot of people, and it is already the default in all my other pipelines.
Thanks a lot and all the best! 😇
groq.mdx docs added

Damn, I came to add this fork, beat me by an hour! Nice going!
Haha, thanks @Cobular. As stated, this PR doesn't make code execute yet, it just swaps the completion APIs correctly, so if yours does, you might as well.
--model doesn't work yet

With techfren's help :heart:
The key is to use the --api_base URL: --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768"
export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768" --context_window 32000
export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --model "groq/mixtral-8x7b-32768" --context_window 32000
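For anyone curious why the --api_base workaround works: Groq exposes an OpenAI-compatible endpoint, so any OpenAI-style client can talk to it. Here is a minimal standard-library sketch of what such a chat-completion request looks like (it builds the request but does not send it; the prompt text and payload values are illustrative only):

```python
import json
import os
import urllib.request

# Groq's OpenAI-compatible base URL, as used with --api_base above.
api_base = "https://api.groq.com/openai/v1"
api_key = os.environ.get("GROQ_API_KEY", "<your-key-here>")

# The payload mirrors the OpenAI chat-completions schema.
payload = {
    "model": "mixtral-8x7b-32768",
    "messages": [{"role": "user", "content": "Hello, Groq!"}],
}

# Build (but do not send) the POST request.
req = urllib.request.Request(
    f"{api_base}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.full_url)  # https://api.groq.com/openai/v1/chat/completions
```

Sending it with `urllib.request.urlopen(req)` would return an OpenAI-style JSON response, which is exactly why pointing an OpenAI client (or OI's --api_base flag) at this URL just works.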
Potentially already being done by LiteLLM: https://github.com/BerriAI/litellm/pull/3176
NICE. Love Groq, great work on this @fire17. As @CyanideByte mentioned, I think we should push this into LiteLLM (they abstract away the Groq interaction so it's equivalent to an OpenAI client).
And it looks like it works with the latest LiteLLM! interpreter --model groq/llama3-70b-8192
runs OI with Groq and can execute code, if I also pass in my api_key.
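For reference, a full invocation along those lines might look like the following (a sketch only: the --context_window value here is my own choice for this model, and your install may use poetry run interpreter instead of a bare interpreter):

```shell
export GROQ_API_KEY='<your-key-here>'
interpreter --model "groq/llama3-70b-8192" --api_key $GROQ_API_KEY --context_window 8192
```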
In that case, it would be great if we could merge this PR with just the documentation, then. I'll make that change then merge, if that's okay with you. If there's anything else to include from the PR, let me know and we can reopen.
For sure! @KillianLucas I've just added the --api_base workaround to the docs in case anyone is still running into issues. You're welcome to take the doc :)
All the best!
P.S. Check out my new PR #1259.
Describe the changes you have made:
Groq's official Python API now fits well into the OI flow, with no errors, though final answers are hallucinated rather than actual output. It seems to plan and write code, but not execute it yet.
Reference any relevant issues:
Trying to get groq/mixtral to work (#1237), aka Groq is not working with litellm out of the box:
--model groq/mixtral-8x7b-32768
throws errors

Pre-Submission Checklist (optional but appreciated):
docs/CONTRIBUTING.md
docs/ROADMAP.md
(not fully yet, but no mention of Groq)

OS Tests (optional but appreciated):