OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/
GNU Affero General Public License v3.0

Added Groq Support #1238

Open fire17 opened 2 months ago

fire17 commented 2 months ago

Describe the changes you have made:

Groq's official Python API now fits well into the OI flow, with no errors. However, final answers are hallucinated rather than drawn from actual output; it seems to plan and write code, but does not execute it yet.

Reference any relevant issues:

Trying to get groq/mixtral to work (#1237), i.e. Groq is not working with LiteLLM out of the box; `--model groq/mixtral-8x7b-32768` throws errors.

Pre-Submission Checklist (optional but appreciated):

OS Tests (optional but appreciated):

fire17 commented 2 months ago

@KillianLucas please open a `groq` branch so I can make a PR to it instead of main, at least until tests are added and it is fully production-ready.

As it is now, the errors are coming from LiteLLM's side. Groq is too good not to include, and I hope you feel the same. I can see it being the default for a lot of people; it is already the default in all my other pipelines.

Thanks a lot and all the best! 😇

fire17 commented 2 months ago

Current state of PR:

Todos:

Cobular commented 2 months ago

Damn, I came to add this fork; you beat me by an hour! Nice going!

fire17 commented 2 months ago

Haha, thanks @Cobular! As stated, this PR doesn't make code execute yet; it just swaps the completion APIs correctly. So if yours does, you might as well.

One important thing: there is a way to make it work now, just not with `--model` yet.

With techfren's help :heart: the key is to use the API base URL: `--api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768"`

This is working:

```shell
export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768" --context_window 32000
```

This is NOT working:

```shell
export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --model "groq/mixtral-8x7b-32768" --context_window 32000
```

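For anyone wiring Groq into their own tooling, the workaround above boils down to pointing any OpenAI-compatible client at Groq's endpoint. A minimal sketch of the request shape, assuming the endpoint and model name from this thread (the helper function itself is hypothetical, not part of OI or Groq's SDK):

```python
import os

# OpenAI-compatible endpoint from the workaround in this thread
GROQ_API_BASE = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> dict:
    """Assemble the URL, headers, and JSON body for a Groq chat completion.

    Hypothetical helper for illustration; the wire format follows the
    standard OpenAI chat-completions API.
    """
    return {
        "url": f"{GROQ_API_BASE}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request(
    "mixtral-8x7b-32768",
    "Say hello",
    os.environ.get("GROQ_API_KEY", "<your-key-here>"),
)
print(req["url"])  # the same base URL the --api_base flag points OI at
```

This is exactly why the `--api_base` flag works: OI's OpenAI-style client just needs its base URL swapped.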
CyanideByte commented 2 months ago

Potentially already being handled on LiteLLM's side: https://github.com/BerriAI/litellm/pull/3176

KillianLucas commented 1 month ago

NICE. Love Groq, great work on this @fire17. As @CyanideByte mentioned, I think we should push this into LiteLLM (they abstract away the Groq interaction so it's equivalent to an OpenAI client).

And it looks like it works with the latest LiteLLM! `interpreter --model groq/llama3-70b-8192` runs OI with Groq and can execute code, if I also pass in my API key.

In that case, it would be great if we could merge this PR with just the documentation. I'll make that change and then merge if that's okay with you. If there's anything else to include from the PR, let me know and we can reopen.

fire17 commented 1 month ago

> I'll make that change then merge if that's okay with you. If there's anything else to include from the PR let me know, we can reopen.

For sure, @KillianLucas! I've just added the `--api_base` workaround to the docs in case anyone still runs into issues. Welcome to take the doc :)

All the best!

P.S. Check out my new PR #1259