OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/
GNU Affero General Public License v3.0

Adding Groq Support #1237

Open · fire17 opened this issue 2 months ago

fire17 commented 2 months ago

Is your feature request related to a problem? Please describe.

Hi there :) I have not seen any other issue or PR on this.

I prefer not to use the OpenAI API; I don't trust them with my data.

I think if we can get Mixtral or Llama 3 70B (or a future 400B) to work with Open Interpreter, it will be a much-needed speed improvement and cost reduction.

Describe the solution you'd like

I would like to be able to use Groq's API instead of OpenAI's.

They offer best-in-the-world inference speeds, and they are not yet affiliated with any large corporation (that I am aware of).

Plus, their API is free, at least for now.

Describe alternatives you've considered

Honestly, since it is not mentioned anywhere (not in the README, docs, issues, or PRs), I thought this wasn't supported. But reading the LiteLLM docs, I can see that they do already support Groq.

Additional context

I have already played with the official Groq Python API quite a bit, using it in other advanced pipelines, and it seems it will fit well with OI. But when using --model groq/mixtral-8x7b-32768 I am getting errors:

(screenshot: Screen Shot 2024-04-26 at 5 40 35)

I have created a new branch, groq, to try to fix this, bypassing LiteLLM and using the official API directly. I've managed to hook it well into the existing OI flow with no errors :) but no code is actually getting executed...

(screenshot: Screen Shot 2024-04-26 at 5 31 21)

It's doing everything right, but it seems to hallucinate the result instead of actually running the code it wrote.

I understand that it has to output 'execute' and 'code', but I'm not sure in which format. Can anyone tell me what format OI is expecting? (I currently don't have an OpenAI API key to test against.) An example of a working output JSON would be great. It seems it's just a matter of making Mixtral and other models produce OI's expected output.
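For reference, here is a rough sketch of what an OpenAI-style function-calling response looks like, which is the shape OI consumes when the model supports function calling. The function name ("execute") and the argument keys ("language", "code") match what the issue mentions, but treat the exact schema as an assumption to verify against the OI source:

```python
import json

# Hedged sketch of an OpenAI-style function-call message. The "execute"
# function name and the "language"/"code" argument keys are assumptions
# based on this issue's description, not a confirmed OI schema.
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "execute",
        # OpenAI-compatible APIs serialize function arguments as a JSON string
        "arguments": json.dumps({
            "language": "python",
            "code": "print(2 + 2)",
        }),
    },
}

# A consumer would parse the arguments string back into a dict before running:
args = json.loads(assistant_message["function_call"]["arguments"])
print(args["language"])  # -> python
print(args["code"])      # -> print(2 + 2)
```

For models without function calling, OI also has a fallback mode that parses fenced markdown code blocks out of the plain-text response, so getting Mixtral to emit clean ```python fences may be enough.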

(I'll do more research; maybe someone has solved this in another issue.)

Really hoping to get this to work. Thanks a lot and all the best!

fire17 commented 2 months ago

With @aj47's help ❤️ the key is to set the API base URL: --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY --model "mixtral-8x7b-32768"

This is working:

export GROQ_API_KEY='<your-key-here>'
poetry run interpreter --api_base "https://api.groq.com/openai/v1" --api_key $GROQ_API_KEY  --model "mixtral-8x7b-32768" --context_window 32000
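The same settings can also be applied from Python instead of the CLI. This is a sketch based on the `interpreter.llm` settings exposed by recent open-interpreter versions; verify the attribute names against your installed version, and note it needs a real GROQ_API_KEY to actually run:

```python
import os
from interpreter import interpreter

# Point Open Interpreter at Groq's OpenAI-compatible endpoint,
# mirroring the working CLI flags above.
interpreter.llm.api_base = "https://api.groq.com/openai/v1"
interpreter.llm.api_key = os.environ["GROQ_API_KEY"]
interpreter.llm.model = "mixtral-8x7b-32768"
interpreter.llm.context_window = 32000

interpreter.chat("List the files in the current directory")
```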

This is NOT working:

export GROQ_API_KEY='<your-key-here>'
poetry run interpreter  --model "groq/mixtral-8x7b-32768" --context_window 32000