OpenInterpreter / 01

The #1 open-source voice interface for desktop, mobile, and ESP32 chips.
https://01.openinterpreter.com/
GNU Affero General Public License v3.0

Litellm/01 is unable to connect to non-OpenAI providers. #272

Open mitstastic opened 4 months ago

mitstastic commented 4 months ago

What causes the issue: Run 01 specifying any non-OpenAI server-host and API key

Expected: Be able to connect to other services like Groq, Anthropic, OpenRouter, etc., as they seem to work with the base Open Interpreter

Screenshots:

Screenshot 2024-05-13 at 3 58 25 AM


Feedback: After many attempts using different settings, it seems either 01 is not passing the right arguments to LiteLLM, or LiteLLM isn't yet correctly configured for other providers in 01.

mitstastic commented 4 months ago

Update: The error above was probably caused by erroneously passing the URL with --server-host instead of --server-url. However, the connection still doesn't open with the latter. See the screenshot below.

Screenshot 2024-05-13 at 6 09 33 PM

rwmjhb commented 4 months ago

I want to ask the same question. It turns out the command line is written like this. Do I need to install and start the litellm service first to get the local connection endpoint?

Merlinvt commented 4 months ago

Since OpenInterpreter uses LiteLLM, I think you need to specify this differently. Here is what I think would work: `poetry run 01 --model "groq/gemma-7b-it" --tts-service piper --stt-service local-whisper`.

LiteLLM already pulls all the necessary settings automatically if you specify the provider in the model name. Or at least it should.

Here are some instructions on how to get it to work with open router: https://discordapp.com/channels/1146610656779440188/1194880263122075688/1240334434352365569

mitstastic commented 4 months ago

Well, as you know, in the Discord community some people suggested that 01 automatically appends "openai/" before the model names specified in the arguments, so you might end up with, for instance, "openai/groq/gemma-7b-it". Is that what's causing the issue?

> Litellm already pulls all the data automatically if you specify the provider in the model. Or at least it should do that.

If it does, why the need to specify all the details when people use Open Interpreter directly? In my experience, when I leave out the server arguments it defaults to OpenAI and complains that no OpenAI key is set. So I think something in 01's LiteLLM integration is probably interfering, or it is not yet fully configured to support other providers, as it has only been confirmed working with GPT.
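The double-prefix problem described above can be sketched in a few lines. This is not 01's actual code, just an illustration of the LiteLLM routing convention: the provider is taken from everything before the first "/", so an auto-prepended "openai/" swallows the intended "groq/" prefix into the model name.

```python
# Minimal sketch (not 01's or LiteLLM's real internals) of prefix-based
# provider routing: split on the FIRST "/" to get (provider, model_name).

def route(model: str) -> tuple[str, str]:
    """Return (provider, model_name) using the prefix convention."""
    if "/" in model:
        provider, _, name = model.partition("/")
        return provider, name
    # Bare model names default to OpenAI.
    return "openai", model

print(route("groq/gemma-7b-it"))         # ('groq', 'gemma-7b-it')
print(route("openai/groq/gemma-7b-it"))  # ('openai', 'groq/gemma-7b-it')
```

With the "openai/" prefix prepended, the request is routed to OpenAI with a model name it does not recognize, which would explain the failures reported above.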

rwmjhb commented 4 months ago

Does the project team care about this? No developer has responded to these questions for so many days.

Merlinvt commented 4 months ago

If you want to get the 01 to work with open router ( and others? ), you can try this:

Screenshot_from_2024-05-15_18-05-36.png

Screenshot_from_2024-05-15_18-05-49.png

It's still super unintuitive and I think it should be made more intuitive. But you can make it work.

The OpenAI key is for Whisper and TTS. If you use a local model you can leave it out.

I also forgot: run "poetry install" before "poetry run".

A different model name would be, for example, "openrouter/meta-llama/llama-3-70b".

rwmjhb commented 4 months ago

What if I have neither an OpenAI key nor a local model? How can I use Whisper and TTS? Can I use only an OpenRouter API key for everything?

Merlinvt commented 4 months ago

OpenRouter does not offer Whisper. There is a rewrite on the way that implements more options for TTS and STT: https://github.com/KillianLucas/01-rewrite I don't think they will implement more options in this repo, so without OpenAI or local models you might need to wait until the rewrite is done. But I could be wrong. You can use Open Interpreter until then.

aj47 commented 4 months ago

This is how I ran 01 with Groq and local TTS/STT: change i.py as per the following diff, then run with `poetry run 01 --stt-service local-whisper --tts-service piper`.

diff --git a/software/source/server/i.py b/software/source/server/i.py
index bc792fd..f7a7454 100644
--- a/software/source/server/i.py
+++ b/software/source/server/i.py
@@ -185,10 +185,14 @@ def configure_interpreter(interpreter: OpenInterpreter):
     ### SYSTEM MESSAGE
     interpreter.system_message = system_message

-    interpreter.llm.supports_vision = True
+    interpreter.llm.supports_vision = False
     interpreter.shrink_images = True  # Faster but less accurate

-    interpreter.llm.model = "gpt-4"
+    # RUN WITH THIS COMMAND FOR LOCAL TTS AND STT 
+    # `poetry run 01 --stt-service local-whisper --tts-service piper`
+    interpreter.llm.model = "llama3-70b-8192"
+    interpreter.llm.api_base = "https://api.groq.com/openai/v1/"
+    interpreter.llm.api_key = "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" # YOUR API HERE

     interpreter.llm.supports_functions = False
     interpreter.llm.context_window = 110000
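The diff above hardcodes the API key in i.py. A minimal alternative sketch reads it from the environment instead; the `interpreter.llm` attribute names are the ones used in the diff, while the env var names (`GROQ_API_KEY`, `GROQ_API_BASE`) and the `configure_groq` helper are illustrative, not part of the repo.

```python
import os
from types import SimpleNamespace

def configure_groq(interpreter):
    """Point the interpreter at Groq, pulling the key from the environment."""
    interpreter.llm.model = "llama3-70b-8192"
    interpreter.llm.api_base = os.environ.get(
        "GROQ_API_BASE", "https://api.groq.com/openai/v1/"
    )
    interpreter.llm.api_key = os.environ["GROQ_API_KEY"]

# Demo with a stand-in object; the real call would receive the
# OpenInterpreter instance inside configure_interpreter().
os.environ.setdefault("GROQ_API_KEY", "gsk_placeholder")
fake = SimpleNamespace(llm=SimpleNamespace())
configure_groq(fake)
print(fake.llm.model)  # llama3-70b-8192
```

This avoids committing a secret to the repo, which is why the key in the diff above had to be revoked.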

achoozachooz commented 4 months ago

Can you tell us which lines we need to change?

Merlinvt commented 4 months ago

@aj47 just making sure, that the api key is fake or revoked ;)

aj47 commented 4 months ago

> @aj47 just making sure, that the api key is fake or revoked ;)

yep all g, revoked before posting

guoper59 commented 3 months ago

> this is how i ran 01 with groq and local tts/stt, changing i.py as per the diff in the comment above and also running with `poetry run 01 --stt-service local-whisper --tts-service piper`

Thank you, sir, it works for me. Really, really appreciated.