OpenInterpreter / 01

The #1 open-source voice interface for desktop, mobile, and ESP32 chips.
https://01.openinterpreter.com/
GNU Affero General Public License v3.0

`UnboundLocalError` and unable to override `interpreter.llm.model` #116

Open · jcp opened this issue 6 months ago

jcp commented 6 months ago

There are two issues in `software/source/server/i.py`:

1. `UnboundLocalError`

The `os` module is imported both at the top level and inside the `configure_interpreter` function. Because the function body contains its own `import os`, Python treats `os` as a local name throughout the function, so any reference to it before that import raises an `UnboundLocalError`. You can reproduce this by calling `os.getenv` inside `configure_interpreter`:

File "/Users/jcp/Development/01/software/source/server/i.py", line 193, in configure_interpreter
    interpreter.llm.model = os.getenv("MODEL", "gpt-4")
                            ^^
UnboundLocalError: cannot access local variable 'os' where it is not associated with a value
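For context, this is the standard Python scoping pitfall: an `import os` anywhere inside a function makes `os` a local name for the whole function body, so any read before that statement fails. A minimal, self-contained reproduction (hypothetical; not the actual contents of `i.py`):

```python
import os  # module-level import


def configure_interpreter():
    # Raises UnboundLocalError: the `import os` below makes `os` a local name
    # for this entire function, so it cannot be read before that line runs.
    model = os.getenv("MODEL", "gpt-4")
    import os  # function-level import that shadows the module-level one
    return model


configure_interpreter()
```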

2. `interpreter.llm.model` is hardcoded

`interpreter.llm.model` is hardcoded to `"gpt-4"`. From what I can tell, this makes it impossible to fully use 01 locally. When you run `poetry run 01 --local`, you'll get this error:

```
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```

I can submit a PR that fixes the above by letting users pass in the local model via `--model` or an `LLM_MODEL` environment variable.
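A rough sketch of what that override could look like (the `--model` flag and `LLM_MODEL` variable are the proposal above, not anything that exists in the repo yet, and `cli_model` is an illustrative parameter):

```python
import os


def configure_model(interpreter, cli_model=None):
    # Precedence: --model flag, then LLM_MODEL environment variable,
    # then the current "gpt-4" default.
    interpreter.llm.model = cli_model or os.getenv("LLM_MODEL", "gpt-4")
    return interpreter
```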

If there's interest, I can also submit a separate PR to make 01 configurable via environment variables, command-line arguments, and a config.yaml file.
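For that larger change, one possible layering (purely a sketch; the key names, `config.yaml` schema, and PyYAML dependency are all assumptions on my part) would be CLI flags over environment variables over the config file:

```python
import argparse
import os

import yaml  # PyYAML; assumed available for the config.yaml idea


def load_settings(path="config.yaml"):
    file_cfg = {}
    if os.path.exists(path):
        with open(path) as f:
            file_cfg = yaml.safe_load(f) or {}

    parser = argparse.ArgumentParser()
    parser.add_argument("--model", default=None)
    args, _ = parser.parse_known_args()

    # Precedence: command-line flag > environment variable > config.yaml > default.
    model = args.model or os.getenv("LLM_MODEL") or file_cfg.get("model", "gpt-4")
    return {"model": model}
```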

tyfiero commented 6 months ago

This is absolutely an issue we need to fix; we would welcome a pull request for these issues! We also need to add the `--api_base` flag that Open Interpreter has, so we can support LM Studio, Ollama, etc. If that's too tricky, I can help out with that one. Thank you for bringing this up @jcp!
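For reference, roughly what wiring that up could look like (a sketch only: the `LLM_API_BASE` variable is illustrative, the `api_base` attribute is assumed from Open Interpreter's `--api_base` flag, and the URLs are just the usual local defaults for LM Studio and Ollama, not values from this repo):

```python
import os


def configure_backend(interpreter):
    # e.g. http://localhost:1234/v1 for LM Studio, or
    # http://localhost:11434/v1 for Ollama's OpenAI-compatible endpoint.
    api_base = os.getenv("LLM_API_BASE")
    if api_base:
        interpreter.llm.api_base = api_base
    return interpreter
```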

HanClinto commented 6 months ago

> When you run `poetry run 01 --local`, you'll get this error: `openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable`

I just ran into this and came here to open an issue. Very interested in getting this working!

jcp commented 6 months ago

> This is absolutely an issue we need to fix; we would welcome a pull request for these issues! We also need to add the `--api_base` flag that Open Interpreter has, so we can support LM Studio, Ollama, etc. If that's too tricky, I can help out with that one. Thank you for bringing this up @jcp!

Great to hear. I'll submit another PR for the larger fix soon. In the meantime, #119 helps with running locally.