Open · jcp opened this issue 6 months ago
I just ran into this and came here to open an issue. Very interested in getting this working!
This absolutely is an issue we need to fix; we would welcome a pull request for it! We also need to add the `--api_base` flag that open interpreter has, so that we can support LMStudio/Ollama etc. If that's too tricky, I can help out with that one. Thank you for bringing this up @jcp!
Great to hear. I'll submit another PR for the larger fix soon. In the meantime, #119 helps with running locally.
There are two issues within `software/source/server/i.py`:

**1. `UnboundLocalError`**

The `os` module is imported both at the top level and within the `configure_interpreter` function, which results in an `UnboundLocalError`. You can reproduce this by using `os.getenv` within the `configure_interpreter` function.
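The scoping behavior behind point 1 can be reproduced with a minimal, self-contained sketch (illustrative only, not the actual `i.py` code): because an `import os` statement appears anywhere in the function body, Python treats `os` as a local name for the *entire* function, so a reference before that statement fails even though `os` is already imported at module level.

```python
import os  # module-level import


def configure_interpreter():
    # Fails: `os` is considered local to this function because of the
    # `import os` below, and it has not been bound yet at this point.
    key = os.getenv("OPENAI_API_KEY")
    import os  # this local import shadows the module-level one
    return key


try:
    configure_interpreter()
except UnboundLocalError as exc:
    print(f"UnboundLocalError: {exc}")
```

Deleting the redundant in-function `import os` (or renaming it) removes the shadowing and lets the module-level import be used.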
**2. `interpreter.llm.model` is hardcoded**

`interpreter.llm.model` is hardcoded to `"gpt-4"`. From what I can tell, this makes it impossible to fully use 01 locally. When you run `poetry run 01 --local`, you'll get this error:

```
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```

I can submit a PR that fixes the above by letting users pass in the local model via `--model` or an `LLM_MODEL` environment variable.

If there's interest, I can also submit a separate PR to make 01 configurable via environment variables, command-line arguments, and a `config.yaml` file.
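The proposed `--model` / `LLM_MODEL` fallback could look something like this (a hypothetical sketch: the flag and variable names come from the issue above, but the resolution order and the `"gpt-4"` default are my assumptions, not the project's actual CLI code):

```python
import argparse
import os


def resolve_model(argv=None):
    # Hypothetical resolution chain: --model flag, then the LLM_MODEL
    # environment variable, then the current hardcoded "gpt-4" default.
    parser = argparse.ArgumentParser()
    parser.add_argument("--model", default=None)
    # parse_known_args ignores the other 01 flags (e.g. --local).
    args, _ = parser.parse_known_args(argv)
    return args.model or os.getenv("LLM_MODEL") or "gpt-4"


print(resolve_model(["--model", "ollama/mistral"]))  # → ollama/mistral
```

The resolved string would then be assigned to `interpreter.llm.model` instead of the literal `"gpt-4"`, so local backends can be selected without touching the source.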