haseeb-heaven / code-interpreter

An innovative open-source Code Interpreter supporting GPT, Gemini, Claude, and LLaMA models.
https://pypi.org/project/open-code-interpreter/
MIT License

Local model not working, as it expects an OpenAI API key. #13

Closed: bcosculluela closed this issue 3 months ago

bcosculluela commented 3 months ago

Hello! Regarding this issue: I am currently using LM Studio, and when I select a local model, it does not work. As far as I can see in the code, in interpreter_lib.py, line 324, the variable `_custom_llmprovider` is set to 'openai', so it expects an OpenAI API key. What should the value of this variable be when using open-source LLMs such as Mistral?
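
For context, here is a minimal sketch of the failure mode described above, assuming the project routes its requests through LiteLLM (the `custom_llm_provider` parameter suggests this). This is a hypothetical reconstruction, not the actual `interpreter_lib.py` source; the model name and prompt are illustrative only:

```python
# Hypothetical reconstruction of the pattern described above (not the
# actual interpreter_lib.py source). Assumes calls go through LiteLLM.
from litellm import completion

response = completion(
    model="mistral",  # a local model served by LM Studio
    messages=[{"role": "user", "content": "Write a hello-world script."}],
    # Hard-coding the provider as 'openai' makes LiteLLM authenticate
    # against the OpenAI API, so the call fails without an OPENAI_API_KEY,
    # even though the model itself is local.
    custom_llm_provider="openai",
)
```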

haseeb-heaven commented 3 months ago

Okay, I will take a look at this issue today.

bcosculluela commented 3 months ago

Thanks! In my case, the issue was solved by setting `model = "ollama/llama2"` and removing the `_custom_llmprovider` variable. Just in case it helps! 😄
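
For reference, a minimal sketch of what this workaround amounts to at the LiteLLM level, again assuming the project uses LiteLLM under the hood; the endpoint and prompt are illustrative:

```python
# A minimal sketch of the working configuration. With an "ollama/" model
# prefix, LiteLLM infers the provider itself, so no custom_llm_provider
# or OpenAI API key is needed.
from litellm import completion

response = completion(
    model="ollama/llama2",  # provider inferred from the "ollama/" prefix
    messages=[{"role": "user", "content": "Write a hello-world script."}],
    api_base="http://localhost:11434",  # default local Ollama endpoint
)
print(response.choices[0].message.content)
```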

haseeb-heaven commented 3 months ago

Fixed this bug in this PR: https://github.com/haseeb-heaven/code-interpreter/pull/14