Closed nagomiso closed 11 months ago
Hey there, @nagomiso!
Thanks for pointing this out.
If you’d like to submit a Pull Request with your proposed fix, I’d be happy to get it reviewed and approved.
@ericrallen Thank you for your reaction!! I'll try creating a Pull Request later.
@nagomiso It looks like @shubhexists has created a PR for this Issue if you want to check out #687.
Describe the bug
I understand that Open Interpreter can set the `temperature` parameter for the LLM.
However, looking at the implementation, it appears that setting `temperature = 0` is not possible. This is because in Python, the number `0` (or `0.0`) is evaluated as `False` in conditional statements.
https://github.com/KillianLucas/open-interpreter/blob/fdf0af3b284609a0c9276f02f25e0903e6f9cd7d/interpreter/llm/setup_openai_coding_llm.py#L81-L82
https://github.com/KillianLucas/open-interpreter/blob/fdf0af3b284609a0c9276f02f25e0903e6f9cd7d/interpreter/llm/setup_text_llm.py#L104-L105
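The falsy-zero pitfall can be demonstrated in isolation (a standalone sketch, not the project's actual code):

```python
temperature = 0.0
params = {}

# A bare truthiness check silently drops a legitimate 0.0,
# because 0 and 0.0 evaluate as False in a boolean context.
if temperature:
    params["temperature"] = temperature

print(params)  # {} — the temperature was discarded
```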
In other words, Open Interpreter cannot set `0` or `0.0` for `params["temperature"]` when calling the LLM API.

Reproduce
1. Start Open Interpreter with `-t 0.0` and `-d` set
2. Press Enter without entering anything
3. Check the contents of the debug message that says "Sending this to LiteLLM:"
You can confirm that `params["temperature"]` is not set at all. I will paste the actual debug output below.
Expected behavior
My expected behavior is that `temperature` can be set to `0`. (However, if there is a circumstance where Open Interpreter does not function well when `temperature` is `0`, it's fine to close this issue.)

I believe the corresponding part can be corrected by modifying the implementation as follows.
interpreter/llm/setup_text_llm.py#L104-L105 & interpreter/llm/setup_openai_coding_llm.py#L81-L82
interpreter/core/core.py#L49
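The correction suggested above would presumably replace the truthiness check with an explicit `None` check. A minimal sketch, assuming the parameter defaults to `None` (`build_params` is a hypothetical helper, not the project's actual function):

```python
def build_params(temperature=None):
    """Hypothetical sketch: only omit temperature when it was never set."""
    params = {}
    # `if temperature is not None:` keeps 0 and 0.0 as valid values,
    # unlike `if temperature:`, which treats them as "unset".
    if temperature is not None:
        params["temperature"] = temperature
    return params
```

With this check, `build_params(0.0)` includes `"temperature": 0.0`, while `build_params()` still omits the key entirely.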
Screenshots
No response
Open Interpreter version
0.1.10
Python version
3.11.5
Operating System name and version
macOS 13.6
Additional context
No response