Open meetr1912 opened 1 week ago
Can you please try again with Python 3.11 or 3.10?
$ interpreter --local
Open Interpreter supports multiple local model providers.
[?] Select a provider:
Ollama Llamafile LM Studio Jan
[?] Select a model:
llama3.2:1b llama3.2:3b llava-llama3 llama3.1:8b phi3:3.8b nomic-embed-text qwen2:7b ↓ Download llama3.1 ↓ Download phi3 ↓ Download mistral-nemo ↓ Download gemma2 ↓ Download codestral Browse Models ↗
Loading llama3.2:1b...
Traceback (most recent call last):
File "/Users/niehu/miniforge3/envs/open_interpreter/bin/interpreter", line 8, in
I got the same error for both 3.10 and 3.11
Yes, same here with both versions, 3.10 and 3.11, on Windows 11. I don't know what to do after reinstalling Python and OI. 😊
Open Interpreter supports multiple local model providers.
[?] Select a provider:
Ollama Llamafile LM Studio Jan
[?] Select a model: llama3.2:1b llama3-groq-tool-use llama3.1:8b llama3.1 llama3.2
qwen2.5-coder deepseek-coder-v2 mistral nemotron-mini qwen2.5:7b starcoder2:3b gemma2 codegemma
Loading qwen2.5-coder...
Traceback (most recent call last):
File "
Please run pip install 'open-interpreter[local]'
If the issue persists, please share the output of interpreter --version
and ollama --version
Sadly it won't work :(
I previously had Python 3.12 installed on the local machine, but I changed it to 3.11 and deleted all of 3.12's dependencies.
-> Sandbox:
I have installed it in a sandbox. Sadly it won't work either, but I get a different error message.
-> Local:
(oi_venv) PS D:\OpenInterpreter> interpreter --version
Open Interpreter 0.4.3 Developer Preview
(oi_venv) PS D:\OpenInterpreter> ollama --version
ollama version is 0.3.14
(oi_venv) PS D:\OpenInterpreter> python --version
Python 3.11.0
"DEPRECATION: wget is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559
Running setup.py install for wget ... done
DEPRECATION: pyperclip is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed. pip 23.1 will enforce this behaviour change. A possible replacement is to enable the '--use-pep517' option. Discussion can be found at https://github.com/pypa/pip/issues/8559"
Edit: After a fresh install with python -m pip install --upgrade pip, the error above no longer occurs. I still get the same error when starting --local.
@Grunkah
Sadly it won't work :(
What won't work?
I've tried the approach suggested above and have also reinstalled Open Interpreter a few times recently. I've also reinstalled Python in an attempt to resolve the issue.
After multiple attempts to reinstall Open Interpreter, starting "interpreter --local" just results in the same error as above.
I'm starting to suspect that there might be a configuration problem with my Windows 11 installation. Unfortunately, I'm not sure what that would entail or how to fix it. I don't want to reinstall Windows if fixing the issue is an option.
To be honest, I'm getting frustrated with the issues caused by Windows 11 again; it's not the first time I've encountered problems like this due to its quirks. Last time, it was related to PyTorch 😂. Guess how I fixed it.
Loading llama3.2:3b...
Traceback (most recent call last):
File "/home/tyson/open-interpreter/.env/bin/interpreter", line 8, in
Same issue here, still not working.
Add --no-llm_supports_functions when launching interpreter.
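For reference, the full invocation with that workaround would look like this (a sketch based on the flag named above; the flag disables function/tool calling, so the tool-calling streaming path shown in the traceback is skipped):

```shell
# Workaround sketch: launch local mode with function calling disabled,
# avoiding the tool-calling code path that crashes on JSON parsing.
interpreter --local --no-llm_supports_functions
```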
Basically all Ollama models are failing to run. Some even load, but they all crash after inserting the prompt.
See my comment above https://github.com/OpenInterpreter/open-interpreter/issues/1514#issuecomment-2464606167
Will be fixed next release #1524
See my comment above #1514 (comment)
I've tried that option. It's still not working.
Will be fixed next release #1524
Great. Thanks!
Describe the bug
interpreter --local
Open Interpreter supports multiple local model providers.
[?] Select a provider:
[?] Select a model: llama3.2
Downloading llama3.1...
pulling manifest
pulling 8eeb52dfb3bb... 100% ▕████████████████▏ 4.7 GB
pulling 948af2743fc7... 100% ▕████████████████▏ 1.5 KB
pulling 0ba8f0e314b4... 100% ▕████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕████████████████▏ 96 B
pulling 1a4c3c319823... 100% ▕████████████████▏ 485 B
verifying sha256 digest
writing manifest
success
Loading llama3.1...
Traceback (most recent call last):
File "/opt/anaconda3/bin/interpreter", line 8, in
sys.exit(main())
^^^^^^
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 612, in main
start_terminal_interface(interpreter)
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 471, in start_terminal_interface
interpreter = profile(
^^^^^^^^
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/terminal_interface/profiles/profiles.py", line 64, in profile
return apply_profile(interpreter, profile, profile_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/terminal_interface/profiles/profiles.py", line 148, in apply_profile
exec(profile["start_script"], scope, scope)
File "", line 1, in
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/core.py", line 145, in local_setup
self = local_setup(self)
^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/terminal_interface/local_setup.py", line 314, in local_setup
interpreter.computer.ai.chat("ping")
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/computer/ai/ai.py", line 134, in chat
for chunk in self.computer.interpreter.llm.run(messages):
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 86, in run
self.load()
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 397, in load
self.interpreter.computer.ai.chat("ping")
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/computer/ai/ai.py", line 134, in chat
for chunk in self.computer.interpreter.llm.run(messages):
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 322, in run
yield from run_tool_calling_llm(self, params)
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/llm/run_tool_calling_llm.py", line 178, in run_tool_calling_llm
for chunk in llm.completions(request_params):
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 466, in fixed_litellm_completions
raise first_error # If all attempts fail, raise the first error
^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 443, in fixed_litellm_completions
yield from litellm.completion(params)
File "/opt/anaconda3/lib/python3.12/site-packages/litellm/llms/ollama.py", line 428, in ollama_completion_stream
raise e
File "/opt/anaconda3/lib/python3.12/site-packages/litellm/llms/ollama.py", line 406, in ollama_completion_stream
function_call = json.loads(response_content)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/lib/python3.12/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/lib/python3.12/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/lib/python3.12/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 2 (char 1)
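The JSONDecodeError at the bottom comes from json.loads being called on streamed model output that is not (or not yet) complete JSON. The exact message can be reproduced with the stdlib alone; this is only a sketch of the failure mode, not Open Interpreter's or LiteLLM's actual code:

```python
import json

# A streamed chunk cut off in the middle of a JSON key reproduces the
# error from the traceback: the string opened by the quote at index 1
# is never terminated.
chunk = '{"na'
try:
    json.loads(chunk)
except json.JSONDecodeError as e:
    print(e)  # Unterminated string starting at: line 1 column 2 (char 1)

# A defensive pattern: accumulate chunks and only parse once the buffer
# is valid JSON, instead of parsing each chunk eagerly.
buffer = ""
parsed = None
for part in ['{"na', 'me": "run"}']:
    buffer += part
    try:
        parsed = json.loads(buffer)
        break  # complete object received
    except json.JSONDecodeError:
        continue  # incomplete so far; wait for more data
print(parsed)  # {'name': 'run'}
```

The hypothetical chunks above are illustrative; the real crash happens inside litellm's ollama_completion_stream when it attempts to parse a partial function-call payload.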
Reproduce
above command
Expected behavior
above
Screenshots
No response
Open Interpreter version
0.4.3
Python version
3.12.4
Operating System name and version
macOS 13
Additional context
No response