OpenInterpreter / 01

The #1 open-source voice interface for desktop, mobile, and ESP32 chips.
https://01.openinterpreter.com/
GNU Affero General Public License v3.0

HTTP Error 404 after selecting a model in --local mode #141

Closed · MartinMF closed this issue 6 months ago

MartinMF commented 6 months ago

Running prompt> poetry run 01 --server --local proceeds to model selection as expected, but after selecting a local LLM the server crashes with urllib.error.HTTPError: HTTP Error 404: Not Found. The crash occurs with both Ollama and LM Studio.

Log:

[?] Which one would you like to use?:
 > Ollama
   LM Studio

3 Ollama models found. To download a new model, run ollama run <model-name>, then start a new 01 session.

For a full list of downloadable models, check out https://ollama.com/library

[?] Select a downloaded Ollama model:
   failed
   NAME
 > llama2

Using Ollama model: llama2

Exception in thread Thread-13 (run_until_complete):
Traceback (most recent call last):
  File "C:\Users\Martin\anaconda3\envs\01\Lib\threading.py", line 1045, in _bootstrap_inner
    self.run()
  File "C:\Users\Martin\anaconda3\envs\01\Lib\threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\Martin\anaconda3\envs\01\Lib\asyncio\base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  [...]
  File "C:\Users\Martin\anaconda3\envs\01\Lib\urllib\request.py", line 496, in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
  File "C:\Users\Martin\anaconda3\envs\01\Lib\urllib\request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 404: Not Found
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
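One plausible reading of the picker above: the rows "failed" and "NAME" are not models, which suggests the selection menu is built from the raw lines of `ollama list` output without skipping the header row, so a stale or non-model entry (here "llama2") can be chosen even when it is not actually installed, and the subsequent request for it returns 404. A minimal sketch of header-aware parsing (hypothetical helper, not the project's actual code):

```python
def parse_ollama_list(output: str) -> list[str]:
    """Extract model names from `ollama list` output.

    Skips the NAME/ID/SIZE/MODIFIED header row and blank lines,
    returning only the first column of each remaining row.
    """
    models = []
    for line in output.splitlines():
        line = line.strip()
        if not line or line.startswith("NAME"):
            continue  # skip the header row and empty lines
        models.append(line.split()[0])  # first column is the model name
    return models


raw = """NAME                    ID              SIZE    MODIFIED
codellama:latest        8fdf8f752f6e    3.8 GB  18 minutes ago"""
print(parse_ollama_list(raw))  # ['codellama:latest']
```

With parsing like this, only genuine model names would reach the menu, and selecting an absent model would be impossible.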


prompt> ollama list
NAME                    ID              SIZE    MODIFIED
codellama:latest        8fdf8f752f6e    3.8 GB  18 minutes ago
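Note that `ollama list` shows only codellama:latest, yet the picker offered (and I selected) llama2, so the 404 is consistent with requesting a model that is not pulled. A pre-flight check against Ollama's tags endpoint (GET /api/tags on the default port 11434) could surface this as a clear message instead of a crash; this is a sketch, not the project's actual code:

```python
import json
import urllib.request


def installed_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Ask a running Ollama server which models are pulled (GET /api/tags)."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


def check_model(selected: str, available: list[str]) -> bool:
    """True if the selected model matches an installed one,
    with or without the ':latest' tag suffix."""
    names = set(available) | {n.split(":")[0] for n in available}
    return selected in names
```

Against the output above, check_model("llama2", ["codellama:latest"]) is False, so the server could refuse to start the session and tell the user to run ollama pull llama2 first, rather than dying on an HTTP 404 mid-request.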

OS: Windows 11