muquit / privategpt

An on-premises ML-powered document assistant application with local LLM using ollama

Install error #3

Closed (usaraj closed this issue 1 day ago)

usaraj commented 2 days ago
  1. Cloned your repo and followed the instructions

  2. Virtual environment: Python 3.12 on Ubuntu 20

  3. Ollama is running on port 11434 and is operational with various models

  4. Ran Streamlit in server mode

  5. When calling your frontend UI, it results in an error. Here is the traceback:

    File "/home/mgptvenv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
        result = func()
                 ^^^^^^
    File "/home/mgptvenv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 579, in code_to_exec
        exec(code, module.__dict__)
    File "/home/m_privategpt/assistant/assistant_ui.py", line 325, in <module>
        doit()
    File "/home/m_privategpt/assistant/assistant_ui.py", line 142, in doit
        all_models = [model["name"] for model in ollama_client.list()["models"]]
    File "/home/mgptvenv/lib/python3.12/site-packages/ollama/_types.py", line 32, in __getitem__
        raise KeyError(key)

Your help in resolving this will be greatly appreciated. Thanks.

mqtc commented 2 days ago

The Python ollama module was updated and it broke things. I will push a fix tomorrow. Thanks for trying it out.

mqtc commented 1 day ago

The ollama Python package was updated from 0.3.3 to 0.4.2, which changed how model information is returned from the API. The old code accessed models with dict syntax (model["name"]), but the new version returns Model objects whose name must be read from the model.model attribute. A pinned requirements file, requirements_pinned.txt, is now supplied with the versions of the modules known to work.
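
For reference, here is a minimal sketch of listing model names in a way that tolerates both ollama 0.3.x and 0.4.x (the client host and variable names are assumptions, not code taken from the repo):

    import ollama

    # Assumed host; the default Ollama server listens on port 11434.
    ollama_client = ollama.Client(host="http://localhost:11434")

    response = ollama_client.list()

    # ollama <= 0.3.3 returns plain dicts, so model["name"] works.
    # ollama >= 0.4.x returns typed objects; the name lives on the `model` attribute.
    models = response["models"] if isinstance(response, dict) else response.models
    all_models = [m["name"] if isinstance(m, dict) else m.model for m in models]
    print(all_models)

Installing with pip install -r requirements_pinned.txt should keep the modules at the combination known to work.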