zylon-ai / private-gpt

Interact with your documents using the power of GPT, 100% privately, no data leaks
https://privategpt.dev
Apache License 2.0

mistral:7b-instruct-q8_0 does not work #1784

Closed: mictadlo closed this issue 1 month ago

mictadlo commented 5 months ago

Hi, I was able to get PrivateGPT running with Ollama + Mistral in the following way:

conda create -n privateGPT-Ollama python=3.11 poetry
conda activate privateGPT-Ollama

git clone https://github.com/imartinez/privateGPT privateGPT-Ollama
cd privateGPT-Ollama

ollama run mistral
ollama pull nomic-embed-text
ollama serve
sudo lsof -i tcp:11434
kill -9 787

poetry install --extras "ui llms-ollama embeddings-ollama vector-stores-qdrant"
CMAKE_ARGS="-DLLAMA_METAL=on" pip install --force-reinstall --no-cache-dir llama-cpp-python
PGPT_PROFILES=ollama make run
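Before launching, it can help to confirm that the models the ollama profile expects are actually pulled. A small sketch against Ollama's GET /api/tags endpoint, which lists local models; the required-model set here is an assumption based on the default profile, and the helper is hypothetical, not part of PrivateGPT:

```python
# Hypothetical pre-flight check: compare the models Ollama reports as pulled
# (GET /api/tags) against the ones the profile is assumed to need.
import json
import urllib.request

REQUIRED = {"mistral", "nomic-embed-text"}

def missing_models(tags: dict, required: set = REQUIRED) -> set:
    """Return the required base model names absent from an /api/tags response."""
    # Each entry looks like {"name": "mistral:7b-instruct-q8_0", ...};
    # compare base names so any tag of a required model counts.
    pulled = {m["name"].split(":")[0] for m in tags.get("models", [])}
    return required - pulled

if __name__ == "__main__":
    with urllib.request.urlopen("http://127.0.0.1:11434/api/tags") as resp:
        tags = json.load(resp)
    print(missing_models(tags) or "all required models pulled")
```

Note that pulling a specific tag such as mistral:7b-instruct-q8_0 still satisfies a profile that asks for mistral by base name; whether the profile resolves tags that way is configuration-dependent.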

However, when I switched to ollama run mistral:7b-instruct-q8_0, I got these errors:

% ollama list               
NAME                        ID              SIZE    MODIFIED       
mistral:7b-instruct-q8_0    2162e081e7f0    7.7 GB  14 minutes ago  
nomic-embed-text:latest     0a109f422b47    274 MB  4 days ago      

% PGPT_PROFILES=ollama make run    
poetry run python -m private_gpt
13:54:52.470 [INFO    ] private_gpt.settings.settings_loader - Starting application with profiles=['default', 'ollama']
13:55:00.454 [INFO    ] private_gpt.components.llm.llm_component - Initializing the LLM in mode=ollama
Traceback (most recent call last):
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
    return self._context[key]
           ~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.ui.ui.PrivateGptUi'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
    return self._context[key]
           ~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.server.ingest.ingest_service.IngestService'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
    return self._context[key]
           ~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.components.llm.llm_component.LLMComponent'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/michal/Projects/ai-tools/privateGPT-Ollama/private_gpt/components/llm/llm_component.py", line 111, in __init__
    from llama_index.llms.ollama import Ollama  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'llama_index.llms.ollama'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/michal/Projects/ai-tools/privateGPT-Ollama/private_gpt/__main__.py", line 5, in <module>
    from private_gpt.main import app
  File "/Users/michal/Projects/ai-tools/privateGPT-Ollama/private_gpt/main.py", line 6, in <module>
    app = create_app(global_injector)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/Projects/ai-tools/privateGPT-Ollama/private_gpt/launcher.py", line 63, in create_app
    ui = root_injector.get(PrivateGptUi)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
    provider_instance = scope_instance.get(interface, binding.provider)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
    instance = self._get_instance(key, provider, self.injector)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
    return provider.get(injector)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
    return injector.create_object(self._cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
    self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
    dependencies = self.args_to_inject(
                   ^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
    instance: Any = self.get(interface)
                    ^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
    provider_instance = scope_instance.get(interface, binding.provider)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
    instance = self._get_instance(key, provider, self.injector)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
    return provider.get(injector)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
    return injector.create_object(self._cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
    self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
    dependencies = self.args_to_inject(
                   ^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
    instance: Any = self.get(interface)
                    ^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
    provider_instance = scope_instance.get(interface, binding.provider)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
    instance = self._get_instance(key, provider, self.injector)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
    return provider.get(injector)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
    return injector.create_object(self._cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
    self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
  File "/Users/michal/miniconda3/envs/privateGPT-Mistral/lib/python3.11/site-packages/injector/__init__.py", line 1040, in call_with_injection
    return callable(*full_args, **dependencies)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/michal/Projects/ai-tools/privateGPT-Ollama/private_gpt/components/llm/llm_component.py", line 113, in __init__
    raise ImportError(
ImportError: Ollama dependencies not found, install with `poetry install --extras llms-ollama`
make: *** [run] Error 1
% poetry install --extras "ui llms-ollama embeddings-ollama vector-stores-qdrant"

No module named 'build'
(privateGPT-Mistral) michal@Michals-MacBook-Pro privateGPT-Ollama % ollma serve
zsh: command not found: ollma
(privateGPT-Mistral) michal@Michals-MacBook-Pro privateGPT-Ollama % ollama serve
Error: listen tcp 127.0.0.1:11434: bind: address already in use
(privateGPT-Mistral) michal@Michals-MacBook-Pro privateGPT-Ollama % ls ~/Library/Application Support/anythingllm-desktop
ls: /Users/michal/Library/Application: No such file or directory
ls: Support/anythingllm-desktop: No such file or directory
(privateGPT-Mistral) michal@Michals-MacBook-Pro privateGPT-Ollama % ls /Library/Application Support/                   
ls: /Library/Application: No such file or directory
ls: Support/: No such file or directory
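The "address already in use" error above means something is already listening on Ollama's default port 11434 (on macOS, the Ollama desktop app keeps a server running in the background). A quick way to check, sketched as a hypothetical helper rather than anything shipped with Ollama:

```python
# Hypothetical check: is anything already listening on a given TCP port?
import socket

def port_in_use(host: str, port: int) -> bool:
    """Return True if a connection to host:port succeeds, i.e. something is listening."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 on success instead of raising.
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    # True here means an Ollama server is already running, so a second
    # `ollama serve` is unnecessary and will fail to bind.
    print(port_in_use("127.0.0.1", 11434))
```

If it returns True, there is no need to run ollama serve again; the existing server can be used as-is, or stopped first (e.g. via the sudo lsof -i tcp:11434 and kill steps shown earlier).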

What did I miss?

Thank you in advance,

Michal

jaluma commented 1 month ago

Please do this before installing extras and try again:

pip install poetry
poetry install --extras "ui llms-ollama embeddings-ollama vector-stores-qdrant"

After that, you shouldn't get any errors. We don't know why, but sometimes the Poetry installation breaks and nothing else works until you reinstall it.
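After reinstalling, a quick way to confirm the extras actually landed in the active environment is to probe for the optional modules the traceback complained about. This is a hypothetical diagnostic, not part of PrivateGPT; the module names match the import that failed above:

```python
# Hypothetical diagnostic: check that the optional Ollama extras resolved,
# without importing (and thereby initializing) the packages themselves.
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` is importable in the current environment."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # The parent package is missing entirely.
        return False

for mod in ("llama_index.llms.ollama", "llama_index.embeddings.ollama"):
    print(mod, "OK" if has_module(mod) else "MISSING")
```

If either module prints MISSING, rerun poetry install --extras "ui llms-ollama embeddings-ollama vector-stores-qdrant" inside the same environment that runs make run.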