szczyglis-dev / py-gpt

Desktop AI Assistant powered by GPT-4, GPT-4 Vision, GPT-3.5, DALL-E 3, Langchain, Llama-index, chat, vision, voice control, image generation and analysis, autonomous agents, code and command execution, file upload and download, speech synthesis and recognition, access to Web, memory, prompt presets, plugins, assistants & more. Linux, Windows, Mac.
https://pygpt.net
MIT License

Exception: No existing llama_index.core.vector_stores #51

Closed: mikeybob closed this issue 1 month ago

mikeybob commented 2 months ago

I've started receiving this error in version 76, and it has popped up again in 78. It happens any time the AI tries to look something up online.

Exception: No existing llama_index.core.vector_stores.simple found at /home/mike/.config/pygpt-net/idx/base/vector_store.json, skipping load.
Type: ValueError
Message: No existing llama_index.core.vector_stores.simple found at /home/mike/.config/pygpt-net/idx/base/vector_store.json, skipping load.
Traceback:   File "/home/mike/.cache/pypoetry/virtualenvs/pygpt-net-sJoELmaO-py3.10/lib/python3.10/site-packages/llama_index/core/storage/storage_context.py", line 122, in from_defaults
    vector_stores = SimpleVectorStore.from_namespaced_persist_dir(
  File "/home/mike/.cache/pypoetry/virtualenvs/pygpt-net-sJoELmaO-py3.10/lib/python3.10/site-packages/llama_index/core/vector_stores/simple.py", line 160, in from_namespaced_persist_dir
    vector_stores[DEFAULT_VECTOR_STORE] = cls.from_persist_dir(
  File "/home/mike/.cache/pypoetry/virtualenvs/pygpt-net-sJoELmaO-py3.10/lib/python3.10/site-packages/llama_index/core/vector_stores/simple.py", line 125, in from_persist_dir
    return cls.from_persist_path(persist_path, fs=fs)
  File "/home/mike/.cache/pypoetry/virtualenvs/pygpt-net-sJoELmaO-py3.10/lib/python3.10/site-packages/llama_index/core/vector_stores/simple.py", line 305, in from_persist_path
    raise ValueError(

I'm on Fedora 40 x86_64. I run the app from a clone of the repository and use Poetry to set it up and run it.
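
For reference, the exception comes from llama_index's SimpleVectorStore loader, which raises a ValueError when no vector_store.json exists at the persist path. A minimal sketch of that code path, using the path from the traceback above (adjust it for your own setup):

from llama_index.core.vector_stores import SimpleVectorStore

# Path taken from the traceback above; adjust for your own installation.
persist_path = "/home/mike/.config/pygpt-net/idx/base/vector_store.json"

try:
    # This is the call that fails inside StorageContext.from_defaults().
    store = SimpleVectorStore.from_persist_path(persist_path)
except ValueError as err:
    # Raised when no vector_store.json is found at persist_path.
    print(f"Vector store not found: {err}")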

mikeybob commented 2 months ago

If I'm understanding the documentation correctly, this vector store is internal, and nothing needs to be passed to it.

szczyglis-dev commented 2 months ago

Did you change any parameters, such as the embedding model or other custom arguments, in Config -> Llama-index after creating the index? Maybe try to recreate the index by clearing the ~/.config/pygpt-net/idx/base directory (after deleting all files or the entire base directory, the index should re-create itself).
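
A minimal sketch of clearing that directory from Python, assuming the default location shown in the traceback (back it up first if you want to keep the existing index):

import shutil
from pathlib import Path

# Assumed default location, based on the path in the traceback above.
idx_base = Path.home() / ".config" / "pygpt-net" / "idx" / "base"

if idx_base.exists():
    # The app should re-create the index on the next indexing run.
    shutil.rmtree(idx_base)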

From Llama-index documentation:

Important: if you initialized your index with custom transformations, embed_model, etc., you will need to pass in the same options during load_index_from_storage, or have them set as the global settings.

https://docs.llamaindex.ai/en/stable/understanding/storing/storing/
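
In other words, the side that reloads a persisted index must use the same embedding model that built it. A minimal llama_index sketch of the pattern the docs describe (the embedding model name here is only an example, not necessarily the one py-gpt uses):

from llama_index.core import Settings, StorageContext, load_index_from_storage
from llama_index.embeddings.openai import OpenAIEmbedding

# Must match the embed_model the index was originally built with
# (the model name below is only an example).
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-small")

# Persist dir corresponding to ~/.config/pygpt-net/idx/base from the traceback above.
storage_context = StorageContext.from_defaults(
    persist_dir="/home/mike/.config/pygpt-net/idx/base"
)
index = load_index_from_storage(storage_context)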