Closed mikeybob closed 1 month ago
If I'm understanding the documentation correctly, this vector is internal, and nothing needs to be passed to it.
Did you change any parameters, such as the embedding model or other custom arguments, in Config -> Llama-index after creating the index? If so, try recreating the index by clearing the ./config/pygpt-net/idx/base directory (after all files, or the entire base directory, are deleted, the index should re-create itself).
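The reset step above can be sketched as a short shell session (a minimal sketch; the path follows the one mentioned above, so adjust it if your config lives elsewhere, and stop the app before removing anything):

```shell
# Path from the comment above -- adjust to your own config location.
IDX_DIR="./config/pygpt-net/idx/base"

# Remove the stored index entirely; it should be re-created on the
# next indexing run inside the app.
rm -rf "$IDX_DIR"
```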
From Llama-index documentation:
Important: if you had initialized your index with custom transformations, embed_model, etc., you will need to pass in the same options during load_index_from_storage, or have them set as the global settings.
https://docs.llamaindex.ai/en/stable/understanding/storing/storing/
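For reference, the pattern those docs describe looks roughly like this (a minimal sketch, assuming llama-index is installed and an index was previously persisted to ./storage; the OpenAI embedding model name is illustrative, not necessarily what pygpt-net uses):

```python
from llama_index.core import Settings, StorageContext, load_index_from_storage
from llama_index.embeddings.openai import OpenAIEmbedding

# The index must be loaded with the SAME embed_model it was built with.
# Option 1: set it globally before loading...
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-small")  # illustrative

# Option 2: ...or pass it explicitly on the load call.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context, embed_model=Settings.embed_model)
```

If the stored index and the currently configured embedding model disagree, errors like the one reported here are the typical symptom, which is why recreating the index can help.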
I've started receiving this error in version 76, and it has popped up again in 78. It happens any time the AI starts to look something up online.
I'm on Fedora 40 x86_64. I run the app from a clone of the repository and use Poetry to set it up and run it.