Open LvffY opened 11 months ago
Stale issue
Still an issue
Up
I went with Postgres. Ingestion completed successfully, but the documents neither show up in the UI nor can the LLM query them.
Maybe I'm missing something.
I did:
```shell
make ingest /path/to/folder -- --watch
PGPT_PROFILES=local make run
```
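One way to check whether ingestion actually reached the store, independently of the UI, is to query the API's ingested-documents list. The sketch below is an assumption-laden example: it assumes the API is reachable at `http://localhost:8001` and exposes a `GET /v1/ingest/list` route returning a JSON payload with a `data` array of documents carrying `doc_metadata.file_name` (check the OpenAPI page of your running instance to confirm the route and schema).

```python
import json
import urllib.request

def list_ingested(base_url="http://localhost:8001"):
    """Fetch the ingested-documents payload from the API.

    The /v1/ingest/list route and the default port are assumptions;
    adjust them to match your deployment.
    """
    with urllib.request.urlopen(f"{base_url}/v1/ingest/list") as resp:
        return json.load(resp)

def doc_file_names(payload):
    """Collect the distinct file names from an ingest-list payload.

    Assumes each entry in payload["data"] has a "doc_metadata" dict
    with a "file_name" key; ingestion splits files into several
    documents, so duplicates are collapsed here.
    """
    docs = payload.get("data", [])
    names = {d.get("doc_metadata", {}).get("file_name", "?") for d in docs}
    return sorted(names)

if __name__ == "__main__":
    print(doc_file_names(list_ingested()))
```

If this prints your file names but the UI stays empty, the problem is on the UI side rather than in the ingestion pipeline.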
I'm running my privateGPT app inside containers (in my case an Azure Web App), with my model and my Qdrant database in AWS, i.e. external to the application.
However, after a cold start the UI doesn't show the previously ingested files that are present in my database.
The backend still seems to find the documents, though: when I ask a question about one of them, it answers correctly and quotes which documents it is referring to.
Maybe it's just a problem in the UI?