mayooear / gpt4-pdf-chatbot-langchain

GPT4 & LangChain Chatbot for large PDF docs
https://www.youtube.com/watch?v=ih9PBGVVOO4

Deleting a namespace #337

Closed sahinutar closed 1 year ago

sahinutar commented 1 year ago

Hi,

While working with the examples provided and ingesting the same doc over and over again, I realized the same chunks are added as new vectors to the namespace, increasing the total vector count with each iteration. I'm not sure how to get the ids once the vectors have been created with the from_documents method.

Is there a way to get the vector ids of a namespace created with LangChain's from_documents method, so I can iterate over them and delete them one by one? Or, better still, a way to remove a namespace completely without having to know the vector ids?
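
For reference, the kind of thing I'm hoping for is roughly this (just a sketch, assuming the Pinecone Python client; "my-index" and "my-namespace" are placeholders for my real index and namespace):

```python
import pinecone

# Connect to the index that from_documents wrote into.
pinecone.init(api_key="YOUR_API_KEY", environment="YOUR_ENVIRONMENT")
index = pinecone.Index("my-index")

# Delete every vector in the namespace in a single call, no ids needed.
index.delete(delete_all=True, namespace="my-namespace")

# Check the stats afterwards to confirm the namespace is empty/gone.
print(index.describe_index_stats())
```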

Thanks a lot

Ş.

ashokrs commented 1 year ago

[screenshot attached; no accompanying text]

dosubot[bot] commented 1 year ago

Hi, @sahinutar! I'm Dosu, and I'm here to help the gpt4-pdf-chatbot-langchain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you are looking for a way to delete a namespace in LangChain. Specifically, you are asking for a method to retrieve the vector ids of a namespace created with the from_documents method, so that you can iterate over them and delete them individually. Alternatively, you are interested in a better way to remove a namespace without needing to know the vector ids.
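
In case it helps, one possible workaround for the duplication part is to pass deterministic ids to from_documents, so that re-ingesting the same document overwrites the existing vectors instead of adding new ones. This is only a sketch, assuming the Python LangChain Pinecone vectorstore forwards an `ids` argument to the upsert; the index and namespace names below are placeholders:

```python
import hashlib

import pinecone
from langchain.docstore.document import Document
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Pinecone

pinecone.init(api_key="YOUR_API_KEY", environment="YOUR_ENVIRONMENT")

# Normally these come from the PDF loader / text splitter.
docs = [Document(page_content="example chunk of the PDF")]

# Derive a stable id from each chunk's content so the same chunk always maps
# to the same vector id; Pinecone upserts overwrite existing vectors by id.
ids = [hashlib.sha256(d.page_content.encode("utf-8")).hexdigest() for d in docs]

Pinecone.from_documents(
    docs,
    OpenAIEmbeddings(),
    index_name="my-index",      # placeholder
    namespace="my-namespace",   # placeholder
    ids=ids,                    # assumed to be passed through to the upsert
)
```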

Unfortunately, it seems that there hasn't been any further activity or resolution on this issue. I noticed that user "ashokrs" commented with a screenshot, but no additional details were provided.

Before we close this issue, I wanted to check with you if it is still relevant to the latest version of the gpt4-pdf-chatbot-langchain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to the gpt4-pdf-chatbot-langchain project! Let us know if you have any further questions or concerns.