Closed: francis closed this issue 9 months ago
Can take a cue from AutoGPT. Here's their setup: https://github.com/Significant-Gravitas/Auto-GPT/tree/master/autogpt/memory
👍 Great idea @mculp - I will study this approach as it solves another problem I was considering too.
I will check this in this weekend @francis
observations
- embeddings to vectorstores

potential tasks
- [ ] Embeddings to VectorStores
- [ ] a VectorSearch class that accepts VectorStores when initializing or calling
- [x] fix bugs
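A minimal sketch of the "Embeddings to VectorStores" idea from the task list above: keep embedding vectors in memory and rank them by cosine similarity. The class and method names here (`InMemoryStore`, `#search`) are illustrative assumptions, not the actual Boxcars API.

```ruby
# Hypothetical in-memory vector store: holds (content, embedding, metadata)
# entries and returns the k nearest by cosine similarity.
class InMemoryStore
  Entry = Struct.new(:content, :embedding, :metadata)

  def initialize
    @entries = []
  end

  def add(content:, embedding:, metadata: {})
    @entries << Entry.new(content, embedding, metadata)
  end

  # Return the k entries most similar to query_embedding.
  def search(query_embedding, k: 3)
    @entries
      .sort_by { |e| -cosine_similarity(e.embedding, query_embedding) }
      .first(k)
  end

  private

  def cosine_similarity(a, b)
    dot = a.zip(b).sum { |x, y| x * y }
    dot / (norm(a) * norm(b))
  end

  def norm(v)
    Math.sqrt(v.sum { |x| x * x })
  end
end

store = InMemoryStore.new
store.add(content: "ruby",   embedding: [1.0, 0.0])
store.add(content: "python", embedding: [0.0, 1.0])
results = store.search([0.9, 0.1], k: 1)
# results.first.content => "ruby"
```

A real implementation would get the embeddings from an API call rather than hand-written vectors, but the ranking logic is the same.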
Update: I wanted to support TensorFlow as another embedding option, as LangChain does, but the tensorflow gem is not yet ready for it.
gpt4all has a ticket for embeddings but it is still open.
I like your thinking here @jaigouk
@francis
https://github.com/BoxcarsAI/boxcars/pull/74 uses Boxcars::VectorStore::Document
for building vector stores and search results; it has content, embedding,
and metadata fields. Can we close this ticket?
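A plain-Ruby approximation of the Document shape described above (content, embedding, metadata). This is a sketch for illustration, not the actual `Boxcars::VectorStore::Document` class from the PR.

```ruby
# Hypothetical stand-in for the Document shape: three fields, keyword-initialized.
Document = Struct.new(:content, :embedding, :metadata, keyword_init: true)

doc = Document.new(
  content:   "Boxcars is a Ruby gem for composing LLM pipelines",
  embedding: [0.12, -0.08, 0.33],          # illustrative vector, not a real embedding
  metadata:  { source: "README" }
)
```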
Can we mark this issue as done? https://github.com/BoxcarsAI/boxcars/pull/92 is merged. @francis
Now that we have embedded vector search, add a boxcar that scans search results and gives a coherent summary.
Ideally, the class would be Boxcars::VectorSearch
@jaigouk - your new additions will enable this. I want to accept the default in-memory vector store that you provided or external stores (PostgreSQL, Redis, Pinecone, ...).
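A hedged sketch of what a `VectorSearch`-style class could look like if it accepts any store via duck typing, per the comment above: anything responding to `#search` works, so the in-memory store or an external one could be swapped in. All names and signatures below are assumptions for illustration, not the merged Boxcars API, and the embedding function is a fake stand-in for a real API call.

```ruby
# Illustrative search class: the store and the embedding function are injected,
# so the caller decides which backend to use.
class VectorSearch
  def initialize(vector_store:, embed:)
    @store = vector_store
    @embed = embed # a real embedder would call an LLM/embedding API
  end

  def call(query, k: 3)
    @store.search(@embed.call(query), k: k)
  end
end

# Fake store for demonstration: ignores the embedding and returns canned hits.
FakeStore = Struct.new(:hits) do
  def search(_embedding, k:)
    hits.first(k)
  end
end

fake_embed = ->(text) { text.chars.map(&:ord).first(4).map(&:to_f) }
search  = VectorSearch.new(vector_store: FakeStore.new(%w[a b c]), embed: fake_embed)
top_two = search.call("query", k: 2)
```

Because the store is duck-typed, a PostgreSQL- or Pinecone-backed store would only need to implement the same `#search(embedding, k:)` method.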