spreeni closed this issue 9 months ago
Thanks for the suggestion @spreeni! I'm going to add this to our Request For Contribution project board for now.
Hi! I have experience in OpenSearch and have made some contributions there. I'd like to work on this, but do not have much experience in LlamaIndex. Could anyone please suggest me some resources to get started (I'm trying to learn alongside)?
Welcome @AkshathRaghav! Sure, our docs are probably the best place to start:
Hi everyone, I would absolutely like to have this feature :D @nerdai: is anyone assigned to this feature?
Feature Description
OpenSearch supports an internal hybrid query since version 2.10. It would be nice to have this as a `vector_store_query_mode` in the `OpensearchVectorStore`, as exists e.g. for Weaviate.

In case this is too limiting on the usable OpenSearch versions, the ability to issue plain BM25 queries against OpenSearch from within LlamaIndex could also suffice. If I see it correctly, this currently has to be implemented in a custom approach, which makes a custom hybrid search implementation cumbersome.
Correct me if I did not see the correct sources, I am just exploring Llamaindex and like it otherwise quite a bit :)
Reason
I think this is a rather new feature in OpenSearch, which is why it is not part of LlamaIndex yet. The current workaround would be to run a vector search in LlamaIndex, issue a separate query via `opensearch-py`, and then merge the results. It would be nice to keep all of this within LlamaIndex.

Value of Feature
If I see it correctly, this currently has to be implemented in a very custom approach, which makes hybrid search cumbersome to implement.
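To make the workaround concrete, here is a hedged sketch of the merge step it would require: after running a LlamaIndex vector query and a separate BM25 query via `opensearch-py`, the two ranked hit lists have to be combined manually. Reciprocal rank fusion (RRF) is one simple strategy for this; the doc IDs below are illustrative:

```python
def rrf_merge(vector_hits: list[str], bm25_hits: list[str], k: int = 60) -> list[str]:
    """Merge two ranked lists of doc IDs with reciprocal rank fusion:
    each document scores sum(1 / (k + rank)) over the lists it appears in."""
    scores: dict[str, float] = {}
    for hits in (vector_hits, bm25_hits):
        for rank, doc_id in enumerate(hits, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)
```

A document appearing high in both lists (here `"b"`) outranks one appearing high in only one list, which is exactly the behavior the native hybrid query's score normalization would provide out of the box:

```python
rrf_merge(["a", "b", "c"], ["b", "d"])  # "b" ranks first
```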