FlagOpen / FlagEmbedding

Retrieval and Retrieval-augmented LLMs
MIT License

bge-m3 default representation #443

Open nico2rdj opened 6 months ago

nico2rdj commented 6 months ago

Hello :)

First thank you for your amazing work!

When using bge-m3 within LangChain, what is the default representation of the encoding? Dense, or a mix of different types (dense, sparse, ...)?

staoxiao commented 6 months ago

Thanks for your interest in our work! In LangChain, you can use HuggingFaceEmbeddings to load bge-m3, which uses the sentence-transformers library to generate dense embeddings.

wayne2tech commented 5 months ago

> Thanks for your interest in our work! In LangChain, you can use HuggingFaceEmbeddings to load bge-m3, which uses the sentence-transformers library to generate dense embeddings.

How can I generate sparse vectors with HuggingFaceEmbeddings?

staoxiao commented 5 months ago

> Thanks for your interest in our work! In LangChain, you can use HuggingFaceEmbeddings to load bge-m3, which uses the sentence-transformers library to generate dense embeddings.

> How can I generate sparse vectors with HuggingFaceEmbeddings?

Please refer to https://github.com/FlagOpen/FlagEmbedding/issues/585.