[New Model]: BAAI/bge-m3 #9847

Open javiplav opened 1 week ago

javiplav commented 1 week ago

The model to consider.

https://huggingface.co/BAAI/bge-m3

The closest model vllm already supports.

No response

What's your difficulty of supporting the model you want?

No response

Before submitting a new issue...
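
For reference, here is a minimal sketch of how this model would presumably be used offline once supported, following the pattern vLLM's documentation shows for existing embedding models (e.g. intfloat/e5-mistral-7b-instruct). Loading BAAI/bge-m3 this way does not work today; that is exactly what this issue requests.

```python
# Hypothetical offline usage, mirroring vLLM's existing embedding-model API.
# BAAI/bge-m3 is NOT supported yet; this is the usage being requested here.
from vllm import LLM

prompts = [
    "What is BGE-M3?",
    "BGE-M3 is a multilingual embedding model for dense retrieval.",
]

llm = LLM(model="BAAI/bge-m3", enforce_eager=True)

# encode() returns one EmbeddingRequestOutput per prompt.
outputs = llm.encode(prompts)
for prompt, output in zip(prompts, outputs):
    embedding = output.outputs.embedding  # list[float] dense vector
    print(f"{prompt!r} -> {len(embedding)}-dim embedding")
```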

rpvelloso commented 5 days ago

+1

This model works well with Portuguese; it would be great to have it available through the OpenAI-compatible API.
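
What is being asked for is presumably something like the following: querying vLLM's OpenAI-compatible /v1/embeddings endpoint with the standard openai client. This is only a sketch, under the assumption that the server could be started with `vllm serve BAAI/bge-m3`, which is not possible until the model is supported.

```python
# Hypothetical client-side usage against a vLLM OpenAI-compatible server
# that was (hypothetically) started with: vllm serve BAAI/bge-m3
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.embeddings.create(
    model="BAAI/bge-m3",
    input=["Isto é uma frase em português.", "This is an English sentence."],
)

for item in response.data:
    print(f"index={item.index}, dim={len(item.embedding)}")
```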