raphaelsty / neural-cherche

Neural Search
https://raphaelsty.github.io/neural-cherche/
MIT License

[Feature request] ColBERT V2 #14

Open filippo82 opened 6 months ago

filippo82 commented 6 months ago

Hi @raphaelsty,

first of all, thanks a lot for this project. I really appreciate its simplicity and effectiveness.

Question: do you have any plans to implement ColBERT V2?

Best wishes.

raphaelsty commented 6 months ago

Hi @filippo82, I think it would be cool to add the distillation loss, of course. I plan to improve the loss functions of the train module in the coming weeks; there won't be any breaking changes.
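
For readers who want a concrete picture: the distillation loss used to train ColBERT V2 pushes the student's (ColBERT's) score distribution over a positive and several negatives towards a teacher's (typically a cross-encoder's) distribution via a KL divergence. Below is a minimal PyTorch sketch of that idea; the function and tensor names are illustrative and do not correspond to the neural-cherche API.

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_scores: torch.Tensor, teacher_scores: torch.Tensor) -> torch.Tensor:
    """KL-divergence distillation over in-batch passages, à la ColBERT V2.

    Both tensors have shape (batch_size, n_passages): each query is scored
    against one positive and several negative passages.
    """
    # Teacher (e.g. a cross-encoder) provides the target distribution.
    teacher_probs = F.softmax(teacher_scores, dim=-1)
    # Student (ColBERT) log-probabilities over the same passages.
    student_log_probs = F.log_softmax(student_scores, dim=-1)
    # KL(teacher || student), averaged over the batch.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")


# Illustrative usage with random scores for 8 queries and 4 passages each.
student = torch.randn(8, 4, requires_grad=True)
teacher = torch.randn(8, 4)
loss = distillation_loss(student, teacher)
loss.backward()
```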

Before releasing the centroid-based algorithm of ColBERT V2, I plan to release another method to accelerate the ColBERT retriever. It will be really fast and still accurate; work in progress.
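
As a rough sketch of the ColBERT V2 centroid idea mentioned above: token embeddings are clustered with k-means, each token is stored as a centroid id plus a small residual, and at query time only the tokens attached to the centroids nearest to each query token are scored. The snippet below illustrates this with faiss on random data; it is an assumption-laden sketch, not the implementation planned for neural-cherche.

```python
import faiss
import numpy as np

# Token-level embeddings produced by a ColBERT-style encoder (random here).
dim, n_tokens, n_centroids = 128, 10_000, 256
token_embeddings = np.random.randn(n_tokens, dim).astype("float32")

# 1. Train centroids over all token embeddings.
kmeans = faiss.Kmeans(d=dim, k=n_centroids, niter=20)
kmeans.train(token_embeddings)

# 2. Assign each token to its nearest centroid; only the centroid id and a
#    small residual (embedding - centroid) need to be stored in the index.
_, assignments = kmeans.index.search(token_embeddings, 1)
residuals = token_embeddings - kmeans.centroids[assignments[:, 0]]

# 3. At query time, candidates are gathered from the centroids closest to
#    each query token, which avoids scoring every document token.
query_tokens = np.random.randn(32, dim).astype("float32")
_, candidate_centroids = kmeans.index.search(query_tokens, 4)  # 4 probes per token
```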

filippo82 commented 5 months ago

Hi @raphaelsty 👋🏻 thanks a lot for your reply and sorry for my slooow response.

Let me know if there is any way I can help with testing/debugging.

raphaelsty commented 5 months ago

Hi @filippo82, I released neural-cherche 1.1.0, which improves loss stability and brings better default parameters to the models. I also released neural-tree in order to accelerate ColBERT.

Feel free to open a PR in neural-cherche if you are interested in knowledge distillation.