Open filippo82 opened 6 months ago
Hi @filippo82, I think it would be cool to add a distillation loss, of course. I plan to improve the loss function of the train module in the coming weeks; there won't be any breaking changes.
Before releasing the centroids algorithm of ColBERTv2, I plan to release another method to accelerate the ColBERT retriever. It will be really fast and still accurate; work in progress.
Hi @raphaelsty 👋🏻 thanks a lot for your reply and sorry for the slooow reply.
Let me know if there is any way I can help with testing/debugging.
Hi @filippo82, I released neural-cherche 1.1.0, which improves loss stability and brings better default parameters to the models. I also released neural-tree in order to accelerate ColBERT.
Feel free to open a PR in neural-cherche if you are interested in knowledge distillation.
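In case it helps anyone picking this up, here is a minimal sketch of what a distillation loss could look like, in the spirit of Hinton-style knowledge distillation. It is written in plain Python for clarity; a real PR would of course operate on torch tensors, and the function names here are hypothetical, not neural-cherche's API:

```python
import math

def softmax(scores, temperature=1.0):
    """Turn raw scores into a probability distribution,
    optionally softened by a temperature > 1."""
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_scores, teacher_scores, temperature=2.0):
    """KL divergence between the (softened) teacher and student
    distributions, scaled by T^2 to keep gradient magnitudes
    comparable across temperatures."""
    p_teacher = softmax(teacher_scores, temperature)
    p_student = softmax(student_scores, temperature)
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student))
    return kl * temperature ** 2
```

When the student's scores match the teacher's, the loss is zero; the more their rankings diverge, the larger it gets.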
Hi @raphaelsty,
first of all, thanks a lot for this project. I really appreciate its simplicity and effectiveness.
Question: do you have any plans to implement ColBERT V2?
Best wishes.