Closed seganrasan closed 9 months ago

Hi @raphaelsty, could you please let me know the licence of the following splade/sparsembed models hosted on Hugging Face?

https://huggingface.co/raphaelsty/splade-max
https://huggingface.co/raphaelsty/distilbert-splade
https://huggingface.co/raphaelsty/sparsembed-max
https://huggingface.co/raphaelsty/distilbert-sparsembed
Sparsembed models are fully open-source.
I'm not sure about Splade, since the original work is not open-source. I wrote the code from scratch and did not reuse any code or weights from the authors. On my side, I put an MIT License on the repository, so I don't know whether the authors' license applies to this specific Splade model.
These models on HuggingFace come from a previous version of the lib that did not add the [Q] and [D] prefixes to queries and documents. You may want to fine-tune a new model on MS MARCO in order to build a better one.
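For reference, here is a minimal sketch of how a SPLADE-style sparse vector can be computed from one of these checkpoints using plain Hugging Face transformers. This is not the official sparsembed API, and the [Q]/[D] prefix handling is only an assumption based on the comment above (these older checkpoints were trained without prefixes):

```python
# Minimal sketch, not the official sparsembed API: computes a SPLADE-style
# sparse representation from one of the checkpoints discussed above.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

checkpoint = "raphaelsty/splade-max"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)
model.eval()

def encode(text: str, prefix: str = "") -> torch.Tensor:
    """Return a vocabulary-sized sparse activation vector for `text`.

    `prefix` would be "[Q] " for queries or "[D] " for documents on a model
    fine-tuned with prefixes; leave it empty for these older checkpoints.
    """
    inputs = tokenizer(prefix + text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)
    # SPLADE-max activation: log(1 + ReLU(logits)), max-pooled over tokens.
    return torch.log1p(torch.relu(logits)).max(dim=1).values.squeeze(0)

query = encode("what is sparse retrieval?")
print(f"{query.nonzero().numel()} non-zero terms out of {query.numel()}")
```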
thanks @raphaelsty, this really helps.
I had the exact same question; it might be a good idea to put that in the README.
+1. @raphaelsty, it would be great if you could update the licence info on the Hugging Face pages, especially for:
https://huggingface.co/raphaelsty/sparsembed-max
https://huggingface.co/raphaelsty/splade-max
@seganrasan SparseEmbed is open-source. I don't know about splade-max, which I think is non-commercial only, even though I wrote brand new code and trained the model with it.
Thanks @raphaelsty. It would be really helpful if you could update the licence details of the SparseEmbed model on the Hugging Face page. https://huggingface.co/raphaelsty/sparsembed-max
Updated the README and the Hugging Face model checkpoints.
thanks @raphaelsty