-
What's the easiest way to use `ColBERT` without loading the full index into memory? We are building an index off of the `wiki_dpr` dataset (and eventually more), so we have about 21 million passages …
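ColBERT's actual index format aside, the general trick is to keep the stored embeddings on disk and memory-map them, so a lookup only pages in the bytes it touches. A minimal sketch, assuming a toy flat file of float32 vectors (the file name, dimension, and layout here are illustrative, not ColBERT's real on-disk format):

```python
import mmap
import os
import struct
import tempfile

DIM = 4  # toy embedding width; real ColBERT token embeddings are much wider

# Write a toy flat index of float32 vectors to disk (stand-in for a real index).
path = os.path.join(tempfile.mkdtemp(), "index.bin")
with open(path, "wb") as f:
    for i in range(3):
        f.write(struct.pack(f"{DIM}f", *[float(i * DIM + j) for j in range(DIM)]))

def read_vector(mm, idx, dim=DIM):
    """Read one vector by offset; only its bytes are paged in, not the file."""
    off = idx * dim * 4  # 4 bytes per float32
    return list(struct.unpack(f"{dim}f", mm[off:off + dim * 4]))

with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    vec = read_vector(mm, 2)
    mm.close()
```

A real setup would apply the same idea to the per-passage embedding shards, with an ANN structure on top for candidate generation.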
-
**Question**
Can you tell me what input is passed to the ranker model, and how? I searched the web but found nothing related to this. It would be helpful if you could show me the proper way of …
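For BERT-style cross-encoder rankers in general, the input is the query and the candidate passage packed into one sequence. A minimal sketch of that pairing (the special-token layout shown is the standard BERT convention; in practice the tokenizer inserts these tokens for you, so this is only for illustration):

```python
def make_ranker_input(query, passage):
    """Pack a (query, passage) pair the way BERT-style cross-encoders
    expect it: [CLS] query [SEP] passage [SEP]. The model reads the pair
    jointly and scores relevance from the [CLS] position."""
    return f"[CLS] {query} [SEP] {passage} [SEP]"

# One input per candidate passage; the ranker scores each pair independently.
inputs = [make_ranker_input("what is dense retrieval", p)
          for p in ("Dense retrieval maps text to vectors.",
                    "The weather is nice today.")]
```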
-
Hi, I've just come across your amazing results on the MS MARCO passage ranking task and the related paper.
But I see neither a tutorial nor any example code on how to incorporate BERT into your framework.
Will…
-
Hi,
I just published our Margin-MSE ensemble-trained, DistilBERT-based checkpoint for dense passage retrieval here: https://huggingface.co/sebastian-hofstaetter/distilbert-dot-margin_mse-T2-msmarco…
-
paper: https://arxiv.org/pdf/2007.00808.pdf
https://github.com/microsoft/ANCE provides encoder checkpoints
-
Use the passages as input and the relevant queries as the true labels.
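Spelled out with toy data (the ids and texts below are made up for illustration; real MS MARCO ships queries, passages, and qrels as separate tsv files):

```python
# Toy corpus and relevance judgments (qrels): query id -> relevant passage id.
queries = {"q1": "who wrote hamlet", "q2": "capital of france"}
passages = {"p9": "Hamlet is a tragedy written by William Shakespeare.",
            "p3": "Paris is the capital and largest city of France."}
qrels = {"q1": "p9", "q2": "p3"}

# Training pairs: the passage text is the input, its relevant query the label.
train_pairs = [(passages[pid], queries[qid]) for qid, pid in qrels.items()]
```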
-
Hi,
I was trying to train a BERT model for MS MARCO Passage Ranking. According to the bash script, a ‘queries.train.small.tsv’ file is needed, but I didn't find a download link on the MS MARCO website…
-
Can you give me a working example of this?

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("amberoad/bert-multilingual-passage-r…
```
-
I'm trying to precisely interpret Table 1, looking at the arXiv version of the paper.
For HotpotQA, you report R@20 of 80.2%, which is defined as:
> On HotpotQA the metric is recall at the top k…
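For reference, recall@k over a ranked list can be sketched as follows (toy ids; `recall_at_k` is an illustrative helper, not code from the repo):

```python
def recall_at_k(ranked_ids, relevant_ids, k=20):
    """Fraction of the gold passages that appear in the top-k of the ranking."""
    top_k = set(ranked_ids[:k])
    return len(top_k & set(relevant_ids)) / len(relevant_ids)

# Toy ranking: only one of the two gold passages ("d1", "d2") is in the top 3.
score = recall_at_k(["d5", "d1", "d7", "d2"], ["d1", "d2"], k=3)
```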
okhat updated 3 years ago