Closed: yolsever closed this issue 3 years ago
Hi there, did you solve this problem? I'm running into the same thing.
Hi, I'm wondering if you solved the problem? I'm also seeing the same thing :(
Hello, did you solve this problem? I'm getting the same issue when I'm trying to validate the retriever with my own dataset :(
I encountered the same issue. It seems the cause is that `ctx_datatsets` was not configured correctly. If you're running `nq` with `dpr_wiki`, here is the command that worked for me. The trick is that `ctx_datatsets` needs to be `[dpr_wiki]` (with brackets):
```
python dense_retriever.py \
  model_file="/home/zekun/mapqa/DPR/downloads/checkpoint/retriever/single/nq/bert-base-encoder.cp" \
  qa_dataset=nq_test \
  ctx_datatsets=[dpr_wiki] \
  encoded_ctx_files=["/home/zekun/mapqa/DPR/downloads/data/retriever_results/nq/single/wikipedia_passages_*.pkl"] \
  out_file="out_json.json"
```
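Once that run finishes, a quick way to sanity-check the results is to load the output file and look at the top hits. Here is a minimal sketch, assuming the output schema `dense_retriever.py` writes as far as I can tell (a JSON list of entries with `question`, `answers`, and a ranked `ctxs` list whose items carry `id`, `title`, `text`, `score`, and `has_answer`):

```python
import json

# Load the retriever output written by dense_retriever.py
# (same path as out_file in the command above).
with open("out_json.json") as f:
    results = json.load(f)

# Print the top-3 retrieved passages for the first few questions.
for item in results[:5]:
    print("Q:", item["question"])
    for ctx in item["ctxs"][:3]:
        # has_answer flags whether a gold answer string was found in the passage.
        print(f"  score={ctx['score']} has_answer={ctx['has_answer']} :: {ctx['title']}")
```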
Dear Vlad,
When I try to validate the retriever against the entire set of documents, I get the following error. For context, for `ctx_datatsets` I am using the same input as the `ctx_src` for the `generate_dense_embeddings.py` script, so it is a CSV file with columns in the order `['id', 'text', 'title']`. Thank you!
Also, if I feed `ctx_datatsets=[pm_dev]` instead of `ctx_datatsets=pm_dev`, I get the following error:
Best regards, Kaan