castorini / rank_llm

RankLLM is a Python toolkit for reproducible information retrieval research using rerankers, with a focus on listwise reranking.
http://rankllm.ai
Apache License 2.0

P3 - Better retrieval caching reuse logic #91

Open ronakice opened 8 months ago

ronakice commented 8 months ago

Currently, I believe we require the exact top-$k$ file to be cached. But if, say, a top-100 file is already cached, we shouldn't redo retrieval for top-20 reranking; that is an unnecessary step.
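
A minimal sketch of this reuse logic is below. The cache layout it assumes (one JSON run file per dataset named `retrieve_results_{dataset}_top{k}.json`, each query entry holding a score-sorted `"hits"` list) is hypothetical and may not match RankLLM's actual on-disk format; the point is only to find any cached run whose depth covers the requested $k$ and truncate it instead of re-running retrieval.

```python
import json
from pathlib import Path


def load_cached_retrieval(cache_dir: str, dataset: str, requested_k: int):
    """Reuse a cached top-k' retrieval run for any requested_k <= k'.

    File naming and JSON structure are assumptions for illustration,
    not RankLLM's real cache format.
    """
    cache = Path(cache_dir)

    # Collect every cached depth available for this dataset, e.g. {20: path, 100: path}.
    candidates = {}
    for path in cache.glob(f"retrieve_results_{dataset}_top*.json"):
        try:
            cached_k = int(path.stem.rsplit("top", 1)[-1])
        except ValueError:
            continue
        candidates[cached_k] = path

    # Pick the smallest cached depth that still covers the request.
    usable = sorted(k for k in candidates if k >= requested_k)
    if not usable:
        return None  # no reusable cache; caller falls back to retrieval

    with open(candidates[usable[0]]) as f:
        results = json.load(f)

    # Truncate each query's hit list to the requested depth.
    for entry in results:
        entry["hits"] = entry["hits"][:requested_k]
    return results
```

With something like this, the caller would only invoke the retriever when `load_cached_retrieval` returns `None`, so a cached top-100 run transparently serves any top-20 reranking request.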