AdeDZY / DeepCT

DeepCT and HDCT use BERT to generate novel, context-aware bag-of-words term weights for documents and queries.
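For orientation, here is a minimal sketch of the general idea (a per-token regression head on top of a BERT encoder), assuming PyTorch and the HuggingFace `transformers` package; it is illustrative only and not the repository's actual code, and the class and variable names are made up for this example.

```python
# Illustrative sketch: regress one scalar term weight per contextualized token.
# Not the repo's implementation; names here are hypothetical.
import torch
from transformers import AutoTokenizer, AutoModel

class ToyTermWeighter(torch.nn.Module):
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        # One scalar weight per contextualized token embedding.
        self.regressor = torch.nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, texts, tokenizer):
        batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
        hidden = self.encoder(**batch).last_hidden_state  # [batch, tokens, hidden]
        weights = self.regressor(hidden).squeeze(-1)       # [batch, tokens]
        return batch, torch.relu(weights)                  # keep weights non-negative

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = ToyTermWeighter()
enc, w = model(["deepct reweights terms by their importance in context"], tokenizer)
for tok, score in zip(tokenizer.convert_ids_to_tokens(enc["input_ids"][0]), w[0].tolist()):
    print(f"{tok:>12s}  {score:.3f}")
```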

How to choose K in top-K retrieval when using PRF (pseudo-relevance feedback)? #8

Open Executedone opened 4 years ago

Executedone commented 4 years ago

Thank you for your code! I found an enhanced approach called PRF in your new paper "Context-Aware Document Term Weighting for Ad-Hoc Search": it seems to retrieve the top-K documents for a given query and then construct relevance labels between those documents and the queries. So how should K be chosen (K=10? K=30?), and how much does it matter for the ground-truth term weights? Thanks!
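As a rough illustration of where K enters such a PRF step, the sketch below retrieves the top-K documents per query with a BM25 first-stage ranker (here via the `rank_bm25` package) and treats them as pseudo-relevant. The corpus, queries, and labeling scheme are placeholders, and the paper's actual pipeline may differ.

```python
# Minimal sketch of a top-K pseudo-relevance feedback step (illustrative only).
from rank_bm25 import BM25Okapi

corpus = [
    "deep learning for information retrieval",
    "bm25 is a classic ranking function",
    "context aware term weighting with bert",
]
queries = ["term weighting with bert"]

K = 10  # the top-K cutoff discussed in this issue; smaller K = stricter pseudo-relevance

tokenized_corpus = [doc.split() for doc in corpus]
bm25 = BM25Okapi(tokenized_corpus)

pseudo_relevant = {}
for q in queries:
    scores = bm25.get_scores(q.split())
    # Take the K highest-scoring documents as pseudo-relevant for this query.
    top_k = sorted(range(len(corpus)), key=lambda i: scores[i], reverse=True)[:K]
    pseudo_relevant[q] = top_k

print(pseudo_relevant)
```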

Daniil200707 commented 6 months ago

K=10