facebookresearch / dpr-scale

Scalable training for dense retrieval models.

what kind of bug might happen when num_workers > 0? #7

Closed (Liangtaiwan closed this issue 1 year ago)

Liangtaiwan commented 1 year ago

Hi @ccsasuke

I noticed you mentioned that num_workers > 0 bugs out right now: https://github.com/facebookresearch/dpr-scale/blob/2a6d3906ee163c4f0025841a3e30ebf82ebf49bb/dpr_scale/datamodule/dpr.py#L167

However, when I set num_workers = 8, the code seems to work. Could you point out what kind of bug might happen? Or did you perhaps forget to remove the comment after fixing the bug, since some configs do set num_workers > 10? https://github.com/facebookresearch/dpr-scale/blob/da2f594d22b499dd8d45bd8d8e9d11455e2c5efc/dpr_scale/conf/wiki_ict.yaml#L24
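For context, the setting in question is just the `num_workers` argument passed to the PyTorch `DataLoader` from the datamodule, typically driven by the YAML config. Below is a minimal, hedged sketch of that pattern; it is not the dpr-scale implementation, and the class and parameter names (`ToyDataModule`, `JsonlDataset`, `batch_size`) are illustrative assumptions:

```python
# Minimal sketch (not the dpr-scale code): a PyTorch Lightning DataModule
# that forwards a configurable num_workers value to its DataLoader.
from torch.utils.data import DataLoader, Dataset
import pytorch_lightning as pl


class JsonlDataset(Dataset):
    """Toy dataset standing in for the real retrieval dataset."""

    def __init__(self, rows):
        self.rows = rows

    def __len__(self):
        return len(self.rows)

    def __getitem__(self, idx):
        return self.rows[idx]


class ToyDataModule(pl.LightningDataModule):
    def __init__(self, num_workers: int = 0, batch_size: int = 8):
        super().__init__()
        self.num_workers = num_workers  # usually populated from the YAML config
        self.batch_size = batch_size

    def train_dataloader(self):
        dataset = JsonlDataset(list(range(128)))
        # With num_workers > 0, DataLoader spawns worker processes, so the
        # dataset must be picklable and any file handles are best opened
        # lazily inside each worker rather than in __init__.
        return DataLoader(
            dataset,
            batch_size=self.batch_size,
            num_workers=self.num_workers,
            shuffle=True,
        )
```

Those worker-process constraints (picklability, per-worker file handles) are the usual reasons multi-worker loading can misbehave, though the original comment in dpr.py does not say which issue it was referring to.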

ccsasuke commented 1 year ago

Hi @Liangtaiwan, this comment was left a long time ago, and I haven't verified whether it's still the case. Good to hear that using multiple workers works for you; it's probably safe to remove the comment now.