GPL is a powerful unsupervised domain adaptation method for dense retrieval. It requires only an unlabeled target corpus and yields substantial improvements: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
README.md: Added a hint on installing PyTorch correctly with respect to the local CUDA version.
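For example, the matching PyTorch wheel can be installed from the official PyTorch package index; the `cu118` tag below is an assumption about the local CUDA version and should be adjusted to the output of `nvidia-smi`:

```shell
# Check which CUDA driver version is installed locally
nvidia-smi

# Install a PyTorch build compiled against CUDA 11.8
# (replace cu118 with the tag matching your CUDA version, e.g. cu121)
pip install torch --index-url https://download.pytorch.org/whl/cu118
```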
gpl/toolkit/beir.py: formatted with Black
gpl/toolkit/dataset.py: formatted with Black
gpl/toolkit/evaluation.py: formatted with Black
gpl/toolkit/log.py: formatted with Black
gpl/toolkit/loss.py: formatted with Black
gpl/toolkit/mine.py: formatted with Black
gpl/toolkit/mnrl.py: formatted with Black
gpl/toolkit/pl.py: formatted with Black
gpl/toolkit/qgen.py: formatted with Black
gpl/toolkit/reformat.py: formatted with Black
gpl/toolkit/rescale.py: formatted with Black
gpl/toolkit/resize.py: formatted with Black
gpl/toolkit/sbert.py: formatted with Black
gpl/train.py: formatted with Black
setup.py: Added protobuf, which is required by T5 but does not seem to be installed by transformers alone; specified ees>=0.0.8 (keeping the Elasticsearch version in sync with the one required by beir)
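A minimal sketch of what the corresponding `install_requires` section of `setup.py` could look like; the package name `gpl` and the unpinned `beir`/`protobuf` entries are illustrative assumptions, only `ees>=0.0.8` is taken from the change above:

```python
from setuptools import setup, find_packages

setup(
    name="gpl",
    packages=find_packages(),
    install_requires=[
        "beir",
        "protobuf",    # required by the T5 tokenizer; not pulled in by transformers alone
        "ees>=0.0.8",  # Elasticsearch version kept in sync with beir's requirement
    ],
)
```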