airsplay / lxmert

PyTorch code for EMNLP 2019 paper "LXMERT: Learning Cross-Modality Encoder Representations from Transformers".
MIT License

PreTrain + BERT #76

Closed Muennighoff closed 4 years ago

Muennighoff commented 4 years ago

Hi, great repo @airsplay & team. I also really enjoyed your paper!

Could you provide the PreTrain + BERT model weights mentioned in your paper? For my specific case, I suspect they might outperform the PreTrain + Scratch model that is currently available for download. Appreciate the help, thanks!
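
For reference, if such a snapshot were shared, loading it would presumably look like loading any other PyTorch state-dict checkpoint into the fine-tuning model. A minimal sketch, assuming a standard `torch.save`-style checkpoint; the file path and the `load_pretrained_weights` helper below are illustrative assumptions, not details of an actual release:

```python
import torch

def load_pretrained_weights(model, ckpt_path="snap/pretrained/model_LXRT.pth"):
    # NOTE: the checkpoint path above is a hypothetical example, not a confirmed file.
    # Map to CPU so the checkpoint loads regardless of where it was saved.
    state_dict = torch.load(ckpt_path, map_location="cpu")
    # strict=False tolerates naming/head mismatches between a pre-training
    # checkpoint and a downstream fine-tuning model (e.g. missing answer heads).
    missing, unexpected = model.load_state_dict(state_dict, strict=False)
    print("Missing keys:", missing)
    print("Unexpected keys:", unexpected)
    return model
```

The `strict=False` flag is just a convenience here so that a partially matching checkpoint still loads; in practice one would inspect the printed key lists to confirm the cross-modality encoder weights were actually restored.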