airsplay / lxmert

PyTorch code for EMNLP 2019 paper "LXMERT: Learning Cross-Modality Encoder Representations from Transformers".
MIT License

Some question about nlvr #96

Open haoopan opened 3 years ago

haoopan commented 3 years ago

Hi, sorry to disturb you. I have a question about running your code on NLVR2. When I remove the pre-trained model and train NLVR2 from scratch, the results are:

Epoch 0: Train 50.31, Valid 50.86, Best 50.86

Epoch 1: Train 50.39, Valid 49.14, Best 50.86

Epoch 2: Train 50.44, Valid 49.14, Best 50.86

Epoch 3: Train 50.57, Valid 50.86, Best 50.86

So, how can I train NLVR2 without the pre-trained model?
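For context, this is roughly how the run was launched. A minimal sketch of training from scratch, assuming the flags used in the repo's `run/nlvr2_finetune.bash` script and the `src/tasks/nlvr2.py` entry point (the key difference from normal finetuning is simply omitting the `--loadLXMERT` flag so no pre-trained weights are loaded):

```shell
# Sketch: finetune NLVR2 from random initialization by omitting --loadLXMERT.
# Flag names are assumed from run/nlvr2_finetune.bash; adjust paths to your setup.
CUDA_VISIBLE_DEVICES=0 python src/tasks/nlvr2.py \
    --train train --valid valid \
    --llayers 9 --xlayers 5 --rlayers 5 \
    --batchSize 32 --optim bert --lr 5e-5 --epochs 4 \
    --tqdm --output snap/nlvr2/from_scratch
```

Note that without pre-trained weights the cross-modality encoder starts from random initialization, so accuracy near 50% (chance level on NLVR2's binary True/False task) in the first few epochs is expected.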