jz462 / Large-Scale-VRD.pytorch

Implementation for the AAAI2019 paper "Large-scale Visual Relationship Understanding"
https://arxiv.org/abs/1804.10660
MIT License

The inconsistency between the readme and the downloaded weight files. #11

Closed FesianXu closed 5 years ago

FesianXu commented 5 years ago

Sorry for bothering you again, but I found that the structure of the downloaded weight files, including the file names, is inconsistent with your readme, which may cause some unexpected misunderstanding. For example, the directory structure in the readme is totally different from the downloaded files. I am confused and don't know which weight files are the correct ones to use for evaluation.

FesianXu commented 5 years ago

In detail, according to your readme, the folder `trained_models` should contain:

```
|-- trained_models
|   |-- e2e_relcnn_VGG16_8_epochs_vg_y_loss_only
|   |   |-- model_step125445.pth
|   |-- e2e_relcnn_X-101-64x4d-FPN_8_epochs_vg_y_loss_only
|   |   |-- model_step125445.pth
|   |-- e2e_relcnn_VGG16_8_epochs_vrd_y_loss_only
|   |   |-- model_step7559.pth
|   |-- e2e_relcnn_VGG16_8_epochs_vrd_y_loss_only_w_freq_bias
|   |   |-- model_step7559.pth
```

However, the weight files I downloaded only contain folders with names like:

```
vg_VGG16
vg_X-101-64x4d-FPN
vrd_VGG16_IN_pretrained
vrd_VGG16_COCO_pretrained
oi_mini_X-101-64x4d-FPN
oi_X-101-64x4d-FPN
```

I don't know which weights I should load for evaluation. Looking forward to your response; thanks, and have a nice day.
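If the downloaded folders do correspond to the configurations in the readme, one workaround while waiting for the author is to symlink them into the layout the evaluation scripts expect. Note that the correspondence used below (e.g. `vg_VGG16` mapping to `e2e_relcnn_VGG16_8_epochs_vg_y_loss_only`) is a guess, not something the author has confirmed, and the `oi_*` folders have no counterpart in the readme at all. A sketch:

```shell
#!/bin/sh
# Hypothetical mapping from the downloaded folder names to the directory
# layout described in the readme. The pairing below is a guess, not
# confirmed by the author; adjust it once the correct mapping is known.
set -e
mkdir -p trained_models
ln -sfn "$PWD/vg_VGG16" \
    trained_models/e2e_relcnn_VGG16_8_epochs_vg_y_loss_only
ln -sfn "$PWD/vg_X-101-64x4d-FPN" \
    trained_models/e2e_relcnn_X-101-64x4d-FPN_8_epochs_vg_y_loss_only
ln -sfn "$PWD/vrd_VGG16_COCO_pretrained" \
    trained_models/e2e_relcnn_VGG16_8_epochs_vrd_y_loss_only
```

This keeps the downloaded folders untouched, so the links can be repointed once the author clarifies which checkpoint belongs to which configuration.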

FesianXu commented 5 years ago

It seems the author re-trained the model, and he has given me the new pre-trained model: https://drive.google.com/file/d/1i9rOrmiiPba3SuYJeyP1qhTbDMlCAA4S/view I am going to check it out. Thanks; closing the issue.