JacobYuan7 / RLIPv2

[ICCV 2023] RLIPv2: Fast Scaling of Relational Language-Image Pre-training
Apache License 2.0

Can't figure out how to run inference with pre-trained weights #21

Open lunaluxie opened 2 months ago

lunaluxie commented 2 months ago

Hello,

Excellent work! I'm trying to adapt it to another dataset, but I am struggling to find out how to load pretrained weights.

Specifically, I'm looking at the fully fine-tuned weights on HICO-DET for the RLIPv2-ParSeDA model with a ResNet-50 backbone.

I have downloaded the weights, and thought I could run inference with the command

python3 inference_on_custom_imgs_hico.py --batch_size 1 --param_path RLIP_PDA_v2_HICO_R50_VGCOO365_COO365det_RQL_LSE_RPL_20e_L1_20e_checkpoint0019.pth --save_path out --backbone resnet50 --RLIP_ParSeDA_v2

But I get an error saying that a number of keys are missing from the state_dict. I have looked through the different arguments, but I can't figure out what the appropriate command is.

Do you have any documentation that specifies how to load the appropriate model for the different pretrained weights?

JacobYuan7 commented 2 months ago

@lunaluxie

Many thanks for your interest in my work.

Actually, the file 'inference_on_custom_imgs_hico.py' was originally created for RLIPv1 (https://github.com/JacobYuan7/RLIP). Unfortunately, I did not add RLIPv2 support to this file at the time of the code release. (You can see that RLIPv2 is not imported at line 39 of inference_on_custom_imgs_hico.py.) So, if you want to run inference directly, you will need to modify the file a bit to add RLIPv2 support.

If you want to test the pre-trained weights, you can load them via 'https://github.com/JacobYuan7/RLIPv2/blob/main/scripts/RLIP_ParSeDA/fine_tune_RLIP_ParSeDA_v2_hico.sh', with the number of epochs set to 0 so the script only evaluates.

Feel free to ask follow-up questions.

lunaluxie commented 2 months ago

Thank you so much!