SHTUPLUS / Pix2Grp_CVPR2024

BSD 3-Clause "New" or "Revised" License

Missing pre-trained weights #11

Open afiff2 opened 2 months ago

afiff2 commented 2 months ago

Thank you for your work on SGG.

I placed the BLIP pre-trained weights in cache/ckpts. However, when I start training, the log still reports "Missing keys ['text_decoder.bert.embeddings.word_embeddings.weight', 'text_decoder.cls.predictions.decoder.weight', ...]", listing over 100 missing weights. Is this normal, or have I made a mistake?
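
A minimal way to check whether the reported keys exist in the downloaded checkpoint at all (the checkpoint filename below is a placeholder, not the repo's actual file name):

```python
# Sketch: inspect the BLIP checkpoint placed under cache/ckpts and count how many
# of the reported-missing "text_decoder." weights it actually contains.
# "blip_pretrained.pth" is a placeholder filename.
import torch

ckpt = torch.load("cache/ckpts/blip_pretrained.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)  # LAVIS-style checkpoints often nest weights under "model"

decoder_keys = [k for k in state_dict if k.startswith("text_decoder.")]
print(f"total keys in checkpoint: {len(state_dict)}")
print(f"text_decoder.* keys in checkpoint: {len(decoder_keys)}")
```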

Scarecrow0 commented 2 months ago

The URL of the pretrained weights is specified at: https://github.com/SHTUPLUS/Pix2Grp_CVPR2024/edit/main/lavis/models/blip_models/blip_rel_det.py#L1604.

You might need to manually download the weights from https://huggingface.co/google-bert/bert-base-uncased into the data directory.
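
For example, a minimal download sketch using huggingface_hub (the target directory is an assumption; adjust it to the directory the repo's config actually reads from):

```python
# Sketch: fetch google-bert/bert-base-uncased into a local directory so the
# BERT text decoder/tokenizer can be initialised offline.
# "data/bert-base-uncased" is a placeholder target path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="google-bert/bert-base-uncased",
    local_dir="data/bert-base-uncased",
)
```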

afiff2 commented 2 months ago

I have checked the code and believe the issue arises from the BLIP pre-trained weights not being loaded successfully. I would greatly appreciate it if you could provide a solution.