salesforce / BLIP

PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
BSD 3-Clause "New" or "Revised" License

How to load the pre-trained BLIP model pth file into HuggingFace BLIP model? #149

Open adventure2165 opened 1 year ago

adventure2165 commented 1 year ago

I followed the pretrain.py code provided here and trained a model on custom data, modifying only the tokenizer to support Korean. Now I want to load the resulting pre-trained .pth file into Hugging Face's BLIP model. Is there a way to do this?

NielsRogge commented 1 year ago

Hi,

You can use the conversion script to convert the original BLIP checkpoint to the HF format: https://github.com/huggingface/transformers/blob/main/src/transformers/models/blip/convert_blip_original_pytorch_to_hf.py
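Conversion scripts like this one generally follow the same pattern: load the original `.pth` state dict with `torch.load`, rename its keys to match the HF module layout, load the renamed dict into the HF model, and save with `save_pretrained`. A minimal sketch of the key-renaming step is below; note that the rename rules shown are illustrative placeholders, not the real BLIP mapping, which lives in the linked script.

```python
import re

# Illustrative key-renaming rules (NOT the real BLIP mapping; the actual
# rules are defined in convert_blip_original_pytorch_to_hf.py). Each rule
# maps a prefix in the original checkpoint to the HF module layout.
RENAME_RULES = [
    (r"^visual_encoder\.", "vision_model.encoder."),  # assumed prefix mapping
    (r"^text_encoder\.", "text_model.encoder."),      # assumed prefix mapping
]

def rename_state_dict(state_dict):
    """Return a new state dict with keys rewritten via RENAME_RULES.

    Keys that match no rule are kept unchanged, so unrelated entries
    pass through untouched.
    """
    renamed = {}
    for key, value in state_dict.items():
        new_key = key
        for pattern, replacement in RENAME_RULES:
            new_key = re.sub(pattern, replacement, new_key)
        renamed[new_key] = value
    return renamed

# Toy example: plain values stand in for real weight tensors.
sd = {"visual_encoder.blocks.0.attn.qkv.weight": 0}
print(list(rename_state_dict(sd).keys()))
```

After renaming, the typical final steps are `model.load_state_dict(renamed_dict)` on an HF `BlipForConditionalGeneration` instance and `model.save_pretrained(output_dir)`; since your checkpoint uses a Korean tokenizer, you would also need to save that tokenizer alongside the model so the HF processor picks it up.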