PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
BSD 3-Clause "New" or "Revised" License
Batch size for pre-training #163
Open
aries-young opened 1 year ago
Hello, I wonder which batch size should be chosen for pre-training. In configs/pretrain.yaml, batch_size=75, but the paper reports a batch size of 2880 for ViT-B.
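
For context, one common source of this kind of discrepancy is that a config value is per GPU and gets multiplied by the number of distributed processes at runtime. Below is a minimal sketch of how the effective (global) batch size is typically computed under torch.distributed; it assumes the config value is per GPU, which is not confirmed by the repo, so treat it as an illustration rather than the answer:

```python
import torch.distributed as dist

def effective_batch_size(per_gpu_batch_size: int) -> int:
    """Global batch size under DistributedDataParallel:
    the per-GPU batch multiplied by the number of processes (world size)."""
    world_size = dist.get_world_size() if dist.is_initialized() else 1
    return per_gpu_batch_size * world_size

# Example: batch_size=75 from configs/pretrain.yaml on a single 8-GPU node
# would give a global batch of 75 * 8 = 600 per step; whether and how this
# relates to the 2880 reported in the paper is exactly the question here.
```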