zamling / PSALM

[ECCV2024] This is an official implementation for "PSALM: Pixelwise SegmentAtion with Large Multi-Modal Model"
Apache License 2.0

Number of Epochs during Training #12

Open YunzeMan opened 2 months ago

YunzeMan commented 2 months ago

Hi, I wonder how many epochs you train PSALM for during the fine-tuning phase?

In your training script num_train_epochs is set to 10, but your paper states that 56k iterations with batch_size 64 are used during training. With num_train_epochs=10, the learning rate barely decreases and the loss does not go down for a long time. Should we decrease num_train_epochs to a smaller number, for example 3 or 4?
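For context, a minimal sketch of the knobs I mean, assuming the script builds a HuggingFace `TrainingArguments` (the `output_dir` here is hypothetical). Setting `max_steps` to a positive value overrides `num_train_epochs`, which would pin the run to a fixed iteration count instead:

```python
from transformers import TrainingArguments

# Sketch only: output_dir is a placeholder, not the repo's actual path.
args = TrainingArguments(
    output_dir="./psalm_finetune",    # hypothetical output directory
    per_device_train_batch_size=64,   # effective batch size 64 from the paper
    num_train_epochs=10,              # ignored once max_steps > 0
    max_steps=56_000,                 # stop after ~56K optimizer steps
)
```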

Thanks!

zamling commented 1 month ago

Hi @YunzeMan, we did 56K iterations in total during training. If you train only on the segmentation data, the iteration count for 10 epochs can be calculated as:

(~120,000 (COCO) + ~120,000 (RefCOCO) + ~120,000 (COCO-Interactive)) * 10 (epochs) / 64 (batch size) ≈ 56K
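A quick back-of-the-envelope check of this arithmetic (a sketch; the dataset sizes are the approximate counts above, not exact):

```python
# Rough sanity check of the iteration count quoted above.
dataset_sizes = {
    "COCO": 120_000,              # approximate
    "RefCOCO": 120_000,           # approximate
    "COCO-Interactive": 120_000,  # approximate
}
epochs = 10
batch_size = 64

iterations = sum(dataset_sizes.values()) * epochs / batch_size
print(f"~{iterations:,.0f} iterations")  # ~56,250, i.e. ~56K
```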

And when adding the QA dataset, we keep this total number of iterations and average it across these four tasks.

If the loss is confusing you, you can send your loss curve for each loss and maybe I can help you find the problem.