ljwztc / CLIP-Driven-Universal-Model

[ICCV 2023] CLIP-Driven Universal Model; Rank first in MSD Competition.

How many epochs did you train? #67

Closed sharonlee12 closed 4 months ago

sharonlee12 commented 7 months ago

I am hoping to find out how many epochs you trained for. I see it is set to 2000 in train.py, which seems too large.

studyhard2024 commented 6 months ago

How many epochs did you train for? I tried to run the code on my own dataset and ran 500 epochs, but the Dice score is too low. Did you run into the same situation?

JcWang20 commented 5 months ago

> How many epochs did you train for? I tried to run the code on my own dataset and ran 500 epochs, but the Dice score is too low. Did you run into the same situation?

Which dataset did you use? I used all of the data for training and ran 500 epochs on 8 A100s, but could not reach the results reported in the paper.

ljwztc commented 5 months ago

The final model weights were obtained after approximately 480 epochs of training. The BCE loss was very low, though I don't remember the exact number; the Dice loss was roughly 0.8. Since all of the files, including model weights and training logs, totalled more than 20 TB, they have been deleted from the NVIDIA server. I am sorry that I cannot provide more log files. If you need more assistance, please let me know.
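For context on what those numbers refer to, below is a minimal sketch of a per-class BCE + soft Dice loss of the kind mentioned above, written in PyTorch. It is a generic formulation for illustration only, not the repository's exact implementation; the names `soft_dice_loss` and `bce_dice_loss` are illustrative.

```python
import torch
import torch.nn.functional as F

def soft_dice_loss(pred, target, smooth=1e-5):
    """Soft Dice loss averaged over batch and classes.

    pred and target have shape (B, C, ...), with pred already passed
    through a sigmoid so its values lie in [0, 1].
    """
    dims = tuple(range(2, pred.dim()))
    intersection = (pred * target).sum(dim=dims)
    denom = pred.sum(dim=dims) + target.sum(dim=dims)
    dice = (2.0 * intersection + smooth) / (denom + smooth)
    return 1.0 - dice.mean()

def bce_dice_loss(logits, target):
    """Binary cross-entropy on logits plus soft Dice on sigmoid probabilities."""
    bce = F.binary_cross_entropy_with_logits(logits, target)
    dice = soft_dice_loss(torch.sigmoid(logits), target)
    return bce, dice
```

Under this formulation, a Dice loss of 0.8 would correspond to a mean soft Dice overlap of roughly 0.2.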

ljwztc commented 4 months ago

The bug in the Dice loss calculation has been addressed at this link. Consequently, we can now observe the expected decrease in the Dice loss.
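As a rough sanity check of that expected behaviour, using the illustrative `soft_dice_loss` sketch from the earlier comment (again, not necessarily the repository's own implementation): the loss should fall toward 0 as the predicted mask approaches the label and sit near 1 when there is no overlap.

```python
import torch

# Single-class toy label: a small cube of foreground voxels in an 8x8x8 volume.
target = torch.zeros(1, 1, 8, 8, 8)
target[:, 0, 2:6, 2:6, 2:6] = 1.0

print(soft_dice_loss(target.clone(), target).item())            # ~0.0, perfect overlap
print(soft_dice_loss(torch.zeros_like(target), target).item())  # ~1.0, no overlap
```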