tomeramit / SegDiff

does not work yet #12

Open yangluo23 opened 1 year ago

yangluo23 commented 1 year ago

Hi, thanks for open-sourcing the code. I ran it on the Vaihingen building dataset, following the vaih training command in your README.md and changing only the batch size to 2. But it did not work: the predicted masks were all "dark" (black), meaning the training went totally wrong and the model did not learn anything. Something weird is that after several steps, lg_loss_scale decreases continuously, finally reaching around -28773 (at that point I killed the run, since the training process seemed meaningless).
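For context: lg_loss_scale comes from the dynamic fp16 loss scaling in the improved-diffusion codebase that SegDiff builds on. The scale is only decremented when gradients overflow with NaN/inf (and that step is skipped), so a value that keeps falling means essentially every update is being discarded and training has diverged. A simplified sketch of that mechanism, with names following that codebase but the surrounding training loop condensed for illustration:

```python
import torch

# Minimal sketch (not SegDiff's exact code) of improved-diffusion-style
# dynamic fp16 loss scaling. The loss is multiplied by 2 ** lg_loss_scale
# before backward(); on overflow the scale shrinks and the step is skipped,
# otherwise gradients are unscaled and the scale grows slowly.

lg_loss_scale = 20.0       # loss multiplier is 2 ** lg_loss_scale
fp16_scale_growth = 1e-3   # slow growth after each clean step

def fp16_step(loss, model, opt):
    global lg_loss_scale
    opt.zero_grad()
    (loss * (2 ** lg_loss_scale)).backward()
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    if any(not torch.isfinite(g).all() for g in grads):
        lg_loss_scale -= 1   # overflow: shrink the scale, skip this update
        return
    for g in grads:
        g.mul_(2 ** -lg_loss_scale)  # undo the scaling before the update
    opt.step()
    lg_loss_scale += fp16_scale_growth  # grow slowly after a clean step
```

A value like -28773 means tens of thousands of consecutive overflowing steps, i.e. the model weights themselves have become non-finite, which matches the all-black predictions.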

am3338 commented 9 months ago

I managed to overcome this issue by decreasing the learning rate to 2e-5 in the training script. However, the results I obtain are still somewhat worse than what is reported in the paper. Here are the best values I obtained for Vaihingen:

| Metric | Obtained | Reported in paper |
| --- | --- | --- |
| mIoU | 89.72 | 91.12 |
| F1 | 94.41 | 95.14 |
| wCov | 92.09 | 93.83 |
| FBound | 83.38 | 85.09 |

Another modification I made in the script was changing the batch size from 4 to 8, as the latter is the value the authors report to have used in their experiments.
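For reference, the two changes above would look roughly like this, assuming SegDiff's training script follows the create_argparser()/defaults pattern of the improved-diffusion scripts it is derived from; the exact flag names and original default values are assumptions, not verified against the repo:

```python
# Hypothetical edit to the training script's defaults, assuming it follows
# improved-diffusion's create_argparser() pattern; names/values may differ.
defaults = dict(
    lr=2e-5,        # lowered (e.g. from 1e-4) to avoid the divergence above
    batch_size=8,   # matches the batch size the authors report in the paper
    # ... all other defaults left unchanged ...
)
```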

Can the authors please provide all parameters or the training script they used to get their reported results?

Thanks for the code!

string-ellipses commented 6 months ago

> I managed to overcome this issue by decreasing the learning rate to 2e-5 in the training script. However, the results I obtain are still somewhat worse than what is reported in the paper. Here are the best values I obtained for Vaihingen:
>
> mIoU: 89.72 (vs. 91.12), F1: 94.41 (vs. 95.14), wCov: 92.09 (vs. 93.83), FBound: 83.38 (vs. 85.09)
>
> Another modification I made in the script was changing the batch size from 4 to 8, as the latter is the value the authors report to have used in their experiments.
>
> Can the authors please provide all parameters or the training script they used to get their reported results?
>
> Thanks for the code!

Hello! May I ask how many iterations it took you to reach these results? I ran 40,000 iterations with a batch size of 8, but when I tested with model_4000.pt or ema.pt, the output images were all black.