Closed rB080 closed 1 year ago
~35 mIoU for this repository. More and better results will be added in the next version of CLIMS.
I think that's consistent with what we are getting. Thanks a lot!
Hi @ProjectDisR Are you saying that you got an mIoU of ~0.41? If yes, is this for the pseudo-labels of the train or val split? Like you said, I also assume we should use the generated pseudo-labels to train DeepLab and then report the validation mIoU scores? Let me know what you think.
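For anyone comparing numbers, here is a minimal sketch of how pseudo-label mIoU is usually computed (confusion-matrix style, with 255 as the ignore index); this is a generic implementation, not code from this repository, and the class count is an assumption:

```python
import numpy as np

def update_confusion(conf, pred, gt, num_classes):
    # Accumulate a confusion matrix from flattened prediction / ground-truth
    # label maps, skipping pixels marked 255 (the usual ignore index).
    mask = gt != 255
    idx = num_classes * gt[mask].astype(int) + pred[mask].astype(int)
    conf += np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
    return conf

def miou(conf):
    # Per-class IoU = TP / (TP + FP + FN); mIoU is the mean over classes.
    # Classes absent from both pred and gt contribute IoU 0 here, which can
    # lower the mean; some evaluators exclude them instead.
    tp = np.diag(conf)
    union = conf.sum(0) + conf.sum(1) - tp
    iou = tp / np.maximum(union, 1)
    return iou.mean()
```

Whether the ~0.34/~0.41 numbers are for the pseudo-labels themselves or for a DeepLab model trained on them makes a big difference, so it would help to state which evaluation was run.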
Hi!
We tried running the COCO code and got an mIoU of ~0.34. I am not sure if this is what we should expect, so it would be great if you could add baseline results to the repository. We did not change the default settings.
Thanks