Open aasharma90 opened 5 years ago
Yes, you are right. Have you run this repo with the Cityscapes IMG_MEAN? You are welcome to report that result.
Hi @speedinghzl,
Thanks for your response. No, I haven't tried that yet, but I will give it a run now and see how it performs.
Hi @speedinghzl,
I did some runs, but found the performance to be quite unstable: even for a single setting, results can vary by roughly [-2, +2]%.
Therefore, I cannot conclude whether changing the IMG_MEAN stats for Cityscapes helps. Do you know of any workaround I can employ to stabilize the performance?
The performance is not stable. Maybe we can run each setting 5 times and then compare the means.
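The averaging idea above can be sketched as follows; the mIoU values here are hypothetical placeholders for illustration, not real results from this repo:

```python
import numpy as np

# Hypothetical mIoU results (%) from 5 runs of the same setting.
# The per-run deviation of roughly +/-2% is what motivates averaging
# before comparing two IMG_MEAN configurations.
runs_imagenet_mean = np.array([70.1, 68.4, 71.9, 69.7, 70.5])
runs_cityscapes_mean = np.array([70.8, 69.2, 71.1, 68.9, 70.4])

for name, runs in [("ImageNet IMG_MEAN", runs_imagenet_mean),
                   ("Cityscapes IMG_MEAN", runs_cityscapes_mean)]:
    # ddof=1 gives the sample standard deviation across runs
    print(f"{name}: mean={runs.mean():.2f}, std={runs.std(ddof=1):.2f}")
```

With spreads this large, a difference in single-run numbers is not meaningful unless it clearly exceeds the run-to-run standard deviation.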
Hi, why doesn't the code use a validation set for evaluation during training? Is a validation set optional here?
Hi @speedinghzl ,
I see the following has been used for CityScapes (during both training and evaluation)-
IMG_MEAN = np.array((104.00698793,116.66876762,122.67891434), dtype=np.float32)
I did some googling and found that these stats actually correspond to the ImageNet dataset. So I was wondering whether, for training on Cityscapes, we need to change them accordingly. Using a simple check, I see that the corresponding stats for the Cityscapes training dataset are
[73.1584, 82.9089, 72.3924]
, which seem to differ quite a bit from the above.

[EDIT, 24/01/2019]: I just found out that the ImageNet stats shown above are actually in BGR format. So, for Cityscapes, it should be
[72.3924, 82.9089, 73.1584]
and not [73.1584, 82.9089, 72.3924].
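A minimal sketch of how one might verify these per-channel stats. The `channel_mean` helper below is hypothetical (not from this repo), and the Cityscapes path in the comment is illustrative; it assumes images are loaded with OpenCV, which returns channels in BGR order, matching the layout of IMG_MEAN above:

```python
import numpy as np

def channel_mean(images):
    """Per-channel mean over a list of HxWx3 uint8 arrays.
    Accumulates per-channel sums and total pixel counts, so
    differently sized images are weighted by their pixel counts."""
    total = np.zeros(3, dtype=np.float64)
    count = 0
    for img in images:
        total += img.reshape(-1, 3).sum(axis=0)
        count += img.shape[0] * img.shape[1]
    return total / count

# To run this on Cityscapes, load the training images first, e.g.:
#   import glob, cv2
#   images = [cv2.imread(p) for p in glob.glob("leftImg8bit/train/*/*.png")]
#   print(channel_mean(images))  # cv2.imread returns BGR arrays
```

If the result comes out near [72.39, 82.91, 73.16] in BGR, that supports using the Cityscapes-specific mean rather than the ImageNet one.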