TimoSaemann / ENet

ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation

About dilated convolution and low GPU utilization #50

Open ewen1024 opened 6 years ago

ewen1024 commented 6 years ago

Thanks for your great work. The model is working fine for my binary segmentation task so far. But at object edges, when I apply softmax, the smaller class becomes slightly larger than expected, while without softmax, holes appear in the smaller class. That much I can follow.

About the dilation part: I see you use rates of 2, 4, 8, 16 for the four dilated layers. Would it be reasonable to decrease them to get better performance at the edges? I know dilated convolution gives a larger receptive field without losing resolution, but I suspect fine detail is lost along the way. So would it make sense to set all the dilations to 2, or to drop the dilation entirely? What are the pros and cons? (Rough receptive-field numbers below.) Thanks.
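For reference, here is my back-of-the-envelope receptive-field math for a plain stack of stride-1 3x3 dilated convolutions. This ignores the rest of the ENet bottleneck structure, so the numbers are only relative, but they show what decreasing the rates trades away:

```python
# Receptive field of a stack of stride-1 3x3 dilated convolutions.
# Each layer adds (kernel - 1) * dilation pixels to the receptive field.
# This is an approximation that ignores the other layers in the bottlenecks.

def receptive_field(dilations, kernel=3):
    rf = 1
    for d in dilations:
        rf += (kernel - 1) * d
    return rf

print(receptive_field([2, 4, 8, 16]))  # 61 -> large context, but coarser detail
print(receptive_field([2, 2, 2, 2]))   # 17 -> much smaller context
print(receptive_field([1, 1, 1, 1]))   # 9  -> plain 3x3 convs, no dilation
```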

Lastly, it's odd that my GPU utilization is only around 30-40% with a power draw of about 90 W on my 1080 Ti, while when I train with TensorFlow I see 99% utilization and roughly 270 W, and the card runs like a furnace. Any idea why the card is so underused here?

By the way, it seems there is no normalization applied to the raw images, such as mean subtraction and variance normalization. Is that right?
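In other words, I don't see anything like the sketch below happening before the data goes into the network (the mean/std values here are just placeholders, not values from this repo):

```python
# Minimal sketch of per-channel mean/variance normalization of a BGR input image.
# The mean/std numbers are placeholder assumptions for illustration only.
import numpy as np

def normalize(img_bgr, mean=(103.94, 116.78, 123.68), std=(57.38, 57.12, 58.40)):
    """img_bgr: HxWx3 uint8 image; returns a float32 array, roughly zero-mean/unit-variance."""
    img = img_bgr.astype(np.float32)
    img -= np.asarray(mean, dtype=np.float32)
    img /= np.asarray(std, dtype=np.float32)
    return img
```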

Thanks.