jmiller656 / EDSR-Tensorflow

Tensorflow implementation of Enhanced Deep Residual Networks for Single Image Super-Resolution
MIT License

ReLU is not required outside the residual block #28

Closed ZhenyF closed 6 years ago

ZhenyF commented 6 years ago

Hi, I read the paper and found that the authors do not apply ReLU outside the residual blocks, which means the first conv, the convs in the upsampling block, and the conv after upsampling shouldn't be followed by ReLU. In your code they seem to be; could you double-check that? Also, may I ask why you disable the last conv layer after the upsampling block? Please correct me if I'm mistaken.

Many thanks!
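
For context, the layout being described here, with ReLU only between the two convs inside each residual block and linear (None) activations everywhere else, looks roughly like the sketch below. This is a minimal TF2/Keras sketch with illustrative hyperparameters, not this repository's actual code, which targets an older TensorFlow API:

```python
# Minimal sketch (not the repo's exact code) of the EDSR layout from the
# paper: ReLU appears only between the two convs inside each residual
# block; the first conv, the upsampling conv, and the final conv are all
# linear (activation=None). Hyperparameters are illustrative defaults.
import tensorflow as tf

def residual_block(x, filters=64, scaling=0.1):
    # ReLU lives only here, between the block's two convolutions.
    y = tf.keras.layers.Conv2D(filters, 3, padding='same', activation='relu')(x)
    y = tf.keras.layers.Conv2D(filters, 3, padding='same', activation=None)(y)
    y = y * scaling          # residual scaling from the EDSR paper
    return x + y             # skip connection, no activation afterwards

def edsr(num_blocks=16, filters=64, scale=2):
    inp = tf.keras.Input(shape=(None, None, 3))
    # First conv: no ReLU.
    x = head = tf.keras.layers.Conv2D(filters, 3, padding='same', activation=None)(inp)
    for _ in range(num_blocks):
        x = residual_block(x, filters)
    x = tf.keras.layers.Conv2D(filters, 3, padding='same', activation=None)(x)
    x = x + head             # global skip connection
    # Upsampling conv + pixel shuffle: still no ReLU.
    x = tf.keras.layers.Conv2D(filters * scale ** 2, 3, padding='same', activation=None)(x)
    x = tf.nn.depth_to_space(x, scale)
    # Final reconstruction conv: linear as well.
    out = tf.keras.layers.Conv2D(3, 3, padding='same', activation=None)(x)
    return tf.keras.Model(inp, out)
```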

mashoujiang commented 6 years ago

Hi @ZhenyF, I think you can find the upsampling code at https://github.com/jmiller656/EDSR-Tensorflow/blob/master/model.py#L91, where the activation function is actually None, right?
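
The pattern at the linked line is, roughly, a conv whose activation is explicitly set to None followed by a pixel shuffle. A sketch of that idea, not a verbatim copy of model.py (the slim-style API here is an assumption for illustration):

```python
# Sketch (not verbatim from model.py) of an upsampling step whose conv
# has no activation: activation_fn=None suppresses the ReLU that
# slim.conv2d would otherwise apply by default.
import tensorflow.compat.v1 as tf
import tf_slim as slim

def upsample(x, scale=2, features=64):
    # Expand channels so depth_to_space can trade them for resolution.
    x = slim.conv2d(x, features * scale ** 2, [3, 3], activation_fn=None)
    # Pixel shuffle: rearrange channels into a (scale x scale) spatial grid.
    return tf.depth_to_space(x, scale)
```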

ZhenyF commented 6 years ago

Hi @mashoujiang, yup! I just saw that. My bad...