Machine-Learning-Tokyo / DL-workshop-series

Material used for Deep Learning related workshops for Machine Learning Tokyo (MLT)
Apache License 2.0

RefineNet in ConvNets.ipynb #6

Open foobar167 opened 4 years ago

foobar167 commented 4 years ago

In the RefineNet section of the ConvNets.ipynb notebook, the function rcu (residual convolution unit) appears to contain a bug. Current code:

x = ReLU()(tensor)
x = Conv2D(f, 3, padding='same')(x)
x = ReLU()(tensor)
x = Conv2D(f, 3, padding='same')(x)

should probably be replaced by:

x = ReLU()(tensor)
x = Conv2D(f, 3, padding='same')(x)
x = ReLU()(x)
x = Conv2D(f, 3, padding='same')(x)

i.e. tensor in the second ReLU should be replaced by x. Otherwise, the output of the first ReLU+Conv2D is discarded and overwritten by the second ReLU+Conv2D, which also takes the original tensor as input. This reading is confirmed by the paper, where the RCU (Residual Conv Unit) is a sequential ReLU->Conv2D->ReLU->Conv2D block.
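
For reference, a minimal sketch of what the corrected rcu could look like in full. The signature rcu(tensor, f) and the final Add() skip connection are assumptions on my side (the RCU in the paper ends with a residual addition); only the two ReLU+Conv2D lines are taken from the notebook:

from tensorflow.keras.layers import ReLU, Conv2D, Add

def rcu(tensor, f):
    # Residual Conv Unit: two ReLU -> Conv2D blocks plus a skip connection
    x = ReLU()(tensor)
    x = Conv2D(f, 3, padding='same')(x)
    x = ReLU()(x)   # feed the previous output, not the original tensor
    x = Conv2D(f, 3, padding='same')(x)
    return Add()([x, tensor])   # residual (identity) connection, assumed here

With the original code, the first ReLU+Conv2D pair has no effect on the returned value, since the second ReLU starts again from tensor.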