Closed: AliKafaei closed this issue 5 years ago
The prediction layers do not have an activation. After the concatenation, there is another convolution which can make use of negative displacement values before the next activation layer.
The ReLUs in the network are "leaky ReLU" with negative slope 0.1 anyway, so negative values would not be entirely lost.
The paper may refer to "upconvolution" layers as upsampling. FlowNetSimple's final output is at half resolution, and there we used actual bilinear upsampling to get to full resolution. So it might refer to the same thing, depending on which layer it is talking about :wink:
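To make the first point concrete, here is a minimal PyTorch-style sketch of one refinement step (channel counts and layer names are illustrative, not the exact FlowNet definition from this repository): the flow prediction is a plain convolution with no activation, and the concatenated tensor feeds the next leaky-ReLU convolution, so negative displacement values are preserved.

```python
# Minimal PyTorch sketch of one FlowNet-style refinement step.
# Channel counts and layer names are illustrative only; the exact
# definitions live in the released deploy.prototxt files.
import torch
import torch.nn as nn

class RefinementStep(nn.Module):
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        # Flow prediction: plain 3x3 conv, NO activation, so the two
        # output channels can take negative (u, v) displacement values.
        self.predict_flow = nn.Conv2d(in_ch, 2, kernel_size=3, padding=1)
        # Learned upconvolution ("deconvolution") followed by a leaky ReLU
        # with negative slope 0.1, as in the paper.
        self.upconv = nn.Sequential(
            nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.1, inplace=True),
        )
        # Upsample the predicted flow to match the skip-connection resolution.
        self.upflow = nn.ConvTranspose2d(2, 2, kernel_size=4, stride=2, padding=1)

    def forward(self, x, skip):
        flow = self.predict_flow(x)      # coarse flow, may be negative
        up_feat = self.upconv(x)         # upconvolved features
        up_flow = self.upflow(flow)      # upsampled coarse flow
        # Concatenate features from the contracting part, upconvolved
        # features, and the upsampled flow prediction; the next convolution
        # in the chain sees the raw (possibly negative) flow values.
        return torch.cat([skip, up_feat, up_flow], dim=1), flow
```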
Thanks for the support :). The point that is still unclear to me: to concatenate the low-resolution prediction in the refinement section, is plain upsampling (with no learnable parameters) used, or a transposed convolution (sometimes called deconvolution)? Upsampling has no learnable parameters (inserting zeros and then low-pass filtering), while transposed convolution/deconvolution has learnable weights.
Those are "Deconvolution" layers. Unlearned upsampling layers are only used at test time (i.e. for the last half-resolution-to-full-resolution step).
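For illustration, the difference might look roughly like this in PyTorch (a sketch of the two operations, not the Caffe implementation from this repository; shapes are placeholders):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Learned "deconvolution" / transposed convolution, as used inside the
# refinement network (its weights are trained).
deconv = nn.ConvTranspose2d(in_channels=2, out_channels=2,
                            kernel_size=4, stride=2, padding=1)

x = torch.randn(1, 2, 32, 32)        # e.g. a coarse flow prediction
learned_up = deconv(x)               # 1 x 2 x 64 x 64, learned weights

# Unlearned bilinear upsampling, used only at test time for the final
# half-resolution-to-full-resolution step of FlowNetSimple.
fixed_up = F.interpolate(x, scale_factor=2, mode='bilinear',
                         align_corners=False)   # no parameters
```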
As you said, the prediction layers do not have any activation functions. What is the kernel size of these prediction layers?
The *deploy.prototxt* files in the models folder contain that information.
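If it helps, here is a hedged sketch of how one could print the convolution parameters straight from a deploy.prototxt using the Caffe Python protobuf bindings; the file path is a placeholder, and the snippet assumes the newer `layer` message format rather than the legacy `layers` one:

```python
# Read a deploy.prototxt and print each convolution layer's parameters.
# The path below is a placeholder; adjust it to your checkout.
from google.protobuf import text_format
from caffe.proto import caffe_pb2

net = caffe_pb2.NetParameter()
with open('models/FlowNetS/deploy.prototxt') as f:   # placeholder path
    text_format.Merge(f.read(), net)

for layer in net.layer:
    if layer.type == 'Convolution':
        print(layer.name, layer.convolution_param)
```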
(closed due to inactivity)
Dear sir, I am trying to implement FlowNetSimple and I encountered an ambiguity in the paper (the paper is solid and well written, but I did not understand one section). My problem is with the refinement layers. In these layers, the upconvolved data, the data from the contracting part, and the prediction of the previous layer are concatenated. The paper says that all layers have ReLU activations, but with a ReLU activation function can we get negative displacements? Another question: the arXiv version speaks of upsampling the prediction, while the other version says upconvolution; are they the same? Best, Ali, PhD student, Concordia University Perform Centre