Closed rimchang closed 6 years ago
First, I saw you did a very similar conversion in one of your repos at almost the same time as me! :)
You are right, removing this padding does not create any problems. In some earlier version of my code I got an error if I did not pad, and I forgot to remove the padding after I resolved it! I just pushed the removal of this line to the master branch.
Thank you for your help!
Thanks for the great job!!
```python
# Ugly hack because I didn't use tensorflow's exact padding function
self.pad_5a = torch.nn.ConstantPad3d((0, 0, 0, 0, 0, 1), 0)
```
https://www.tensorflow.org/versions/r0.12/api_docs/python/nn/convolution

```
pad_along_height = ((out_height - 1) * strides[1] + filter_height - in_height)
pad_along_width  = ((out_width - 1) * strides[2] + filter_width - in_width)
pad_top  = pad_along_height / 2
pad_left = pad_along_width / 2
```
Why use the ugly hack?
I think:

```
pad_along_front = ((8 - 1) * 2 + 2 - 16) = 0
pad_back = 0 / 2 = 0
```