bao-dai opened this issue 6 years ago
Yeah, same error here. You can try returning just part6 (without adding input_layer), but then I'm wondering why highway_unit is commented out... I think the addition is meant to implement the shortcut, but adding two tensors whose channel dimensions differ (64 and 128) will always throw an error. Maybe input_layer could be added to only the first 64 channels (i.e. zero-pad the 64-channel tensor to match the shape), or the highway unit could be uncommented? I'm not sure how this model is supposed to implement shortcuts.
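For what it's worth, here is a minimal sketch (not code from this repo) of the padding idea: when the filter count grows, zero-pad the shortcut along the channel axis so the element-wise add is valid. It assumes TF 1.x and NHWC tensors like the ones in the notes further down; pad_shortcut is just a hypothetical helper name.

```python
import tensorflow as tf

def pad_shortcut(input_layer, part6):
    # channel counts, e.g. 64 for the shortcut and 128 for the main branch
    in_ch = input_layer.get_shape().as_list()[-1]
    out_ch = part6.get_shape().as_list()[-1]
    extra = out_ch - in_ch
    if extra > 0:
        # zero-pad only the channel dimension: [batch, height, width, channels]
        input_layer = tf.pad(input_layer, [[0, 0], [0, 0], [0, 0], [0, extra]])
    return part6 + input_layer
```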
The mismatch comes from updating num_filters_per_size between blocks. My solution is inside the resUnit function: add a 1x1 projection

part0 = slim.conv2d(input_layer, num_filters_per_size_i, [1, 1], activation_fn=None)

and change the output to

output = part6 + part0

It should work fine this way.
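To make that concrete, here is a hedged sketch of a resUnit with the 1x1 projection shortcut. The main branch is abbreviated (the repo's exact conv/batch-norm stack may differ), and kernel_size_i is assumed to be the per-block kernel size.

```python
import tensorflow as tf
import tensorflow.contrib.slim as slim

def res_unit(input_layer, num_filters_per_size_i, kernel_size_i=3):
    # 1x1 projection so the shortcut has the right channel count (e.g. 64 -> 128)
    part0 = slim.conv2d(input_layer, num_filters_per_size_i, [1, 1],
                        activation_fn=None)
    # main branch (abbreviated): conv -> BN+ReLU -> conv -> BN
    net = slim.conv2d(input_layer, num_filters_per_size_i, [1, kernel_size_i],
                      activation_fn=None)
    net = slim.batch_norm(net, activation_fn=tf.nn.relu)
    net = slim.conv2d(net, num_filters_per_size_i, [1, kernel_size_i],
                      activation_fn=None)
    part6 = slim.batch_norm(net, activation_fn=None)
    # projection shortcut instead of the raw input_layer
    return part6 + part0
```

The 1x1 conv is only strictly needed when the channel count changes, but it is harmless in the other blocks; this is the projection-shortcut variant described in the ResNet paper.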
I tried to confirm the identity mapping, and it looks OK. slim here is tf.contrib.slim (TensorFlow's Python layers library), the element-wise add is appropriate, and the residual structure matches the reference and original papers, respectively:
page 2, figure 2: https://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf
page 5, figure 2: https://arxiv.org/pdf/1606.01781.pdf
Notes on where it failed before (after finishing filter size 0, starting filter size 1):
input_layer = <tf.Tensor 'pool_0/MaxPool:0' shape=(?, 1, 15, 64) dtype=float32>
part0 = <tf.Tensor 'res_unit_1_0/Conv/BiasAdd:0' shape=(?, 1, 15, 128) dtype=float32>
part6 = <tf.Tensor 'res_unit_1_0/Conv_2/BiasAdd:0' shape=(?, 1, 15, 128) dtype=float32>
input_layer + part6 is the original code and throws the error; part0 + part6 is the new code and works.
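A quick sanity check of those shapes (TF 1.x, constant tensors standing in for the real ops): adding the 64-channel input_layer to the 128-channel part6 fails at graph-construction time, while part0 + part6 is fine.

```python
import tensorflow as tf

input_layer = tf.zeros([2, 1, 15, 64])   # stands in for pool_0/MaxPool
part0 = tf.zeros([2, 1, 15, 128])        # stands in for the 1x1 projection
part6 = tf.zeros([2, 1, 15, 128])        # stands in for the main-branch output

ok = part0 + part6                        # shapes match, no error
# input_layer + part6                     # ValueError: incompatible shapes (64 vs 128)
```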
Yes, exactly!
Hi, I'm just running train.py, but it raised an error when calling cnn = VDCNN():
Is there any way I can get rid of this? Is it a TensorFlow version issue or something?