Open xljhtq opened 5 years ago
Thanks for your reminder, the way "tf.layers.batch_normalization" is used in my code is wrong, and I will correct it as soon as possible. I feel so sorry for that, thanks again!!!
@liyibo Hi, there is another problem I want to ask about. The paper says that "pre-activation refers to activation being done before weighting instead of after, as is typically done", but your code differs from this idea, e.g. conv3 = tf.layers.conv2d(activation=tf.nn.relu). Here, is the activation in tf.layers.conv2d done before or after the weighting? I believe it is done after the weighting. That's the problem. Hope to get your reply!
You are right, the correct order should be as follows: pre-activate the region embedding, then do conv without ReLU; after two layers of conv, add the region embedding and the conv result together, then do the following steps. Thanks again, and please feel free to point out errors in the code!!!
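The ordering discussed above can be sketched without TensorFlow. This is a minimal NumPy toy (the "conv" is stood in for by a single matrix multiply, which is just the weighting step) showing that post-activation (what `activation=tf.nn.relu` inside `tf.layers.conv2d` does) and pre-activation (what the DPCNN paper describes) give different results:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def post_activation(x, w):
    # typical order: weight first, then activate
    # (this is what activation=tf.nn.relu inside a conv layer does)
    return relu(x @ w)

def pre_activation(x, w):
    # paper's order: activate first, then weight,
    # with no activation on the conv itself
    return relu(x) @ w

x = np.array([[-1.0, 2.0]])
w = np.array([[1.0], [1.0]])
print(post_activation(x, w))  # → [[1.]]  relu(-1 + 2)
print(pre_activation(x, w))   # → [[2.]]  relu zeroes -1 before weighting
```

The negative input makes the difference visible: pre-activation clips it to zero before the weighted sum, post-activation clips only the sum.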
I want to know which TensorFlow version this code targets.
Hi, in the DPCNN code, I find that the function "tf.layers.batch_normalization" is used when training the model, but the "training" parameter is not set to True. It should be, e.g., conv3 = tf.layers.batch_normalization(conv3, training=True). So I want to know whether I have misunderstood something, or whether this is a coding error?
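Why the `training` flag matters can be sketched with a toy NumPy version of batch normalization (this is an illustration, not the TF implementation; the momentum and epsilon values roughly match the `tf.layers.batch_normalization` defaults). With `training=True`, the current batch statistics are used and the moving averages are updated; with `training=False` (the default), only the moving averages are used, so if it is never set to True during training they stay at their initial values:

```python
import numpy as np

class ToyBatchNorm:
    """Minimal sketch of the training-vs-inference behavior of batch norm."""
    def __init__(self, momentum=0.99, eps=1e-3):
        self.momentum = momentum
        self.eps = eps
        self.moving_mean = 0.0   # initial running statistics
        self.moving_var = 1.0

    def __call__(self, x, training):
        if training:
            # use the current batch's statistics and update the running ones
            mean, var = x.mean(), x.var()
            self.moving_mean = self.momentum * self.moving_mean + (1 - self.momentum) * mean
            self.moving_var = self.momentum * self.moving_var + (1 - self.momentum) * var
        else:
            # inference: use only the stored running statistics (no updates)
            mean, var = self.moving_mean, self.moving_var
        return (x - mean) / np.sqrt(var + self.eps)

x = np.array([10.0, 12.0, 14.0])

bn_train = ToyBatchNorm()
y_train = bn_train(x, training=True)   # normalized with batch stats; moving stats updated

bn_infer = ToyBatchNorm()
y_infer = bn_infer(x, training=False)  # uses the untouched initial stats (mean 0, var 1)
```

Note also that in the TF 1.x API, even with `training=True` the moving-average updates are collected in `tf.GraphKeys.UPDATE_OPS` and have to be run explicitly (typically via a control dependency on the train op), as the function's documentation points out.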