No, the weights are not shared, because they are created with tf.Variable(...). Every time the function is called a new tf.Variable is created (duplicate names are simply uniquified automatically, e.g. var, var_1, var_2, ...). The weights would only be shared if they had been created with tf.get_variable(...) within the same variable scope and with reuse set to True on that tf.variable_scope. See: https://www.tensorflow.org/versions/r1.0/programmers_guide/variable_scope and https://stackoverflow.com/questions/35919020/whats-the-difference-of-name-scope-and-a-variable-scope-in-tensorflow
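A minimal sketch of that difference (assuming TensorFlow 1.x; the function names `conv_with_variable` / `conv_with_get_variable` are made up for illustration and are not from the repo):

```python
import tensorflow as tf  # assumes TensorFlow 1.x

def conv_with_variable(x):
    # tf.Variable always creates a *new* variable, even if the name repeats;
    # TensorFlow just uniquifies the name (w, w_1, w_2, ...), so nothing is shared.
    w = tf.Variable(tf.random_normal([3, 3, 1, 16]), name="w")
    return tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME")

def conv_with_get_variable(x):
    # tf.get_variable looks the name up in the current variable scope,
    # so calls inside a reusing scope return the *same* variable.
    w = tf.get_variable("w", shape=[3, 3, 1, 16],
                        initializer=tf.random_normal_initializer())
    return tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME")

x = tf.placeholder(tf.float32, [None, 64, 64, 1])

# Two calls -> two distinct variables ("w:0" and "w_1:0"): no sharing.
y1 = conv_with_variable(x)
y2 = conv_with_variable(x)

# Sharing only happens when we explicitly ask for it:
with tf.variable_scope("block") as scope:
    z1 = conv_with_get_variable(x)   # creates block/w
    scope.reuse_variables()
    z2 = conv_with_get_variable(x)   # reuses block/w

print([v.name for v in tf.global_variables()])
# e.g. ['w:0', 'w_1:0', 'block/w:0']
```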
Ohh, it's more subtle than I thought!! Thanks a lot!!
It appears that the weights are shared across the hourglass modules and many of the conv blocks, isn't it? (Correct me if I am wrong, but since the same functions are called again and again, they would eventually use the weights initialized in the conv2d function defined in layers.py.)
Is that the desired behaviour? According to Newell's paper, "It is important to note that weights are not shared across hourglass modules."
So has this been done to make the process more memory-efficient, or is my interpretation wrong?