Closed VigneshSrinivasan10 closed 8 years ago
Are you in the same scope where you created the variable? `variable_scope`s are nested.

The following code properly captures the weights:
```python
with tf.Graph().as_default() as g:
    x = pt.wrap(np.zeros((4, 10, 10, 4), dtype=np.float32)).conv2d(
        9, 64, stride=1,
        weights=tf.truncated_normal_initializer(stddev=0.001),
        edges='SAME', name='conv1')
    with tf.variable_scope('conv1') as scope:
        tf.get_variable_scope().reuse_variables()
        weights = tf.get_variable('weights')
        print weights
```
To see the full variable names, you can run this:

```python
print [v.name for v in tf.all_variables()]
```
You can also use `x.layer_parameters['weights']` if you have a handle on the layer, to get the weights used in the graph.
PS: I personally find the style of `with tf.variable_scope('conv1', reuse=True)` a little easier to read.
`print [v.name for v in tf.all_variables()]` got me all the variable names, and then I was able to extract the weights I wanted.
Thank you!
In TensorFlow, it is possible to visualize the filters using the image summary. How can I do the same in PrettyTensor?
Edit: To rephrase the question, I wanted to access the 'weights' from the first convolution layer. My first conv layer was named `conv1`, as in:

```python
conv2d(9, 64, stride=1, init=tf.truncated_normal_initializer(stddev=0.001), edges='SAME', name='conv1')
```

When I tried to access the variable `conv1/weights`, I got an error:

```
ValueError: Variable conv1/weights does not exist, disallowed
```

What am I doing wrong here? Please suggest the correct way to extract the weights!
Thanks in advance!