I'm trying to implement the SWATS optimization algorithm.
I'm able to extract the weight values with model.get_variable_value(name). Is there a similar way to extract the gradients while using a tf.train.AdamOptimizer or a tf.train.GradientDescentOptimizer?
The training graph already contains gradient tensors for every variable in TRAINABLE_VARIABLES, so if you are working with the default graph you can fetch a gradient by name with tf.get_default_graph().get_tensor_by_name("tensor_name:0").
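Here is a minimal TF 1.x graph-mode sketch (not tied to the asker's estimator setup; the variable names and the toy loss are made up for illustration). It shows the more direct route of asking the optimizer for its (gradient, variable) pairs via compute_gradients, which avoids guessing gradient tensor names; a comment notes the name-lookup alternative from the answer above.

```python
import tensorflow as tf

# Toy model: a single linear layer, just to have trainable variables and a loss.
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
y = tf.placeholder(tf.float32, shape=[None, 1], name="y")
w = tf.get_variable("w", shape=[1, 1])
b = tf.get_variable("b", shape=[1])
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))

optimizer = tf.train.AdamOptimizer(learning_rate=0.01)

# compute_gradients returns (gradient, variable) pairs for the trainable
# variables, so the gradient tensors are available directly as fetches.
grads_and_vars = optimizer.compute_gradients(loss)
train_op = optimizer.apply_gradients(grads_and_vars)

# Alternative (as in the answer): gradient ops live under the "gradients/"
# name scope of the default graph, e.g.
#   tf.get_default_graph().get_tensor_by_name("gradients/MatMul_grad/MatMul_1:0")
# The exact name depends on your graph; list the graph's operations to find it.

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    feed = {x: [[1.0]], y: [[2.0]]}
    # Evaluate the gradient values alongside one training step.
    grad_values, _ = sess.run(
        [[g for g, _ in grads_and_vars], train_op], feed_dict=feed)
    print(grad_values)
```

With the gradient values in hand each step, you can compute the quantities SWATS needs (e.g. comparing the Adam step with its projection onto the gradient) outside the built-in optimizers.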