Closed: shamanez closed this issue 6 years ago.
Does enumerating grads (TensorFlow variables) actually work? I think TensorFlow variables are not iterable.
for index, grad in enumerate(grads):
    tf.summary.histogram("{}-grad".format(grads[index][1].name), grads[index])
Yeah, it worked. Sorry for the mistake; I used this:
gradients = optimizer.compute_gradients(loss, var_list=pi_trainable)
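For context, compute_gradients returns a plain Python list of (gradient, variable) pairs, so enumerating it is fine. A minimal sketch of how the histograms attach on top of that call; loss and pi_trainable are the names from the line above, everything else is illustrative:

import tensorflow as tf

optimizer = tf.train.AdamOptimizer()
# compute_gradients returns a list of (gradient, variable) tuples,
# so a plain Python loop over it works
gradients = optimizer.compute_gradients(loss, var_list=pi_trainable)
for grad, var in gradients:
    if grad is not None:  # variables not on the loss path get a None gradient
        tf.summary.histogram("{}-grad".format(var.name), grad)
train_op = optimizer.apply_gradients(gradients)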
Okay. Could you open a pull request with this code?
I modified the code to visualize layer weights, activations, and gradients in TensorBoard. I will open a pull request as soon as possible, just after I test it a bit more.
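For what it is worth, a minimal sketch of what the weight and activation summaries can look like in TF 1.x; the layer names and shapes here are placeholders, not the actual code from this repo:

import tensorflow as tf

# illustrative single dense layer of the discriminator
x = tf.placeholder(tf.float32, [None, 128], name="x")
w = tf.get_variable("d_w1", shape=[128, 64])
b = tf.get_variable("d_b1", shape=[64])
h = tf.nn.relu(tf.matmul(x, w) + b)

tf.summary.histogram("d_w1-weight", w)      # layer weights
tf.summary.histogram("d_h1-activation", h)  # layer activations
# gradient histograms are added per (gradient, variable) pair,
# as in the other snippets in this thread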
Hi, I think it is useful to visualize the gradients in the discriminator. Instead of using the Adam optimizer in the given way, I used:
optimizer = tf.train.AdamOptimizer()
grads = optimizer.compute_gradients(cross_entropy)
train_step = optimizer.apply_gradients(grads)
for index, grad in enumerate(grads):
    tf.summary.histogram("{}-grad".format(grads[index][1].name), grads[index])
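One detail the snippet leaves implicit: the histograms only reach TensorBoard once the summaries are merged, evaluated, and written out. A minimal sketch of that plumbing, assuming the graph above is already built; the log directory and loop are placeholders:

merged = tf.summary.merge_all()
writer = tf.summary.FileWriter("./logs", tf.get_default_graph())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):  # placeholder training loop
        # add a feed_dict for your placeholders as needed
        summary, _ = sess.run([merged, train_step])
        writer.add_summary(summary, step)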