Closed: LopezGG closed 7 years ago
You might want to change the optimizer so that it includes self.loss:
```python
# Optimizer.
if self.optimize == 'Adagrad':
    self.optimizer = tf.train.AdagradOptimizer(self.learning_rate).minimize(self.loss)
elif self.optimize == 'SGD':
    self.optimizer = tf.train.GradientDescentOptimizer(self.learning_rate).minimize(self.loss)
```
As it stands, the graph does not appear to be connected when viewed in TensorBoard.
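A minimal TF1-style sketch of both points: passing the loss into `minimize()` so the train op is connected to it, and adding a scalar summary plus a `FileWriter` so the loss and graph show up in TensorBoard. The tiny linear model (`x`, `y`, `w`), the learning rate, and the `/tmp/tb_logs` directory are illustrative assumptions, not taken from the repo:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Illustrative stand-ins for the model's inputs and loss.
x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
y = tf.placeholder(tf.float32, shape=[None, 1], name='y')
w = tf.Variable(tf.zeros([1, 1]), name='w')
pred = tf.matmul(x, w)
loss = tf.reduce_mean(tf.square(pred - y), name='loss')

# Passing the loss to minimize() is what links the train op to the
# rest of the graph; without it the optimizer node is disconnected.
optimizer = tf.train.AdagradOptimizer(0.1).minimize(loss)

# Log the loss and write the graph so TensorBoard can display both.
tf.summary.scalar('loss', loss)
merged = tf.summary.merge_all()
writer = tf.summary.FileWriter('/tmp/tb_logs', tf.get_default_graph())
writer.close()
```

Running this and then pointing `tensorboard --logdir /tmp/tb_logs` at the log directory should show the loss node wired into the optimizer subgraph.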