blei-lab / edward

A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
http://edwardlib.org

Adding TensorBoard summaries during training #478

Open rfarouni opened 7 years ago

rfarouni commented 7 years ago

Hi @dustinvtran

Can I add TensorBoard summaries during training when using the logdir='log' option? I tried several things, but nothing seemed to work. Here is my latest attempt:

import tensorflow as tf
import edward as ed
from keras import backend as K

# x, z, x_q, z_q, mnist, EPOCHS, NUM_BATCHES, and BATCH_SIZE are defined
# earlier in the script.
sess = ed.get_session()
K.set_session(sess)  # share Edward's session with Keras
inference = ed.KLqp(latent_vars={z: z_q}, data={x: x_q})
optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
inference.initialize(optimizer=optimizer, logdir='log')
tf.global_variables_initializer().run()

for epoch in range(EPOCHS):
  avg_loss = 0.0
  for t in range(NUM_BATCHES):
    x_train, _ = mnist.train.next_batch(BATCH_SIZE)
    info_dict = inference.update(feed_dict={x_q: x_train})
    avg_loss += info_dict['loss']
  avg_loss_total = avg_loss / NUM_BATCHES  # average per-batch loss this epoch
  # This creates a new summary op on every epoch, after the graph was built,
  # and it is never merged into inference.summarize, so nothing gets written.
  avg_cost_summary = tf.summary.scalar('avg_cost', avg_loss_total)
  summary_str = sess.run(inference.summarize, feed_dict={x_q: x_train})
  inference.train_writer.add_summary(summary_str, epoch)

dustinvtran commented 7 years ago

Hi @rfarouni, thanks for asking. Unfortunately, you're not able to add TensorBoard summaries at runtime. Summaries have to be constructed as part of the graph, at "compile" time.
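For concreteness, here is a minimal plain-TensorFlow 1.x sketch, independent of Edward and using made-up names, of what that means: the summary op has to exist in the graph before the session ever evaluates it.

import tensorflow as tf

# Graph construction ("compile" time): the summary op is created here.
x = tf.placeholder(tf.float32, shape=[])
loss = tf.square(x)
loss_summary = tf.summary.scalar('loss', loss)

writer = tf.summary.FileWriter('log')
with tf.Session() as sess:
  # Runtime: the session can only evaluate ops that already exist in the graph.
  summary_str = sess.run(loss_summary, feed_dict={x: 3.0})
  writer.add_summary(summary_str, global_step=0)
writer.flush()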

Getting your own summaries merged into inference.summarize requires access to some of the internals of inference.initialize(), and even then you wouldn't have the flexibility of tracking the average cost over many updates; you could only track the cost for a single update. Providing more native support and user flexibility for the graph construction in inference.initialize() is a big to-do for me.
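That said, one common TensorFlow 1.x workaround, sketched below assuming EPOCHS, NUM_BATCHES, and the other names from your snippet, is to skip graph-time summary ops entirely and write runtime-computed scalars as tf.Summary protocol buffers to your own FileWriter. This is plain TensorFlow, not something Edward supports natively.

import tensorflow as tf

# Write runtime-computed scalars directly as tf.Summary protocol buffers,
# bypassing graph-time summary ops.
writer = tf.summary.FileWriter('log')
for epoch in range(EPOCHS):
  avg_loss = 0.0
  for t in range(NUM_BATCHES):
    x_train, _ = mnist.train.next_batch(BATCH_SIZE)
    info_dict = inference.update(feed_dict={x_q: x_train})
    avg_loss += info_dict['loss']
  avg_loss_total = avg_loss / NUM_BATCHES
  # Build the summary protobuf by hand from a Python float.
  summary = tf.Summary(value=[
      tf.Summary.Value(tag='avg_cost', simple_value=avg_loss_total)])
  writer.add_summary(summary, global_step=epoch)
writer.flush()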