helenahan2015 opened this issue:

I'm trying to understand this code: https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/5%20-%20User%20Interface/loss_visualization.py. There are 3 sess.run() calls in the "for" loop. Does this mean the whole graph runs 3 times in one "for" loop? I tried using result = sess.run([optimizer, cost, merged_summary_op]) instead, and it seems to get the same results. Is this more efficient than calling sess.run() 3 times? Also, I tried only using sess.run([cost, merged_summary_op]) (removing the optimizer), and the results are wrong. I guess the run() function doesn't run the whole graph each time it's called. Am I wrong?
It doesn't really run the whole graph 3 times per loop; it only runs the subgraph needed to compute each fetch you ask for (so fetching cost will not compute gradients the way the optimizer op does). But since every call to run has some setup overhead, result = sess.run([optimizer, cost, merged_summary_op]) should be more efficient, as it calls run once instead of 3 times. The difference shouldn't be that big, though.

And about only using sess.run([cost, merged_summary_op]): do you run the optimizer op before it or not?
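A minimal sketch of the two patterns, assuming the same ops and placeholders (optimizer, cost, merged_summary_op, x, y, batch_x, batch_y) as in the linked example:

```python
# Three separate calls: each one runs only the subgraph its fetch
# needs, but pays the Session.run() overhead three times.
_ = sess.run(optimizer, feed_dict={x: batch_x, y: batch_y})
c = sess.run(cost, feed_dict={x: batch_x, y: batch_y})
summary = sess.run(merged_summary_op, feed_dict={x: batch_x, y: batch_y})

# Single call: the shared subgraph (the forward pass) is evaluated
# once, and all three fetches come back together.
_, c, summary = sess.run([optimizer, cost, merged_summary_op],
                         feed_dict={x: batch_x, y: batch_y})
```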
@aymericdamien Thank you for your explanation. To your last question: I only use sess.run([cost, merged_summary_op]), and there is no sess.run(optimizer) before or after it. And I wonder, how do I run the whole graph?
I think this should work... Can you try just running sess.run([cost])?
@aymericdamien I tried just running sess.run([cost]); here are the code and result. Obviously, the result is incorrect.
I think that makes sense, and it shouldn't work. As you said, sess.run([cost]) only runs that subgraph, so only the cost op is calculated.

Actually, the results are correct: because you haven't started to train (by running the optimizer op), your network still has random weights, so it gives you results like these. You can see the accuracy is ~0.1, which is the same as a random guess out of the 10 MNIST classes.

You have to use the optimizer to train your model (by reducing the cost, i.e. applying the backpropagation algorithm and updating the variable weights). The cost tensor only calculates the loss value; it doesn't train the model.
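To make the distinction concrete, here's a hedged sketch (TF 1.x; it assumes x, y placeholders, a cost tensor, and the MNIST feed used in aymericdamien's examples -- the learning rate and names are illustrative, not the example's exact code):

```python
optimizer = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

sess.run(tf.global_variables_initializer())
batch_x, batch_y = mnist.train.next_batch(128)

# Evaluating cost alone NEVER touches the weights; with freshly
# initialized (random) weights, accuracy stays around 1/10.
c = sess.run(cost, feed_dict={x: batch_x, y: batch_y})

# Training requires running the optimizer op, which applies backprop
# and updates the variables; fetching cost alongside reuses the pass.
for step in range(1000):
    batch_x, batch_y = mnist.train.next_batch(128)
    _, c = sess.run([optimizer, cost], feed_dict={x: batch_x, y: batch_y})
```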
I am late, but yes. When you run a session, you just pass the ops you want to the function as fetches, and feed_dict just loads the values into the placeholders that the ops you are running will use. I find it easier to look at it like a C function receiving a pointer: basically, it's like a function which takes a function and a structure, then makes a pointer to the data and passes it to the function provided. Not really intuitive or Pythonic, but that is TF.
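For example, a minimal self-contained sketch (the shapes and layer are assumptions, not taken from the example above):

```python
import numpy as np
import tensorflow as tf  # assumes TF 1.x, as in this thread

x = tf.placeholder(tf.float32, shape=[None, 784])
logits = tf.layers.dense(x, 10)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(32, 784).astype(np.float32)
    # fetches say what to compute; feed_dict supplies the placeholder
    # data, much like passing a pointer to a struct into a C function.
    out = sess.run(logits, feed_dict={x: batch})
```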
I think it would not work if we only do sess.run([optimizer]); "cost" has to be there too, for example sess.run([optimizer, cost]). I am really confused: since "cost" is already included in "optimizer", why can't we just use "optimizer"?
@jmgrn61 I think this may answer your question. https://github.com/tensorflow/tensorflow/issues/13133#issuecomment-330630613
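In short (a sketch, assuming optimizer was built as some tf.train optimizer's minimize(cost)): running the optimizer alone does train, because cost is evaluated internally as part of its subgraph. You only need to fetch cost if you want to see its value, since Session.run returns None for an Operation:

```python
result = sess.run(optimizer)        # trains; result is None (it's an op)
_, c = sess.run([optimizer, cost])  # trains AND returns the loss value
```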
@jmgrn61 In case you are using an Iterator, calling sess.run(cost) will give you the next batch of images and labels from the iterator. The same occurs if you call sess.run(optimizer). Thus, I think that if you call these two separately, it's actually showing the cost of batch n and optimizing based on batch n+1, completely skipping the nth batch. So I always put them in the same sess.run().

I also tested this and found that the iteration count dropped when using separate sess.run() calls, since 2 batches were always getting skipped in each loop (1st skip on sess.run(loss), 2nd skip on sess.run(accuracy)); the only batch actually being trained on is the one consumed by sess.run(optimizer). So I was skipping 2 batches per loop, meaning I was training on only 1/3 of my total training dataset. See the sketch below.

Iterations when using l, _, acc = sess.run([loss, optimizer, accuracy]): 42000 (exactly 3 times the separate-call case)
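A sketch of the pitfall with a tf.data Iterator (TF 1.x; model_fn, accuracy_fn, images, and labels are illustrative placeholders, not code from this thread):

```python
dataset = tf.data.Dataset.from_tensor_slices((images, labels)).batch(128)
x, y = dataset.make_one_shot_iterator().get_next()

loss = model_fn(x, y)                # subgraph depends on get_next()
train_op = tf.train.AdamOptimizer().minimize(loss)
accuracy = accuracy_fn(x, y)         # also depends on get_next()

# Separate calls: each run re-evaluates get_next(), so each call
# consumes a DIFFERENT batch -- three batches per loop iteration.
l = sess.run(loss)        # batch n   (only evaluated, not trained on)
sess.run(train_op)        # batch n+1 (the only batch trained on)
acc = sess.run(accuracy)  # batch n+2 (only evaluated)

# One call: all three fetches share the SAME batch.
l, _, acc = sess.run([loss, train_op, accuracy])
```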