aymericdamien / TensorFlow-Examples

TensorFlow Tutorial and Examples for Beginners (support TF v1 & v2)

confused about sess.run() function #22

Open helenahan2015 opened 8 years ago

helenahan2015 commented 8 years ago

I'm trying to understand this code: https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/5%20-%20User%20Interface/loss_visualization.py. There are three sess.run() calls in the `for` loop (see attached screenshot). Does this mean the whole graph runs three times per loop iteration? [screenshot: training loop with three sess.run() calls] I tried using `result = sess.run([optimizer, cost, merged_summary_op])` instead, and it seems to produce the same results. Is this more efficient than calling sess.run() three times? Also, I tried using only `sess.run([cost, merged_summary_op])` (removing the optimizer) and the results were wrong. I guess run() doesn't run the whole graph each time it's called. Am I wrong?

aymericdamien commented 8 years ago

It doesn't really run the 'whole' graph 3 times per loop; it only runs the computation subgraph needed for each fetch you request (so fetching cost will not compute gradients the way the optimizer op does). But since each call to run() has some fixed overhead, `result = sess.run([optimizer, cost, merged_summary_op])` should be more efficient (one call instead of three), though the difference shouldn't be very big. About running only `sess.run([cost, merged_summary_op])`: do you run the optimizer op beforehand or not?
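To make the subgraph point concrete, here is a toy lazy-evaluation graph in plain Python (a hypothetical sketch, not TensorFlow): its run() evaluates only the nodes the requested fetches depend on, which mirrors what sess.run() does in TF v1.

```python
# Toy dataflow graph (NOT TensorFlow): run(fetches) evaluates only the
# dependency subgraph of the fetches, like sess.run() in TF v1.

class Node:
    def __init__(self, name, fn, deps=()):
        self.name, self.fn, self.deps = name, fn, deps

executed = []  # records which ops actually ran, in order

def run(fetches):
    """Evaluate only the nodes the fetches depend on (post-order)."""
    cache = {}  # per-run memoization: each op runs at most once per call
    def ev(node):
        if node.name not in cache:
            vals = [ev(d) for d in node.deps]
            executed.append(node.name)
            cache[node.name] = node.fn(*vals)
        return cache[node.name]
    return [ev(f) for f in fetches]

# cost depends on pred; optimizer depends on cost (plus gradient work)
pred = Node("pred", lambda: 0.7)
cost = Node("cost", lambda p: (p - 1.0) ** 2, deps=(pred,))
optimizer = Node("optimizer", lambda c: "apply_grads", deps=(cost,))

run([cost])                       # runs pred and cost only
assert "optimizer" not in executed
run([optimizer, cost])            # one run: cost computed once, shared by both fetches
```

The per-run cache is also why fetching `[optimizer, cost]` together computes the cost once, rather than twice.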

helenahan2015 commented 8 years ago

@aymericdamien Thank you for your explanation. To answer your question: I only call `sess.run([cost, merged_summary_op])`, with no `sess.run(optimizer)` before or after. Also, how would I run the whole graph?

aymericdamien commented 8 years ago

I think that should work... Can you try running just sess.run([cost])?

helenahan2015 commented 8 years ago

@aymericdamien I tried running just `sess.run([cost])`; here are the code and the result. The result is obviously incorrect. [screenshots: code and output]

Actually, I think this makes sense and it shouldn't work. As you said, sess.run([cost]) only runs the relevant subgraph, so only the cost op is calculated. [screenshot]

aymericdamien commented 8 years ago

The results are correct: because you haven't started training (by running the optimizer op), your network still has random weights, so it gives you results like these. You can see the accuracy is ~0.1, which is the same as a random guess over MNIST's 10 classes. You have to run the optimizer to train your model (reducing the cost by applying the backpropagation algorithm and updating the weight variables). The cost tensor only calculates the loss value; it doesn't train the model.
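The distinction can be shown in a few lines of plain Python (a minimal sketch, not TF): evaluating a loss is a pure read that never changes the weights, while the optimizer's update step is what actually trains.

```python
# Sketch (plain Python, NOT TensorFlow): evaluating the loss does not
# change the weights; only the optimizer's update step does.

w = 5.0                        # "random" initial weight
target = 0.0
loss = lambda w: (w - target) ** 2
grad = lambda w: 2.0 * (w - target)

# Evaluating the loss repeatedly trains nothing (like sess.run([cost])):
for _ in range(100):
    _ = loss(w)
assert w == 5.0                # weight unchanged, loss still high

# Running the update step (what the optimizer op does) actually trains:
for _ in range(100):
    w -= 0.1 * grad(w)         # plain gradient-descent update
assert loss(w) < 1e-6          # loss driven close to zero
```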

WernerFS commented 7 years ago

I am late, but yes. When you run a session, you just send the fetches to the function, and feed_dict loads the values into the placeholders that the ops you are running will use. I find it easier to think of it like a C function receiving a pointer: run() takes some operations plus a structure of input data, wires the data to the operations, and executes them. Not really intuitive or Pythonic, but that's TF.
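That analogy can be sketched in plain Python (hypothetical names, not the TF API): the "graph" is a function of its placeholders, and feed_dict supplies their values on each call.

```python
# Loose sketch (plain Python, NOT TensorFlow) of the feed_dict idea:
# the graph is parameterized by placeholders, and feed_dict binds
# their values per run() call.

def make_graph():
    def run(fetch, feed_dict):
        # bind placeholder values for this call only
        x, y = feed_dict["x"], feed_dict["y"]
        ops = {"sum": x + y, "prod": x * y}
        return ops[fetch]          # evaluate only the requested op
    return run

run = make_graph()
assert run("sum", feed_dict={"x": 2, "y": 3}) == 5
assert run("prod", feed_dict={"x": 2, "y": 3}) == 6
```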

jmgrn61 commented 6 years ago

I think it would not work if we only did sess.run([optimizer]); "cost" has to be there too, e.g. sess.run([optimizer, cost]). I am really confused: since "cost" is already computed as part of "optimizer", why can't we just run "optimizer"?

VXU1230 commented 6 years ago

@jmgrn61 I think this may answer your question. https://github.com/tensorflow/tensorflow/issues/13133#issuecomment-330630613

ParichayDidwania commented 5 years ago

@jmgrn61 If you are using an Iterator, then calling sess.run(cost) will pull the next batch of images and labels from the iterator. The same happens if you call sess.run(optimizer). So I think that if you call these two separately, you are actually reporting the cost of batch n while optimizing on batch n+1, completely skipping batch n for training. That's why I always put them in the same sess.run().

[screenshot: iterator training loop]

I also tested this and found that the iteration count dropped when using separate sess.run() calls, because two batches were skipped in each loop: one on sess.run(loss), another on sess.run(accuracy). The only batch actually trained on is the one consumed by sess.run(optimizer). So I was skipping two batches per loop, meaning I was training on only 1/3 of my training dataset.

Iterations when using `l, _, acc = sess.run([loss, optimizer, accuracy])`: 42000 (exactly 3 times as many as the separate-call case).
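The batch-skipping effect above can be reproduced with a plain Python generator standing in for the TF iterator (a toy sketch, not tf.data): each separate "run" pulls a fresh batch, so three separate calls consume three batches per loop iteration, while one combined call consumes only one.

```python
# Toy reproduction (plain Python, NOT tf.data): each separate run()
# advances the iterator, so separate loss/optimizer/accuracy calls
# consume three batches per loop and train on only one of them.

def batches(n):
    yield from range(n)  # stand-in for iterator.get_next()

# Three separate "runs" per loop: only every third batch is trained on.
it = batches(12)
trained = []
try:
    while True:
        _loss_batch = next(it)    # sess.run(loss)      -> consumes a batch
        trained.append(next(it))  # sess.run(optimizer) -> the only trained batch
        _acc_batch = next(it)     # sess.run(accuracy)  -> consumes another
except StopIteration:
    pass
assert trained == [1, 4, 7, 10]   # only 4 of 12 batches actually trained

# One combined run per loop: every batch is trained on.
it = batches(12)
trained = [b for b in it]         # sess.run([loss, optimizer, accuracy])
assert len(trained) == 12
```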