Closed fftobiwan closed 8 years ago
In the latest version that I tried, I don't get that error. I can reproduce it if I put everything in a name_scope and call it twice.
This is a limitation of how name_scopes and variable_scopes interact; it is recommended to prefer variable_scope when you are creating sub-graphs with variables (which all the layers do).
If you are using an interactive shell like Jupyter, I find that using an explicit graph when playing around (e.g. with tf.Graph().as_default():) really helps cut down on interference-type errors.
I think the problem was that the variable scope was not set properly in scopes.var_and_name_scope().
I applied the following change to get everything working:
diff --git a/prettytensor/scopes.py b/prettytensor/scopes.py
index 809b32a..d1a914e 100644
--- a/prettytensor/scopes.py
+++ b/prettytensor/scopes.py
@@ -69,7 +69,8 @@ def var_and_name_scope(names):
initializer=old_vs.initializer)
vs_key[0].name_scope = scope
- yield scope, vs_key[0]
+ with tf.variable_scope(vs_key[0]):
+ yield scope, vs_key[0]
finally:
vs_key[0] = old_vs
Now the variables have properly scoped names like "fully_connected/weights" instead of just "weights".
I am using TensorFlow 0.7.1
I had the same error after rebasing to 0.8.0. @fftobiwan 's fix also works for me.
Thanks for the update. It appears that the relevant change has landed, so I will upload a new version with the fix.
I've made the update -- this was related to changes in the semantics of tf.get_collection() and tf.get_collection_ref().
Please verify that it works, thanks again!
It works for me, thank you very much.
Zoe from Stack Overflow suggested adding
tf.reset_default_graph()
at the beginning of the script. That worked for me. Thanks!
Hi,
I just wanted to try the introductory example and got the following error:
The code that I am trying to run is as follows:
It seems that variable tensors are not scoped properly. Am I doing something wrong?
Kind regards, Tobias