blei-lab / edward

A probabilistic programming language in TensorFlow. Deep generative models, variational inference.
http://edwardlib.org

How did this happen? #945

Open ursb2017 opened 4 years ago

ursb2017 commented 4 years ago

I used the example code, but it fails with the error below. What is going wrong?

ursb2017 commented 4 years ago

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models.random_variables import *
import sys

# Toy regression data: 10 noisy samples of cos(x).
x_train = np.linspace(-3, 3, num=10)
y_train = np.cos(x_train) + np.random.normal(0, 0.1, size=10)
x_train = x_train.astype(np.float32).reshape((10, 1))
y_train = y_train.astype(np.float32).reshape((10, 1))

# Priors for a two-layer Bayesian neural network.
W_0 = Normal(loc=tf.zeros([1, 2]), scale=tf.ones([1, 2]))
W_1 = Normal(loc=tf.zeros([2, 1]), scale=tf.ones([2, 1]))
b_0 = Normal(loc=tf.zeros(2), scale=tf.ones(2))
b_1 = Normal(loc=tf.zeros(1), scale=tf.ones(1))

# Likelihood: tanh network with fixed observation noise.
x = x_train
y = Normal(loc=tf.matmul(tf.tanh(tf.matmul(x, W_0) + b_0), W_1) + b_1, scale=0.1)

# Mean-field variational posteriors.
qW_0 = Normal(loc=tf.get_variable("qW_0/loc", [1, 2]),
              scale=tf.nn.softplus(tf.get_variable("qW_0/scale", [1, 2])))
qW_1 = Normal(loc=tf.get_variable("qW_1/loc", [2, 1]),
              scale=tf.nn.softplus(tf.get_variable("qW_1/scale", [2, 1])))
qb_0 = Normal(loc=tf.get_variable("qb_0/loc", [2]),
              scale=tf.nn.softplus(tf.get_variable("qb_0/scale", [2])))
qb_1 = Normal(loc=tf.get_variable("qb_1/loc", [1]),
              scale=tf.nn.softplus(tf.get_variable("qb_1/scale", [1])))

# Variational inference with the KL(q || p) objective.
inference = ed.KLqp({W_0: qW_0, b_0: qb_0, W_1: qW_1, b_1: qb_1},
                    data={y: y_train})
inference.run(n_iter=1)
```
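For reference, this mirrors the two-layer Bayesian neural network from Edward's getting-started example. A minimal sketch of the usual follow-up, assuming the KLqp run above completes, is to form the posterior predictive by swapping the priors for their fitted posteriors with `ed.copy` (the same utility implicated in the traceback below):

```python
# Sketch, assuming inference.run() above completes: swap each prior for its
# fitted variational posterior to obtain the posterior predictive over y.
y_post = ed.copy(y, {W_0: qW_0, b_0: qb_0, W_1: qW_1, b_1: qb_1})

# Evaluate predictive fit on the training data (x is already baked in as the
# constant x_train above).
print(ed.evaluate('mean_squared_error', data={y_post: y_train}))
```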

ursb2017 commented 4 years ago

```
Traceback (most recent call last):
  File "D:/学习和科研/科研/研究项目/强化学习ing/RL/Bayesian_Inference/Test.py", line 33, in <module>
    inference.run(n_iter=1)
  File "D:\Anaconda\lib\site-packages\edward\inferences\inference.py", line 125, in run
    self.initialize(*args, **kwargs)
  File "D:\Anaconda\lib\site-packages\edward\inferences\klqp.py", line 110, in initialize
    return super(KLqp, self).initialize(*args, **kwargs)
  File "D:\Anaconda\lib\site-packages\edward\inferences\variational_inference.py", line 68, in initialize
    self.loss, grads_and_vars = self.build_loss_and_gradients(var_list)
  File "D:\Anaconda\lib\site-packages\edward\inferences\klqp.py", line 145, in build_loss_and_gradients
    return build_reparam_kl_loss_and_gradients(self, var_list)
  File "D:\Anaconda\lib\site-packages\edward\inferences\klqp.py", line 717, in build_reparam_kl_loss_and_gradients
    qz_copy = copy(qz, scope=scope)
  File "D:\Anaconda\lib\site-packages\edward\util\random_variables.py", line 229, in copy
    copy(v, dict_swap, scope, True, copy_q, True)
  File "D:\Anaconda\lib\site-packages\edward\util\random_variables.py", line 229, in copy
    copy(v, dict_swap, scope, True, copy_q, True)
  File "D:\Anaconda\lib\site-packages\edward\util\random_variables.py", line 229, in copy
    copy(v, dict_swap, scope, True, copy_q, True)
  [Previous line repeated 983 more times]
  File "D:\Anaconda\lib\site-packages\edward\util\random_variables.py", line 228, in copy
    for v in get_parents(org_instance):
  File "D:\Anaconda\lib\site-packages\edward\util\random_variables.py", line 608, in get_parents
    collection = random_variables()
  File "D:\Anaconda\lib\site-packages\edward\util\graphs.py", line 54, in random_variables
    graph = tf.get_default_graph()
  File "D:\Anaconda\lib\site-packages\tensorflow_core\python\framework\ops.py", line 5874, in get_default_graph
    return _default_graph_stack.get_default()
  File "D:\Anaconda\lib\site-packages\tensorflow_core\python\framework\ops.py", line 5454, in get_default
    ret = super(_DefaultGraphStack, self).get_default()
  File "D:\Anaconda\lib\site-packages\tensorflow_core\python\framework\ops.py", line 5266, in get_default
    return self.stack[-1] if len(self.stack) >= 1 else None
RecursionError: maximum recursion depth exceeded in comparison
```
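The failing frames are all in Edward's graph copy: `copy` in `edward/util/random_variables.py` recurses once per ancestor of each variational node and here blows past CPython's default recursion limit of 1000 frames. A minimal sketch of a commonly tried workaround, assuming the traversal is merely deep rather than cyclic, is to raise the limit before `inference.run`; it does not address why the graph walk is this deep:

```python
import sys

# Assumption: the ancestor traversal is finite, just deeper than CPython's
# default limit of 1000 frames. Raising the limit lets copy() finish; if the
# walk is actually cyclic, this only postpones the RecursionError.
sys.setrecursionlimit(100000)

inference = ed.KLqp({W_0: qW_0, b_0: qb_0, W_1: qW_1, b_1: qb_1},
                    data={y: y_train})
inference.run(n_iter=1)
```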

raquelaoki commented 3 years ago

I'm having the same problem when using ed.copy().
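If it helps narrow this down, a minimal check (assuming the same Edward/TensorFlow install) is to call `ed.copy` on a two-node graph; if even this exceeds the recursion limit, the problem is in the `get_parents`/`copy` traversal itself rather than in any particular inference algorithm:

```python
import edward as ed
import tensorflow as tf
from edward.models import Normal

# Two-node model: x depends on the latent variable z.
z = Normal(loc=tf.zeros(1), scale=tf.ones(1))
x = Normal(loc=z, scale=tf.ones(1))

# Matching variational posterior for z.
qz = Normal(loc=tf.get_variable("qz/loc", [1]),
            scale=tf.nn.softplus(tf.get_variable("qz/scale", [1])))

# Copy x with its parent z swapped for qz; this exercises the same recursive
# graph walk that fails in the traceback above.
x_post = ed.copy(x, {z: qz})
```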