Open jeffltc opened 4 years ago
You are using an optimizer from the Keras API. Its `minimize` function expects the loss as a callable function with no arguments. If your loss is an already-computed tensor, use `tf.GradientTape()` instead.
Replace the `self._optimizer.minimize(...)`
with

```python
with tf.GradientTape() as tape:
    # TODO: compute your loss inside the tape scope
    _loss = loss()
grads = tape.gradient(_loss, var_list)
grads_and_vars = zip(grads, var_list)
self._optimizer.apply_gradients(grads_and_vars)
```
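As a minimal, self-contained sketch of that pattern (the variable, loss, and learning rate here are made up for illustration; they are not from the original code):

```python
import tensorflow as tf

# Toy setup: one trainable variable and a simple quadratic loss.
w = tf.Variable(2.0)
var_list = [w]
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

with tf.GradientTape() as tape:
    # The loss must be computed inside the tape scope so its ops are recorded.
    _loss = tf.square(w - 1.0)  # stand-in for the real loss tensor

# Gradients are taken outside the scope, then applied to the variables.
grads = tape.gradient(_loss, var_list)
optimizer.apply_gradients(zip(grads, var_list))
```

After one SGD step with learning rate 0.1, `w` moves from 2.0 toward the minimum at 1.0 (gradient is 2*(2-1) = 2, so the update is 0.2).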
Here is a reference I wrote in another question: https://github.com/tensorflow/tensorflow/issues/29944#issuecomment-560224083
TensorFlow version: 2.0.0
Python version: 3.6
OS: macOS
Hi there. I tried to run the customized generator demo code in my local environment, but I ran into a "'Tensor' object is not callable" error.
I noticed this might be a TensorFlow 2.0 problem, so I checked this issue and this issue and found that it can be solved by using functools.partial() to pass a callable function to the optimizer. But I cannot find where the loss is created, so I cannot turn that tensor into a callable object.
In addition to that, is there a recommended TensorFlow version for running the adanet code? A lot of things seem to need tuning when I run the adanet demo code in a TensorFlow 2.0 environment.
Code is as follows:
I apologize for the poor code style. I'm not a professional in this area and I'm still working on it. Thanks for your attention.