Open Cyberface opened 3 years ago
I have the same issue
Environment: adahessian_tf/environment.yml

I think the issue is caused by `grads = gradients.gradients(loss, params)` in `get_gradients_hessian(self, loss, params)`. If you check the return of `grads = gradients.gradients(loss, params)`, it will be `None`. But I don't know how to fix this issue.
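One way to reproduce a `None` return from `tf.gradients` is to ask for a gradient with respect to a tensor the loss does not actually depend on; in graph mode, `tf.gradients` returns `None` for any such disconnected input. A minimal sketch (the constants `a`, `b` and the toy loss below are made up for illustration, not taken from the AdaHessian code):

```python
import tensorflow as tf

# Build an explicit graph so tf.gradients is usable (it is not
# supported in TF2 eager mode). `a`, `b`, and the loss are toy
# values for illustration only.
g = tf.Graph()
with g.as_default():
    a = tf.constant(2.0)
    b = tf.constant(5.0)
    loss = a * a                        # loss depends on a but not on b
    grads = tf.gradients(loss, [a, b])

print(grads[0])  # a Tensor: d(loss)/da is defined
print(grads[1])  # None: b is disconnected from the loss
```

So if `grads = gradients.gradients(loss, params)` comes back as `None` (or a list of `None`s), it is worth checking that `params` really are the variables the loss was computed from.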
I have the same issue. Has this been solved?
In the original post:

I have a simple example in this google colab notebook: https://colab.research.google.com/drive/1EbKZ0YHhyu6g8chFlJD74dzWrbo82mbV?usp=sharing
I am getting the following error ...

Wrapping the `train` function in a `@tf.function` decorator solves it for me. `tf.gradients` is only valid in a graph context (see the official docs), which I guess is what was missing.
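A minimal sketch of the graph-context point (the variable `x` and `loss_fn` below are made up for illustration, not part of the AdaHessian code): calling `tf.gradients` eagerly raises a `RuntimeError`, while the same call inside a `@tf.function` is traced into a graph and works.

```python
import tensorflow as tf

x = tf.Variable(3.0)

def loss_fn():
    return x * x

# Eager call: tf.gradients is not supported when eager execution
# is enabled, so this raises a RuntimeError.
try:
    tf.gradients(loss_fn(), [x])
except RuntimeError as err:
    print("eager call failed:", err)

# The same call wrapped in @tf.function runs inside a traced graph,
# where tf.gradients is valid.
@tf.function
def grad_in_graph():
    return tf.gradients(loss_fn(), [x])[0]

print(grad_in_graph().numpy())  # d(x^2)/dx at x = 3 -> 6.0
```

This is presumably why decorating the training step with `@tf.function` makes `get_gradients_hessian` work: the `gradients.gradients` call then executes in graph mode.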
Hi, I'm trying to use adahessian in TensorFlow for a simple regression experiment, but I'm having trouble.
I have a simple example in this google colab notebook: https://colab.research.google.com/drive/1EbKZ0YHhyu6g8chFlJD74dzWrbo82mbV?usp=sharing
I am getting the following error
In the notebook I first write a small training loop that works with standard optimisers such as Adam (see "example training with Adam"). Then in the next section, "example training with Adahessian", I basically copy the previous code and make a few modifications to try to get Adahessian working.
Specifically, I only changed
from
to
and from
to
Can anyone see what I'm doing wrong? Thanks!