Removing the tf.function decorator from the VQT loss function causes it to break; similarly, adding the decorator to the QMHL loss causes the error:
TypeError: @tf.custom_gradient grad_fn must accept keyword argument 'variables', since function uses variables
This error does not make sense to me, since all variables used appear to be inputs to the forward-pass function (a good first assumption to verify: are the input variables not what we think they are?). Adding the variables=None keyword to the gradient function changes the error to
ValueError: None values not supported.
I would be more comfortable with our solutions if adding or removing tf.function did not change the correctness.
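For reference, this is the pattern TensorFlow expects when a @tf.custom_gradient forward pass captures variables: grad_fn must accept a variables keyword and return a pair (input gradients, variable gradients), with an explicit (non-None) gradient for every captured variable. The following is a minimal sketch with a toy scalar loss; toy_loss and w are illustrative names, not the actual VQT/QMHL code.

```python
import tensorflow as tf

w = tf.Variable(3.0)

@tf.custom_gradient
def toy_loss(x):
    y = w * x  # forward pass captures the variable `w`

    def grad_fn(upstream, variables=None):
        dx = upstream * w  # gradient w.r.t. the input x
        # One gradient per captured variable; dy/dw = x here.
        # Returning None in this list reproduces
        # "ValueError: None values not supported".
        grad_vars = [upstream * x for _ in (variables or [])]
        return dx, grad_vars

    return y, grad_fn

@tf.function
def loss_and_grad(x):
    with tf.GradientTape() as tape:
        y = toy_loss(x)
    return y, tape.gradient(y, [w])
```

This sketch runs identically with or without the tf.function decorator, which is the invariant we want the real loss functions to satisfy.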