I tried to verify the gradients of env.step using torch.autograd.gradcheck. My current environment is ant. I get the following error:
raise GradcheckError('Backward is not reentrant, i.e., running backward with '
torch.autograd.gradcheck.GradcheckError: Backward is not reentrant, i.e., running backward with same input and grad_output multiple times gives different values, although analytical gradient matches numerical gradient. The tolerance for nondeterminism was 0.001.
Here is my code for the gradient test:
def test_grad(actions):
    env.step(actions)
    state = env.state
    joint_q = state.joint_q
    joint_qd = state.joint_qd
    loss = torch.norm(joint_q) + torch.norm(joint_qd)
    return loss

inputs = (actions,)  # a one-element tuple; (actions) is just the tensor itself
test = torch.autograd.gradcheck(test_grad, inputs, nondet_tol=1e-3)
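For reference, here is a minimal self-contained sketch of how I understand gradcheck is meant to be called (using a stand-in loss function instead of env.step, since the env itself isn't reproducible here): inputs should be double-precision tensors with requires_grad=True, passed as a tuple.

```python
import torch

# Stand-in for the env-based loss: any differentiable function of the input.
def loss_fn(x):
    return torch.norm(x) + torch.norm(2.0 * x)

# gradcheck requires float64 inputs with requires_grad=True.
x = torch.randn(8, dtype=torch.float64, requires_grad=True)

# Passed as a one-element tuple; returns True if analytical and
# numerical gradients agree within tolerance.
ok = torch.autograd.gradcheck(loss_fn, (x,), nondet_tol=1e-3)
print(ok)
```

In my actual test the actions tensor comes from the env, and I'm not sure whether it is float64 with requires_grad set, which may matter here.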
Is this behavior as expected? Thanks so much!