modanesh opened 4 weeks ago
In DeepSVDD, why is `loss.backward()` commented out? At this line: https://github.com/yzhao062/pyod/blob/2a80ac89816248925ad8f659b7185e764e927265/pyod/models/deep_svdd.py#L355

When I uncomment it, I get this error:

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
Thanks!