yzhao062 / pyod

A Python Library for Outlier and Anomaly Detection, Integrating Classical and Deep Learning Techniques
http://pyod.readthedocs.io
BSD 2-Clause "Simplified" License

DeepSVDD loss backward is commented #606

Open modanesh opened 4 weeks ago

modanesh commented 4 weeks ago

In DeepSVDD, why is the loss.backward() commented out?

At this line: https://github.com/yzhao062/pyod/blob/2a80ac89816248925ad8f659b7185e764e927265/pyod/models/deep_svdd.py#L355

Thanks!

modanesh commented 3 weeks ago

When I uncomment it, I get this error:

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.