nlsde-safety-team / DualAttentionAttack


CUDA out of memory. #19

Open Arknightpzb opened 1 year ago

Arknightpzb commented 1 year ago

Hello. I have a question about the attention distraction loss. When I use my own Grad-CAM implementation to compute the attention distraction loss, I find that some GPU memory is never released. I suspect that part of the computation graph is not being freed, so I also tried torch.cuda.empty_cache() and similar operations, but the memory still accumulates until it runs out. Have you encountered a similar problem? I look forward to your reply.

This is the reference source for my Grad-CAM code: [yolov5GradCAM](https://blog.csdn.net/weixin_43799388/article/details/126207632)

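In case it helps with debugging: `torch.cuda.empty_cache()` can only return cached blocks that no tensor still references, so this symptom usually means a Python reference is keeping each iteration's graph alive, e.g. a logged loss tensor, an activation cached by a Grad-CAM hook, or an un-removed hook handle. Below is a minimal, hypothetical sketch, not the repo's or the blog's actual code (the model, target layer, and loss are stand-ins), showing the reference-dropping pattern:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"

# Trivial stand-in model; in practice the target layer would be a conv
# layer inside the detector (e.g. YOLOv5).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
).to(device).eval()
for p in model.parameters():
    p.requires_grad_(False)  # only the texture is optimized
target_layer = model[0]

# Store the *live* activation: it must stay differentiable because the
# attention distraction loss backpropagates through the CAM to the texture.
acts = {}
handle = target_layer.register_forward_hook(lambda m, i, o: acts.update(v=o))

texture = torch.rand(1, 3, 32, 32, device=device, requires_grad=True)
optimizer = torch.optim.Adam([texture], lr=0.01)
loss_log = []

for step in range(100):
    optimizer.zero_grad(set_to_none=True)
    score = model(texture)[0, 1]
    # create_graph=True keeps the CAM differentiable w.r.t. the texture.
    g = torch.autograd.grad(score, acts["v"], create_graph=True)[0]
    weights = g.mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * acts["v"]).sum(dim=1))
    loss = cam.mean()  # stand-in for the attention distraction loss
    loss.backward()
    optimizer.step()

    loss_log.append(loss.item())   # log the float, not the tensor: appending
                                   # `loss` itself keeps every step's graph alive
    del score, g, weights, cam, loss  # drop graph-carrying references first...
    acts.clear()                      # ...including tensors cached by the hook
    torch.cuda.empty_cache()  # only returns memory that nothing points to

handle.remove()  # an un-removed hook is another common way graphs survive
```

If memory still grows after the loop releases every graph-carrying reference, the leak is probably inside the Grad-CAM class itself (e.g. hooks that store undetached tensors as attributes across iterations).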

YUAN24 commented 1 year ago

I ran into the same problem and worked on it for two days. Some commands can reduce the GPU memory footprint, but it is never fully released. I used the same reference code.