zyh-uaiaaaa / Erasing-Attention-Consistency

Official implementation of the ECCV2022 paper: Learn From All: Erasing Attention Consistency for Noisy Label Facial Expression Recognition

Memory leak #11

Open kulich-d opened 1 year ago

kulich-d commented 1 year ago

Hi! You have a memory leak during training here.

It happens because `print(correct_num)` shows a `grad_fn` (e.g. `<AddBackward0>`): the tensor still references the autograd graph, so the graph for every batch is kept alive instead of being freed.

To fix this, I used `.detach()`:

```python
loss = loss.detach().cpu()
_, predicts = torch.max(output.detach().cpu(), 1)
correct_num = torch.eq(predicts.detach().cpu(), labels.detach().cpu()).sum()
```

And the memory stopped leaking.

I've attached a memory profile file.
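For context, a minimal self-contained sketch (not the repo's actual training loop; the model, shapes, and names below are illustrative) of why this fix works: any tensor kept across iterations without `.detach()` pins its entire autograd graph in memory, while a detached copy does not.

```python
import torch

# Hypothetical stand-ins for the real model/data, for illustration only.
model = torch.nn.Linear(10, 7)
history = []

for _ in range(3):
    x = torch.randn(4, 10)
    labels = torch.randint(0, 7, (4,))
    output = model(x)
    loss = torch.nn.functional.cross_entropy(output, labels)

    # BAD: history.append(loss) would keep each iteration's graph alive.
    # GOOD: detach first, so only the value is stored.
    history.append(loss.detach().cpu())
    _, predicts = torch.max(output.detach().cpu(), 1)
    correct_num = torch.eq(predicts, labels.cpu()).sum()

print(history[0].grad_fn)  # None: no autograd graph attached
```

The key check is `grad_fn`: a detached tensor reports `None`, confirming nothing references the computation graph anymore.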

dahaiyidi commented 1 year ago

Hi, what is the name of the module you used to get the profile file? Thanks.
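The thread doesn't name the profiler kulich-d used, but one stdlib-only way to watch for growing host memory is `tracemalloc` (note it only traces Python-side allocations; for GPU tensor storage, `torch.cuda.memory_allocated()` sampled per iteration plays the same role):

```python
import tracemalloc
import torch

# Assumption: a toy loop that reproduces the leak pattern from the issue,
# accumulating un-detached losses so each iteration's graph stays alive.
tracemalloc.start()
leaky = []
for _ in range(5):
    loss = (torch.randn(100, requires_grad=True) ** 2).sum()
    leaky.append(loss)  # retains the graph; .detach() here would release it

current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"current={current}B peak={peak}B")
```

Comparing `current`/`peak` across runs with and without `.detach()` makes the retained-graph growth visible.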