[Closed] SNEAndy closed this 11 months ago
Could you please provide your GPU information? Generally speaking, some CT volumes are large, so GPU memory consumption is high for those cases. We use at least 24 GB of memory for inference.
Thank you for your prompt reply. I am using an NVIDIA RTX 3090. I wrote the inference script myself and used torch.rand for testing, without applying any transforms. The input shape is (1, 1, 96, 96, 96), and I added a breakpoint before releasing the GPU memory. I observed the GPU memory change as 2.6 GB -> 5.6 GB -> 2.6 GB. I am a bit confused about the fluctuation in the middle of the process.
Have you added torch.no_grad()? I didn't observe this fluctuation.
Thank you for the reply. Yes, I did add torch.no_grad(). Maybe I should run more tests to find out what is going wrong in my code. Thank you all the same, have a great day :D
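To pinpoint where the fluctuation happens, it can help to bracket the forward pass with torch.no_grad() and read the CUDA allocator statistics directly. Below is a minimal sketch; the function name profile_inference and the Conv3d stand-in for the real network are illustrative, not part of the original code.

```python
import torch
import torch.nn as nn

def profile_inference(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Run a forward pass without autograd and report allocator stats."""
    use_cuda = x.is_cuda
    if use_cuda:
        torch.cuda.reset_peak_memory_stats(x.device)
    model.eval()
    with torch.no_grad():  # no activation buffers are kept for backward
        out = model(x)
    if use_cuda:
        alloc = torch.cuda.memory_allocated(x.device) / 2**30
        peak = torch.cuda.max_memory_allocated(x.device) / 2**30
        print(f"allocated after forward: {alloc:.2f} GB, peak: {peak:.2f} GB")
    return out

if __name__ == "__main__":
    # Illustrative stand-in for the real network and the CT-sized input
    net = nn.Conv3d(1, 2, kernel_size=3, padding=1)
    x = torch.rand(1, 1, 96, 96, 96)
    if torch.cuda.is_available():
        net, x = net.cuda(), x.cuda()
    y = profile_inference(net, x)
```

Comparing memory_allocated with max_memory_allocated separates memory that is still held after the forward pass from the transient peak during it, which is usually where the 2.6 GB -> 5.6 GB spike shows up.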
Hi, I have the same problem. Did you find the answer?
According to my preliminary investigation, if you use torch.cuda.empty_cache() and torch.backends.cudnn.benchmark = True at the same time, the memory fluctuation will happen.
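Assuming that interaction is the cause (it is a hypothesis from this thread, not confirmed upstream), a sketch of the workaround would be to disable cuDNN autotuning and move the cache release out of the per-image loop; the helper name release_cached_memory_once is illustrative.

```python
import torch

# Hypothesized interaction: cudnn.benchmark=True autotunes algorithms per
# input shape and allocates scratch workspaces; calling empty_cache()
# between images releases those cached blocks, so the next forward pass
# must grow the allocation again, producing a sawtooth memory pattern.
torch.backends.cudnn.benchmark = False  # fixed algorithm choice, no re-tuning

def release_cached_memory_once() -> None:
    """Return cached blocks to the driver once, after the whole loop,
    instead of per image inside the inference loop."""
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```

Note that benchmark = True mainly pays off when input shapes are fixed across many iterations; for one-off or variable-shape inference, turning it off costs little speed.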
When testing the second image, I got an error:
33%|████████████████████████ | 1/3 [01:33<03:06, 93.34s/it]
Traceback (most recent call last):
File "mytest.py", line 223, in
Hi, thank you for this great work! I ran into a problem while using SwinUNETR as the backbone: the GPU memory usage suddenly increases and exceeds the VRAM of my GPU. During inference there are also significant fluctuations in GPU memory. Please give me a clue so I can solve this problem.
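One common way to cap peak memory on large CT volumes is to run the network patch-by-patch instead of on the whole volume at once (MONAI provides this with overlap and blending as monai.inferers.sliding_window_inference). Below is a minimal non-overlapping sketch of the same idea; the name tiled_inference is illustrative, it assumes the model preserves spatial size, and it ignores the border effects that overlapping windows would smooth out.

```python
import torch
import torch.nn as nn

def tiled_inference(model: nn.Module, x: torch.Tensor, roi: int = 96) -> torch.Tensor:
    """Run the model on roi^3 patches so only one patch's activations
    live on the GPU at a time. Assumes each spatial dim is divisible
    by `roi` and that the model preserves spatial size."""
    b, c, d, h, w = x.shape
    out = None
    model.eval()
    with torch.no_grad():
        for zi in range(0, d, roi):
            for yi in range(0, h, roi):
                for xi in range(0, w, roi):
                    patch = x[..., zi:zi + roi, yi:yi + roi, xi:xi + roi]
                    pred = model(patch)
                    if out is None:
                        # Allocate the full-volume output once the channel
                        # count of the prediction is known
                        out = x.new_zeros((b, pred.shape[1], d, h, w))
                    out[..., zi:zi + roi, yi:yi + roi, xi:xi + roi] = pred
    return out
```

Shrinking roi (or, with MONAI, sw_batch_size) trades speed for a lower activation peak, which is usually the lever that keeps SwinUNETR inside a 24 GB card on large volumes.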