YouHuang67 / InterFormer

MIT License
35 stars 5 forks source link

Why is model memory usage so large? #8

Closed Mariosz2 closed 1 month ago

Mariosz2 commented 1 month ago

Hello, I ran demo.py with `python demo.py weights/iter_320000.pth --device cuda:0` successfully. My GPU is an RTX 4090 D with 24 GB of memory. When I opened a 960 × 1440 image, it used more than 24 GB and I got a "CUDA out of memory" error. Even when I opened a 110 × 118 image, it still used 3 GB of memory. Why is that?

YouHuang67 commented 1 month ago

There are two issues here. InterFormer does not use a fixed input resizing, so processing a large image, such as the 960 × 1440 one you mentioned, inherently consumes a significant amount of GPU memory. It's recommended to resize the image proportionally to a smaller scale. As for the 110 × 118 image, the model itself occupies a baseline amount of GPU memory; it's not just the image consuming it.
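To illustrate the suggested workaround, here is a minimal sketch of proportional downscaling that caps the longest side of the image. Note the `max_side=512` cap and the helper name `proportional_size` are illustrative assumptions, not part of the InterFormer demo:

```python
def proportional_size(width, height, max_side=512):
    # Scale factor so the longest side is at most max_side; never upscale.
    # max_side=512 is an illustrative cap, not a value from the repo.
    scale = min(1.0, max_side / max(width, height))
    return round(width * scale), round(height * scale)

# The 960 x 1440 image from the report would shrink to 341 x 512:
print(proportional_size(960, 1440))   # (341, 512)
print(proportional_size(110, 118))    # unchanged: (110, 118)
```

With Pillow, the result could then be applied as `img.resize(proportional_size(*img.size), Image.BILINEAR)` before passing the image to the model.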

Mariosz2 commented 1 month ago

Understood. Thank you for your reply :)