advimman / lama

🦙 LaMa Image Inpainting, Resolution-robust Large Mask Inpainting with Fourier Convolutions, WACV 2022
https://advimman.github.io/lama-project/
Apache License 2.0

How can I reduce GPU memory usage? #267

Closed sanbuphy closed 1 year ago

sanbuphy commented 1 year ago

Hello, I think this model works very well, but the GPU memory usage is too high during refinement. Could you please advise me on how to reduce the memory usage during refinement? Thank you!

amangupta2303 commented 1 year ago

@sanbuphy You can reduce GPU memory by lowering the batch size when training the model, e.g. `python bin/train.py -cn lama-fourier data.batch_size=8`. Adjust the batch size to fit your memory budget.
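As a rough illustration of why this helps: activation memory grows approximately linearly with batch size, so halving the batch roughly halves that component. The per-pixel cost below is a hypothetical placeholder, not a number measured from LaMa; this is only a back-of-envelope sketch.

```python
def activation_memory_gib(batch_size, h, w, channels_per_pixel=64, bytes_per_val=4):
    """Very rough estimate of activation memory for one forward pass, in GiB.

    channels_per_pixel and bytes_per_val are hypothetical placeholders;
    the point is the linear dependence on batch_size.
    """
    return batch_size * h * w * channels_per_pixel * bytes_per_val / 2**30

full = activation_memory_gib(batch_size=8, h=256, w=256)
half = activation_memory_gib(batch_size=4, h=256, w=256)
print(f"batch 8: {full:.3f} GiB, batch 4: {half:.3f} GiB")
# Halving the batch halves the estimated activation memory.
assert abs(full / half - 2.0) < 1e-9
```

Real usage is not perfectly linear (weights, optimizer state, and CUDA overhead are batch-independent), but batch size is usually the easiest knob to turn during training.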

babyta commented 10 months ago

How can I save memory during inference, given that the input size is fixed?

bigmover commented 10 months ago

> How can I save memory during inference, given that the input size is fixed?

I have the same question about GPU memory usage. One option is to run part of the model on the CPU.
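Besides offloading computation to the CPU, a generic way to cap inference memory for large inputs is tiled processing: split the image into overlapping tiles, run the model tile by tile, and stitch the results, so peak GPU memory depends on the tile size rather than the full image. This is not a built-in LaMa feature; the helper below is only a sketch that computes a tile layout.

```python
def tile_boxes(height, width, tile=512, overlap=64):
    """Return (top, left, bottom, right) boxes covering an image.

    Generic tiling helper (not part of the LaMa repo). Adjacent tiles
    overlap by `overlap` pixels so seams can be blended when stitching.
    """
    step = tile - overlap
    boxes = []
    for top in range(0, max(height - overlap, 1), step):
        for left in range(0, max(width - overlap, 1), step):
            bottom = min(top + tile, height)
            right = min(left + tile, width)
            boxes.append((top, left, bottom, right))
    return boxes

boxes = tile_boxes(1024, 1536, tile=512, overlap=64)
print(len(boxes), "tiles")  # 3 rows x 4 columns of tiles
# No box exceeds the image bounds.
assert all(b[2] <= 1024 and b[3] <= 1536 for b in boxes)
```

The trade-off is that very large masks spanning multiple tiles lose global context, so tiling works best when the masked regions are smaller than a single tile.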