kkkls / FFTformer

[CVPR 2023] Efficient Frequency Domain-based Transformer for High-Quality Image Deblurring

About the gpu memory taken #5

Closed Lin-yufu closed 10 months ago

Lin-yufu commented 1 year ago

I am running inference on a 1920x1080 image with your model; my device is a 3090 with 24 GB of GPU memory. However, I keep getting "CUDA OUT OF MEMORY". As shown in your paper, your model has a low GPU memory footprint. How can I run inference on a 1920x1080 image with 24 GB of GPU memory?

kkkls commented 1 year ago

Hello, the main cause of "CUDA OUT OF MEMORY" is the DFFN module. Currently, if you want to test images at a resolution of 1920x1080, you can either use a GPU with more memory, such as an A6000, or use a patch-based testing method. We have already solved this issue; please stay tuned for our future work.
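
For reference, here is a minimal sketch of the kind of patch-based (tiled) testing mentioned above: the image is split into overlapping tiles, each tile is restored separately, and overlapping regions are averaged. The function name, tile size, and overlap below are illustrative assumptions, not part of this repository's API; `model` is assumed to be a loaded restoration network that maps a `(1, C, h, w)` tensor to a restored tensor of the same shape.

```python
# Sketch of tiled inference to reduce peak GPU memory (illustrative, not the repo's API).
import torch

@torch.no_grad()
def tile_inference(model, img, tile=512, overlap=32):
    """img: (1, C, H, W) CUDA tensor. Returns a restored (1, C, H, W) tensor."""
    _, _, H, W = img.shape
    stride = tile - overlap
    out = torch.zeros_like(img)      # accumulated restored pixels
    weight = torch.zeros_like(img)   # how many tiles covered each pixel
    for top in range(0, H, stride):
        for left in range(0, W, stride):
            # Clamp the tile to the image border by anchoring its bottom-right corner.
            bottom, right = min(top + tile, H), min(left + tile, W)
            t, l = max(bottom - tile, 0), max(right - tile, 0)
            restored = model(img[:, :, t:bottom, l:right])
            out[:, :, t:bottom, l:right] += restored
            weight[:, :, t:bottom, l:right] += 1
    return out / weight  # average the overlapping regions

# Usage (hypothetical): sharp = tile_inference(model.cuda().eval(), blurred.cuda())
```

With a 512-pixel tile, each forward pass only needs memory for a 512x512 crop instead of the full 1920x1080 frame, which is usually enough to fit on a 24 GB card; averaging the overlaps reduces visible seams at tile borders.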