mjkwon2021 / CAT-Net

Official code for CAT-Net: Compression Artifact Tracing Network. Image manipulation detection and localization.

How to run inference on a GPU with less RAM? I want to work with 8 GB. #12

Open Vadim2S opened 2 years ago

Vadim2S commented 2 years ago

I currently have only an 8 GB GPU. How can I run infer.py? I tried playing with the TEST section of CAT_Full.yaml (BASE_SIZE and BATCH_SIZE_PER_GPU), but I still get an out-of-GPU-memory error.

CauchyComplete commented 2 years ago

Well, inference uses a batch size of one, so BATCH_SIZE_PER_GPU has no effect. Image size and GPU memory are the only factors that cause OOM. This means you should either reduce the image size or get a GPU with more memory.

Vadim2S commented 2 years ago

You mean the height and width of the tested image files? OK.

P.S. What are the IMAGE_SIZE properties in the TRAIN and TEST sections of CAT_full.yaml?

CauchyComplete commented 2 years ago

In train.py: TRAIN.BATCH_SIZE_PER_GPU and TRAIN.IMAGE_SIZE determine the batch size and the image crop size, respectively. In infer.py: both are ignored; the batch size is fixed to 1 and the image size is the size of the input image. Also, note that reducing the image size with a resize operation is in fact applying another image manipulation, so performance may deteriorate. One remedy is cropping. Please use grid-aligned cropping as described in the paper, and avoid random cropping. However, cropped images lose the overall content, so performance may still deteriorate (though less than with resizing, I think).
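A minimal sketch of what grid-aligned cropping means in practice, assuming the point is to keep the crop origin on the 8x8 JPEG DCT block grid so the compression artifacts CAT-Net traces stay intact. The helper name and signature are hypothetical, not part of the CAT-Net code:

```python
import numpy as np

def grid_aligned_crop(img, top, left, height, width, block=8):
    """Crop `img` (H x W x C array) so the crop origin lies on the
    JPEG block grid.

    Snapping (top, left) down to multiples of `block` keeps the 8x8
    DCT grid of the original JPEG aligned inside the crop, which is
    what distinguishes grid-aligned cropping from random cropping.
    Hypothetical helper for illustration only.
    """
    top = (top // block) * block
    left = (left // block) * block
    return img[top:top + height, left:left + width]
```

For example, a crop requested at (13, 21) is snapped to (8, 16), so every 8x8 block inside the crop coincides with a block of the original JPEG.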

Vadim2S commented 2 years ago

Very interesting! Sometimes I want to test 4000x6000 images, and no GPU with that much RAM is available.

Conclusion: the best way to run inference is to grid-aligned-crop the large image into several small images, test them separately, and combine the results back into one big image. Right?

CauchyComplete commented 2 years ago

Yes.
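The workflow agreed on above (grid-aligned tiling, per-tile inference, stitching the prediction maps back together) could be sketched as follows. `infer_fn` stands in for a single-image CAT-Net inference call and is a hypothetical placeholder; the real infer.py loop differs:

```python
import numpy as np

def tile_infer_stitch(img, infer_fn, tile=1024, block=8):
    """Split a large image into grid-aligned tiles, run inference on
    each tile separately, and stitch the per-tile prediction maps
    back into one full-size map.

    The tile size is rounded down to a multiple of `block`, so every
    tile origin stays on the 8x8 JPEG grid (grid-aligned cropping).
    `infer_fn` is assumed to take an H x W x C tile and return an
    H x W prediction map; it is a stand-in, not the CAT-Net API.
    """
    tile = (tile // block) * block
    h, w = img.shape[:2]
    out = np.zeros((h, w), dtype=np.float32)
    for top in range(0, h, tile):
        for left in range(0, w, tile):
            patch = img[top:top + tile, left:left + tile]
            out[top:top + patch.shape[0],
                left:left + patch.shape[1]] = infer_fn(patch)
    return out
```

Since every tile origin is a multiple of the tile size, which is itself a multiple of 8, each tile is automatically a grid-aligned crop, and the stitched map has the same height and width as the input image.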