nitinmukesh opened this issue 6 months ago
Hello, >18GB of VRAM is required for single-image inference. You can use optimization schemes like CPU offloading to further reduce memory usage.
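For example, with a diffusers-style pipeline, offloading moves submodules to the GPU only while they run. A minimal sketch, assuming the pipeline is loaded roughly the way the repo's demo does (the checkpoint path and pipeline class here are placeholders, not the repo's actual loading code):

```python
import torch
from diffusers import DiffusionPipeline

# Placeholder checkpoint path; load the IDM-VTON pipeline the same way the
# repo's demo script builds it.
pipe = DiffusionPipeline.from_pretrained(
    "path/to/idm-vton-weights",
    torch_dtype=torch.float16,  # fp16 roughly halves weight memory vs fp32
)

# Streams each submodule (UNet, text encoders, VAE, ...) to the GPU only
# while it is needed; slow, but the most memory-friendly option.
pipe.enable_sequential_cpu_offload()

# Lighter alternative that keeps one whole model on the GPU at a time:
# pipe.enable_model_cpu_offload()
```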
Thank you for your response. Appreciate your efforts in creating this.
Hello, >18GB of VRAM is required for single-image inference. You can use optimization schemes like CPU offloading to further reduce memory usage.
Unfortunately, being an end user with no developer background, I have no idea what needs to be done.
@yisol hello, I got this error when I tried to run it on a g4dn.2xlarge AWS instance (NVIDIA T4 GPU with 16 GB of memory).
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 14.58 GiB total capacity; 13.85 GiB already allocated; 15.56 MiB free; 14.38 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
I've tried setting max_split_size_mb to values ranging from 64 to 512 and I still get the same result. Should I get a higher-end GPU, or is there anything I can do to optimize this repo?
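For reference, `max_split_size_mb` is read from the `PYTORCH_CUDA_ALLOC_CONF` environment variable, and it only helps when the failure comes from fragmentation (reserved memory much larger than allocated); it cannot recover memory the model genuinely needs. A quick sketch of setting it, assuming a generic launch script:

```python
# Option 1: set it in the shell before launching, e.g.
#   PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 python your_inference_script.py
#
# Option 2: set it from Python before the first CUDA allocation happens.
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the env var so the allocator picks it up
print(torch.cuda.is_available())
```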
It works with a minimum of 8 GB of VRAM using 4-bit precision and CPU offloading: https://github.com/yisol/IDM-VTON/issues/47
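For anyone curious about the general idea (this is not the code from the linked issue): with a recent diffusers release, the UNet can be loaded in 4-bit via the bitsandbytes integration and the rest of the pipeline CPU-offloaded. A hedged sketch with a placeholder checkpoint path:

```python
import torch
from diffusers import BitsAndBytesConfig, UNet2DConditionModel

# NF4 4-bit weights cut the UNet's memory to roughly a quarter of fp16.
# Requires a recent diffusers release plus the bitsandbytes package.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

# Placeholder path/subfolder for illustration only.
unet = UNet2DConditionModel.from_pretrained(
    "path/to/idm-vton-checkpoint",
    subfolder="unet",
    quantization_config=quant_config,
)

# The quantized UNet would then be passed to the try-on pipeline, with the
# remaining components (VAE, image/text encoders) CPU-offloaded as above.
```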
@FurkanGozukara but that is using your files and not the original repo
@FurkanGozukara but that is using your files and not the original repo
True, the original repo uses a lot of VRAM.
I've tried setting max_split_size_mb to values ranging from 64 to 512 and I still get the same result. Should I get a higher-end GPU, or is there anything I can do to optimize this repo?
Just don't spam all of GitHub with your posts. I have already seen your post about buying from Patreon and I'm not interested.
@FurkanGozukara but that is using your files and not the original repo
True, the original repo uses a lot of VRAM.
Hey! What do you mean by 'your files' and 'the original repo'? I have a 3060 Ti with 8 GB of memory and want to try this. Thanks!
Here is our IDM-VTON app: https://youtu.be/m4pcIeAVQD0
And here is how to use it in the cloud: https://youtu.be/LeHfgq_lAXU
I have a 4060 with 8 GB of VRAM plus 8 GB of shared RAM, and it fails every time with a CUDA out-of-memory error.
Is there any setting I can modify so it can run with lower requirements? I already tried a 360 x 480 image size and hit the same memory issue.
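In case it helps, the usual low-VRAM switches for a diffusers-based pipeline are attention slicing, VAE slicing, and CPU offloading, on top of fp16 weights and a smaller output size. Whether IDM-VTON's custom try-on pipeline exposes all of these is an assumption, so this is only a sketch:

```python
from diffusers import DiffusionPipeline


def apply_low_vram_settings(pipe: DiffusionPipeline) -> DiffusionPipeline:
    """Apply common memory-saving switches to an already-built pipeline."""
    # Each switch is applied only if the pipeline class implements it;
    # IDM-VTON's custom pipeline exposing these is an assumption.
    if hasattr(pipe, "enable_attention_slicing"):
        pipe.enable_attention_slicing()      # compute attention in chunks
    if hasattr(pipe, "enable_vae_slicing"):
        pipe.enable_vae_slicing()            # decode the VAE output in slices
    pipe.enable_sequential_cpu_offload()     # stream modules to the GPU on demand
    return pipe
```

It would be called on the pipeline object built in the demo script before running inference, e.g. `pipe = apply_low_vram_settings(pipe)`; even so, 8 GB may still be tight without quantization.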