TianheWu / CoSeR

An unofficial implementation for "CoSeR: Bridging Image and Language for Cognitive Super-Resolution (CVPR 2024)"
MIT License

Is there any way to run inference code on a 24GB card? #6

Open yatoubusha opened 3 weeks ago

yatoubusha commented 3 weeks ago

How much memory is required to run the inference code? Is there any way to run it on a 24GB card?

Student2Pro commented 1 week ago

I asked a similar question and one of the developers replied "A V100 with 32G memory is enough". I wonder if virtual GPU memory can help.
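Since the developers suggest a 32 GB V100, squeezing inference onto a 24 GB card would likely rely on the usual diffusion-model memory reductions (half-precision weights, `torch.no_grad()`, attention/VAE slicing, or CPU offload). As a rough back-of-the-envelope sketch of why half precision alone buys a lot, here is the weight-memory arithmetic; note the parameter count below is a hypothetical placeholder, not CoSeR's actual size:

```python
def params_memory_gb(n_params: int, bytes_per_param: int) -> float:
    """Approximate memory (GiB) to hold model weights alone.

    Activations, attention buffers, and the VAE add more on top,
    so this is a lower bound on peak usage during inference.
    """
    return n_params * bytes_per_param / 1024**3

# Hypothetical ~1.5B-parameter model (placeholder, not CoSeR's real count)
n = 1_500_000_000

fp32_gb = params_memory_gb(n, 4)  # float32: 4 bytes per parameter
fp16_gb = params_memory_gb(n, 2)  # float16/bfloat16: 2 bytes per parameter

print(f"fp32 weights: {fp32_gb:.1f} GiB")  # ~5.6 GiB
print(f"fp16 weights: {fp16_gb:.1f} GiB")  # ~2.8 GiB, half the fp32 footprint
```

Casting to fp16 halves the weight footprint, and wrapping the forward pass in `torch.no_grad()` avoids storing activations for backprop, which together are often enough to move a model from a 32 GB card to a 24 GB one.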