Open yatoubusha opened 3 weeks ago
How much memory is required to run inference code? Is there any way to run inference code on a 24GB card?
I asked a similar question and one of the developers replied, "A V100 with 32G memory is enough." I wonder whether virtual GPU memory (e.g., offloading to system RAM) could help.
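As a rough back-of-the-envelope check (my own sketch, not from the developers), you can estimate inference memory from the parameter count: weights take `params × bytes_per_param`, plus some overhead for activations and the KV cache. The `overhead_frac` value below is an assumed ballpark, not a measured number; actual usage depends on sequence length, batch size, and framework.

```python
def estimate_inference_gb(num_params_billion: float,
                          bytes_per_param: int = 2,
                          overhead_frac: float = 0.2) -> float:
    """Rough GPU memory estimate for inference.

    bytes_per_param: 2 for fp16/bf16, 1 for int8, 4 for fp32.
    overhead_frac: assumed fraction for activations/KV cache (ballpark).
    """
    weights_gb = num_params_billion * bytes_per_param  # 1B params * 2 bytes ~ 2 GB
    return weights_gb * (1 + overhead_frac)

# A ~13B-parameter model in fp16 lands near the quoted 32 GB V100 figure:
print(estimate_inference_gb(13))   # ~31.2 GB
# The same model quantized to int8 could plausibly fit a 24 GB card:
print(estimate_inference_gb(13, bytes_per_param=1))  # ~15.6 GB
```

By this estimate, fitting a 24 GB card without offloading generally means reducing precision (e.g., int8/int4 quantization) or splitting the model across GPU and CPU memory, at the cost of speed.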