Open silence-tang opened 3 months ago
Hi, may I ask what the GPU memory cost is during inference? Is a 2080 Ti enough to run the inference code?
They have a Colab and a Hugging Face demo, so I'd assume 16 GB of VRAM is enough.
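If you want a quick back-of-envelope check before trying it on an 11 GB 2080 Ti, you can estimate VRAM from the parameter count: weights × bytes per parameter, times an overhead factor for activations. This is a rough sketch, not a measurement; the parameter count, dtype, and overhead factor below are all assumptions you'd swap in for the actual model.

```python
def model_vram_gb(n_params: float, bytes_per_param: int = 2, overhead: float = 1.5) -> float:
    """Rough VRAM estimate in GiB: weight memory (n_params * dtype size)
    scaled by an assumed overhead factor for activations/buffers."""
    return n_params * bytes_per_param * overhead / 1024**3

# Hypothetical example: a 1B-parameter model in fp16
print(round(model_vram_gb(1e9), 1))  # → 2.8
```

For an actual number, the more reliable route is to run one inference pass and read `torch.cuda.max_memory_allocated()` afterwards.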