How much GPU memory is required for inference?
I can run it on an RTX 4090, so it should need around 20 GB.
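If you want to measure the exact number on your own hardware, here is a minimal sketch (assuming a PyTorch / Hugging Face-style causal LM; the checkpoint name below is a placeholder, not this repo's actual model) that reports peak GPU memory during a short generation:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder checkpoint name -- replace with the model used in this repo.
model_name = "your-org/your-model"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # fp16 roughly halves weight memory vs fp32
).to("cuda")

# Reset the peak-memory counter before running inference.
torch.cuda.reset_peak_memory_stats()

inputs = tokenizer("Hello, world!", return_tensors="pt").to("cuda")
with torch.no_grad():
    model.generate(**inputs, max_new_tokens=64)

# Peak memory allocated by tensors (weights + KV cache + activations).
peak_gb = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak GPU memory: {peak_gb:.1f} GB")
```

Note that actual usage grows with batch size and sequence length because of the KV cache, so a 20 GB figure measured with short prompts may not hold for long-context inference.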