Question
I'm currently trying to use your project for some inference tasks and I'm wondering about the GPU memory requirements. Could you provide some guidance on how much GPU memory is typically needed to run inference with this model?
I'd appreciate it if you could also share any tips on how to optimize memory usage, or any methods to run the model on a GPU with less memory.
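For context, here is the kind of back-of-envelope estimate I've been working from; the 20% activation/buffer overhead factor and the parameter count are my own guesses, not numbers from your docs:

```python
def estimate_inference_memory_gib(
    num_params: int,
    bytes_per_param: int = 2,   # 2 for fp16/bf16, 4 for fp32, 1 for int8
    overhead: float = 1.2,      # guessed ~20% extra for activations/buffers
) -> float:
    """Rough GiB estimate: weight memory plus a fixed overhead factor."""
    return num_params * bytes_per_param * overhead / 2**30

# e.g. a hypothetical 7B-parameter model in fp16:
print(f"{estimate_inference_memory_gib(7_000_000_000):.1f} GiB")
```

Is an estimate like this roughly in the right ballpark for your model, or are there other significant contributors (KV cache, framework overhead) I should account for?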