Closed LZZJ closed 7 months ago
How to deploy this model with multiple GPUs?
There are multiple GPUs on my server. Each GPU has 24 GB of memory in total, but about 15 GB of it is already occupied by other programs.
What modifications need to be made to inference.py? Or which files need to be modified?
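A common way to do this, assuming the repository loads its model through Hugging Face `transformers` (which the question does not confirm, so treat this as a sketch), is to pass `device_map="auto"` together with a `max_memory` cap when calling `from_pretrained`. Accelerate then shards the layers across all visible GPUs while staying under the cap, leaving the ~15 GB already used by other programs alone. The helper and model names below are placeholders, not the repo's actual API:

```python
# Hypothetical sketch: build loading kwargs that shard a model across
# several GPUs, capping each card at ~8 GiB (24 GB total - ~15 GB already
# in use by other programs leaves roughly 9 GB free; 8 GiB adds headroom).
# Requires: pip install transformers accelerate

def build_load_kwargs(num_gpus: int, per_gpu_gib: int = 8) -> dict:
    """Kwargs for from_pretrained: spread weights over GPUs, capped per card."""
    return {
        "device_map": "auto",  # let accelerate assign layers to devices
        "max_memory": {i: f"{per_gpu_gib}GiB" for i in range(num_gpus)},
    }

# In inference.py the load call would then look roughly like:
#   from transformers import AutoModelForCausalLM
#   model = AutoModelForCausalLM.from_pretrained(
#       "<model-name>", **build_load_kwargs(num_gpus=2))
```

If the repo does not use `transformers`, the same idea applies manually: move different layers to different devices (`module.to("cuda:0")`, `module.to("cuda:1")`) and transfer activations between them in `forward`.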
This issue is stale because it has been open for 30 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.